Bregman–Hausdorff Divergence: Strengthening the Connections Between Computational Geometry and Machine Learning
The purpose of this paper is twofold. On the technical side, we propose an extension of the Hausdorff distance from metric spaces to spaces equipped with asymmetric distance measures. Specifically, we focus on extending it to the family of Bregman divergences, which includes the popular Kullback–Leibler divergence (also known as relative entropy). The resulting dissimilarity measure is called a Bregman–Hausdorff divergence and compares two collections of vectors without assuming any pairing or alignment between their elements.
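For reference, the two divergences named in the abstract have standard definitions (the notation below is mine, not quoted from the paper). Given a strictly convex, differentiable function $F$, the Bregman divergence it generates is

$$D_F(p, q) = F(p) - F(q) - \langle \nabla F(q),\ p - q \rangle,$$

and choosing $F$ to be the negative Shannon entropy, $F(p) = \sum_i p_i \log p_i$, restricted to the probability simplex, recovers the Kullback–Leibler divergence (relative entropy):

$$D_{\mathrm{KL}}(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i}.$$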
| Main Authors: | Tuyen Pham, Hana Dal Poz Kouřimská, Hubert Wagner |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-05-01 |
| Series: | Machine Learning and Knowledge Extraction |
| Subjects: | computational geometry; machine learning; information theory; Shannon’s entropy; relative entropy; Bregman divergence |
| Online Access: | https://www.mdpi.com/2504-4990/7/2/48 |
| author | Tuyen Pham, Hana Dal Poz Kouřimská, Hubert Wagner |
|---|---|
| collection | DOAJ |
| description | The purpose of this paper is twofold. On the technical side, we propose an extension of the Hausdorff distance from metric spaces to spaces equipped with asymmetric distance measures. Specifically, we focus on extending it to the family of Bregman divergences, which includes the popular Kullback–Leibler divergence (also known as relative entropy). The resulting dissimilarity measure is called a Bregman–Hausdorff divergence and compares two collections of vectors—without assuming any pairing or alignment between their elements. We propose new algorithms for computing Bregman–Hausdorff divergences based on a recently developed Kd-tree data structure for nearest neighbor search with respect to Bregman divergences. The algorithms are surprisingly efficient even for large inputs with hundreds of dimensions. As a benchmark, we use the new divergence to compare two collections of probabilistic predictions produced by different machine learning models trained using the relative entropy loss. In addition to the introduction of this technical concept, we provide a survey. It outlines the basics of Bregman geometry, and motivates the Kullback–Leibler divergence using concepts from information theory. We also describe computational geometric algorithms that have been extended to this geometry, focusing on algorithms relevant for machine learning. |
| format | Article |
| id | doaj-art-c047930a098e4861b31d8ec7ab67d067 |
| institution | Kabale University |
| issn | 2504-4990 |
| language | English |
| publishDate | 2025-05-01 |
| publisher | MDPI AG |
| record_format | Article |
| series | Machine Learning and Knowledge Extraction |
| doi | 10.3390/make7020048 |
| affiliations | Tuyen Pham: Department of Mathematics, University of Florida, Gainesville, FL 32611, USA. Hana Dal Poz Kouřimská: Applied Geometry and Topology, University of Potsdam, Am Neuen Palais 10, 14469 Potsdam, Germany. Hubert Wagner: Department of Mathematics, University of Florida, Gainesville, FL 32611, USA |
| title | Bregman–Hausdorff Divergence: Strengthening the Connections Between Computational Geometry and Machine Learning |
| topic | computational geometry; machine learning; information theory; Shannon’s entropy; relative entropy; Bregman divergence |
| url | https://www.mdpi.com/2504-4990/7/2/48 |
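As a rough illustration of the construction the abstract describes, the sketch below computes one directed Bregman–Hausdorff divergence under the Kullback–Leibler divergence. It is a brute-force version: the function names and the max–min direction convention are mine, the paper's exact definition may fix a different orientation, and the paper's algorithms replace the inner nearest-neighbor scan with a Bregman Kd-tree query rather than this quadratic loop.

```python
from math import log

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two strictly
    positive probability vectors of the same length."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q))

def directed_bregman_hausdorff(A, B, divergence=kl_divergence):
    """One directed variant of a Bregman-Hausdorff divergence:
    for each point of A, take the divergence to its nearest
    neighbor in B, and return the largest of these values.
    Brute force, O(|A| * |B| * d)."""
    return max(min(divergence(a, b) for b in B) for a in A)

# Hypothetical usage: compare two collections of softmax outputs,
# e.g. predictions of two models on the same test set. Note that
# the measure is asymmetric, like the underlying KL divergence.
preds_model_1 = [[0.9, 0.1], [0.5, 0.5]]
preds_model_2 = [[0.5, 0.5]]
d = directed_bregman_hausdorff(preds_model_1, preds_model_2)
```

Because the underlying divergence is asymmetric, swapping the two collections generally gives a different value, which is exactly why the extension from the (symmetric) metric Hausdorff distance requires care.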