Bregman–Hausdorff Divergence: Strengthening the Connections Between Computational Geometry and Machine Learning
The purpose of this paper is twofold. On the technical side, we propose an extension of the Hausdorff distance from metric spaces to spaces equipped with asymmetric distance measures. Specifically, we focus on extending it to the family of Bregman divergences, which includes the popular Kullback–Leibler divergence […]
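As a rough illustration of the idea sketched in the abstract, the snippet below computes one natural Hausdorff-style quantity over finite point sets using the Kullback–Leibler divergence in place of a metric. This is a sketch of the general construction only, not the paper's exact definition: the paper treats asymmetric divergences carefully, while here `directed` simply takes sup–inf in one argument order, and the function names are hypothetical.

```python
import numpy as np

def kl_divergence(p, q):
    # Kullback-Leibler divergence KL(p || q) between probability vectors
    # with strictly positive entries (a Bregman divergence generated by
    # negative Shannon entropy).
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def bregman_hausdorff(A, B, div):
    # One Hausdorff-style construction over a divergence:
    # directed(X, Y) = max over x in X of min over y in Y of div(x, y),
    # symmetrized by taking the max of the two directions.
    # Because div may be asymmetric, other argument orders give
    # genuinely different quantities.
    def directed(X, Y):
        return max(min(div(x, y) for y in Y) for x in X)
    return max(directed(A, B), directed(B, A))

if __name__ == "__main__":
    A = [(0.7, 0.3), (0.5, 0.5)]
    B = [(0.6, 0.4), (0.2, 0.8)]
    print(bregman_hausdorff(A, B, kl_divergence))
```

Since KL is asymmetric, swapping the divergence's argument order inside `directed` yields a different, equally valid variant; distinguishing and relating such variants is part of what motivates the paper.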
| Main Authors: | Tuyen Pham, Hana Dal Poz Kouřimská, Hubert Wagner |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-05-01 |
| Series: | Machine Learning and Knowledge Extraction |
| Online Access: | https://www.mdpi.com/2504-4990/7/2/48 |
Similar Items
- Equivalence of Informations Characterizes Bregman Divergences
  by: Philip S. Chodrow
  Published: (2025-07-01)
- Bregman divergences for physically informed discrepancy measures for learning and computation in thermomechanics
  by: Andrieux, Stéphane
  Published: (2023-02-01)
- Fast Proxy Centers for the Jeffreys Centroid: The Jeffreys–Fisher–Rao Center and the Gauss–Bregman Inductive Center
  by: Frank Nielsen
  Published: (2024-11-01)
- An information theoretic limit to data amplification
  by: S J Watts, et al.
  Published: (2025-01-01)
- An Intrinsic Characterization of Shannon’s and Rényi’s Entropy
  by: Martin Schlather, et al.
  Published: (2024-12-01)