Convergence Rates of Gradient Methods for Convex Optimization in the Space of Measures
We study the convergence rate of Bregman gradient methods for convex optimization in the space of measures on a $d$-dimensional manifold. Under basic regularity assumptions, we show that the suboptimality gap at iteration $k$ is in $O(\log (k)k^{-1})$ for multiplicative updates, while it is in $O(k^...
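The multiplicative updates analyzed in the abstract can be illustrated on a discretized problem. The sketch below is not the paper's algorithm, only a minimal stand-in: entropic mirror descent on the probability simplex (a measure supported on finitely many points), where the objective `F(x) = 0.5 * ||A x - b||^2`, the matrix `A`, and the step size `eta` are all illustrative assumptions.

```python
import numpy as np

def multiplicative_update(A, b, n_iters=2000, eta=0.05):
    """Entropic mirror descent (multiplicative updates) on the simplex.

    Minimizes the illustrative objective F(x) = 0.5 * ||A x - b||^2 over
    probability vectors, a finite-support stand-in for convex optimization
    in the space of measures.
    """
    n = A.shape[1]
    x = np.full(n, 1.0 / n)          # uniform initial measure
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)     # gradient of the quadratic objective
        x = x * np.exp(-eta * grad)  # multiplicative (Bregman/entropy) step
        x /= x.sum()                 # renormalize to stay a probability vector
    return x

# Synthetic instance whose optimum lies in the simplex interior.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_star = np.array([0.1, 0.2, 0.3, 0.25, 0.15])
b = A @ x_star
x = multiplicative_update(A, b)
```

Because each iterate is a reweighting of the previous one, the update preserves positivity automatically; only the normalization is needed to remain on the simplex.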
Saved in:

| Main Author: | Chizat, Lénaïc |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Université de Montpellier, 2023-01-01 |
| Series: | Open Journal of Mathematical Optimization |
| Online Access: | https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.20/ |
Similar Items

- Generalized Difference Lacunary Weak Convergence of Sequences
  by: Bibhajyoti Tamuli, et al.
  Published: (2024-03-01)
- Separation ratios of maps between Banach spaces
  by: Rosendal, Christian
  Published: (2023-11-01)
- Generalized Grand Lebesgue Spaces Associated to Banach Function spaces
  by: Alireza Bagheri Salec, et al.
  Published: (2024-07-01)
- On $p$-convexification of the Banach-Kantorovich lattice
  by: Gavhar B. Zakirova
  Published: (2024-12-01)
- $\lambda$-Statistical Pointwise and Uniform Convergence of Sequences of Functions on Neutrosophic Normed Spaces
  by: Hari Shankar, et al.
  Published: (2025-01-01)