Convergence Rates of Gradient Methods for Convex Optimization in the Space of Measures
We study the convergence rate of Bregman gradient methods for convex optimization in the space of measures on a $d$-dimensional manifold. Under basic regularity assumptions, we show that the suboptimality gap at iteration $k$ is in $O(\log (k)k^{-1})$ for multiplicative updates, while it is in $O(k^...
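The "multiplicative updates" in the abstract are mirror-descent-style iterations that rescale a measure by an exponential of the gradient. A minimal sketch on a discrete measure (a point on the probability simplex); the objective, step size, and iteration count here are illustrative assumptions, not the paper's setting:

```python
import numpy as np

def multiplicative_updates(grad, x0, step=0.5, iters=200):
    """Illustrative multiplicative (exponentiated-gradient) scheme:
    x <- x * exp(-step * grad(x)), renormalized to stay a probability measure.
    """
    x = x0.copy()
    for _ in range(iters):
        x = x * np.exp(-step * grad(x))
        x = x / x.sum()  # project back onto the simplex by renormalizing
    return x

# Toy convex objective f(x) = 0.5 * ||x - c||^2, whose minimizer over the
# simplex is c itself (c is chosen inside the simplex); grad f(x) = x - c.
c = np.array([0.7, 0.2, 0.1])
grad = lambda x: x - c
x0 = np.ones(3) / 3          # uniform initial measure
x_star = multiplicative_updates(grad, x0)
```

Each iterate keeps full support and unit mass by construction, which is the standard appeal of multiplicative schemes on measures.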
| Main Author: | Chizat, Lénaïc |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Université de Montpellier, 2023-01-01 |
| Series: | Open Journal of Mathematical Optimization |
| Online Access: | https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.20/ |
Similar Items
- The Kluvanek-Kantorovitz characterization of scalar operators in locally convex spaces
  by: William V. Smith
  Published: (1982-01-01)
- Variance Reduction Optimization Algorithm Based on Random Sampling
  by: GUO Zhenhua, YAN Ruidong, QIU Zhiyong, ZHAO Yaqian, LI Rengang
  Published: (2025-03-01)
- Convergence of Panigrahy iteration process for Suzuki generalized nonexpansive mapping in uniformly convex Banach space
  by: Omprash Sahu, et al.
  Published: (2024-06-01)
- Convergence of Ishikawa iterates of generalized nonexpansive mappings
  by: M. K. Ghosh, et al.
  Published: (1997-01-01)
- N*-Iteration Approach for Approximation of Fixed Points in Uniformly Convex Banach Space
  by: Raghad I. Sabri
  Published: (2025-01-01)