Convergence Rates of Gradient Methods for Convex Optimization in the Space of Measures

We study the convergence rate of Bregman gradient methods for convex optimization in the space of measures on a $d$-dimensional manifold. Under basic regularity assumptions, we show that the suboptimality gap at iteration $k$ is in $O(\log(k)\,k^{-1})$ for multiplicative updates, while it is in $O(k^{-q/(d+q)})$ for additive updates for some $q \in \{1,2,4\}$ determined by the structure of the objective function. Our flexible proof strategy, based on approximation arguments, allows us to painlessly cover all Bregman Proximal Gradient Methods (PGM) and their acceleration (APGM) under various geometries such as the hyperbolic entropy and $L^p$ divergences. We also prove the tightness of our analysis with matching lower bounds and confirm the theoretical results with numerical experiments on low-dimensional problems. Note that all these optimization methods must additionally pay the computational cost of discretization, which can be exponential in $d$.
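For intuition, the multiplicative updates referred to above are those of entropic mirror descent: after discretizing the measure into $n$ weighted atoms, each step reweights the atoms by the exponential of the negative gradient of the objective, $\mu_{k+1} \propto \mu_k \exp(-\eta F^{\prime}[\mu_k])$. The sketch below is illustrative only and is not taken from the paper; the quadratic objective, the fixed grid, and the normalization to probability measures are assumptions made for the example.

```python
import numpy as np

# Discretize [0, 1] into n atoms; mu holds the (nonnegative) weights of the measure.
n = 200
grid = np.linspace(0.0, 1.0, n)
mu = np.full(n, 1.0 / n)  # start from the uniform measure

# Illustrative convex objective F(mu) = 0.5 * ||K mu - y||^2 for a smooth kernel K
# (a stand-in for the regular objectives the paper considers, not its actual setup).
K = np.exp(-50.0 * (grid[:, None] - grid[None, :]) ** 2)
y = K @ (np.where((grid > 0.3) & (grid < 0.35), 1.0, 0.0) / n)  # synthetic target

eta = 0.5  # step size
for k in range(1000):
    grad = K.T @ (K @ mu - y)      # first variation (gradient) of F at mu
    mu = mu * np.exp(-eta * grad)  # multiplicative / entropic mirror-descent step
    mu = mu / mu.sum()             # renormalize to a probability measure (assumption)
```

Note that the iteration runs over a fixed grid of $n$ atoms, and covering a $d$-dimensional domain at resolution $h$ requires on the order of $h^{-d}$ atoms; this is the "computational cost of discretization, which can be exponential in $d$" mentioned in the abstract.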

Bibliographic Details
Main Author: Chizat, Lénaïc (Institute of Mathematics, École polytechnique fédérale de Lausanne (EPFL), Switzerland)
Format: Article
Language: English
Published: Université de Montpellier, 2023-01-01
Series: Open Journal of Mathematical Optimization
ISSN: 2777-5860
DOI: 10.5802/ojmo.20
Subjects: Gradient Descent; Convergence rate; Space of measures; Convex Optimization; Banach space
Online Access: https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.20/