Tight analyses for subgradient descent I: Lower bounds

Consider the problem of minimizing functions that are Lipschitz and convex, but not necessarily differentiable. We construct a function in this class for which the $T$th iterate of subgradient descent has error $\Omega(\log(T)/\sqrt{T})$. This matches a known upper bound of $O(\log(T)/\sqrt{T})$...
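The abstract concerns the final iterate of subgradient descent on Lipschitz convex functions. As a minimal sketch of the method being analyzed (the step size $\eta_t = 1/\sqrt{t+1}$ and the test function $f(x) = |x|$ are illustrative choices, not taken from the paper):

```python
import math

def subgradient_descent(subgrad, x0, T):
    """Run T steps of subgradient descent with step size 1/sqrt(t+1),
    returning the final (T-th) iterate -- the quantity the lower bound concerns."""
    x = x0
    for t in range(T):
        eta = 1.0 / math.sqrt(t + 1)
        x = x - eta * subgrad(x)
    return x

# f(x) = |x| is 1-Lipschitz, convex, and non-differentiable at 0.
# Its subgradient is sign(x) (with 0 a valid choice at x = 0).
subgrad_abs = lambda x: 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)

x_T = subgradient_descent(subgrad_abs, x0=5.3, T=10_000)
```

On this example the final iterate oscillates around the minimizer with amplitude on the order of the current step size, so $|x_T|$ is small after many iterations; the paper's point is that in the worst case over the function class, the final iterate's error can be a $\log T$ factor worse than the averaged or best iterate.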

Bibliographic Details
Main Authors: Harvey, Nicholas J. A., Liaw, Chris, Randhawa, Sikander
Format: Article
Language: English
Published: Université de Montpellier 2024-07-01
Series: Open Journal of Mathematical Optimization
Online Access: https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.31/