Tight analyses for subgradient descent I: Lower bounds
Consider the problem of minimizing functions that are Lipschitz and convex, but not necessarily differentiable. We construct a function from this class for which the $T$-th iterate of subgradient descent has error $\Omega(\log(T)/\sqrt{T})$. This matches a known upper bound of $O(\log(T)/\sqrt{T})$...
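To make the setting concrete, here is a minimal sketch of the algorithm the abstract refers to: projected subgradient descent with the classical $1/\sqrt{t}$ step size on a Lipschitz convex function. The function name `subgradient_descent`, the ball projection, and the example $f(x) = |x|$ are illustrative assumptions; this is not the hard instance constructed in the paper.

```python
import numpy as np

def subgradient_descent(subgrad, x0, T, R=1.0):
    """Projected subgradient descent over the Euclidean ball of radius R.

    Uses the classical step size eta_t = R / sqrt(t + 1). The paper's lower
    bound concerns the error of the final iterate x_T under this kind of
    schedule: Theta(log(T)/sqrt(T)) in the worst case over Lipschitz convex f.
    """
    x = np.asarray(x0, dtype=float)
    for t in range(T):
        g = subgrad(x)               # any subgradient of f at x
        eta = R / np.sqrt(t + 1)     # standard 1/sqrt(t) step size
        x = x - eta * g
        norm = np.linalg.norm(x)     # project back onto the feasible ball
        if norm > R:
            x = x * (R / norm)
    return x                         # final iterate x_T

# Example on the 1-Lipschitz, non-differentiable f(x) = |x|, minimized at 0;
# np.sign(0) = 0 is a valid subgradient at the kink.
x_T = subgradient_descent(lambda x: np.sign(x), x0=np.array([0.9]), T=10_000)
print(abs(x_T[0]))  # suboptimality f(x_T) - f(x*) on this easy instance
```

On this easy one-dimensional instance the final iterate converges; the paper's contribution is a carefully constructed function on which the final iterate cannot do better than $\Omega(\log(T)/\sqrt{T})$.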
Main Authors: | Harvey, Nicholas J. A.; Liaw, Chris; Randhawa, Sikander
---|---
Format: | Article
Language: | English
Published: | Université de Montpellier, 2024-07-01
Series: | Open Journal of Mathematical Optimization
Online Access: | https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.31/
Similar Items

- Smoothing gradient descent algorithm for the composite sparse optimization
  by: Wei Yang, et al.
  Published: (2024-11-01)
- Forest fire risk assessment model optimized by stochastic average gradient descent
  by: Zexin Fu, et al.
  Published: (2025-01-01)
- Descent among the Wayú. Concepts and social meanings
  by: Alessandro Mancuso
  Published: (2008-07-01)
- Comparison of the efficiency of zero and first order minimization methods in neural networks
  by: E. A. Gubareva, et al.
  Published: (2022-12-01)
- Upper and lower bounds for the blow-up time of a fourth-order parabolic equation with exponential nonlinearity
  by: Shuting Chang, et al.
  Published: (2024-11-01)