A note on approximate accelerated forward-backward methods with absolute and relative errors, and possibly strongly convex objectives
In this short note, we provide a simple version of an accelerated forward-backward method (a.k.a. Nesterov's accelerated proximal gradient method), possibly relying on approximate proximal operators and allowing one to exploit strong convexity of the objective function. The method supports both relative...
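To make the abstract's terminology concrete, the following is a minimal sketch of a standard accelerated forward-backward (FISTA-style) iteration with exact proximal steps, applied to a lasso-type problem. It is an illustration of the general technique only, not the specific approximate-prox scheme analyzed in the article; the function and variable names are chosen here for the example.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(grad_f, prox_g, L, x0, n_iter=200):
    """Accelerated forward-backward method for min_x f(x) + g(x),
    with f smooth (L-Lipschitz gradient) and g prox-friendly."""
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(n_iter):
        # Forward (gradient) step on f, backward (proximal) step on g.
        x_new = prox_g(y - grad_f(y) / L, 1.0 / L)
        # Nesterov momentum extrapolation.
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Example: min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient of f
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, tau: soft_threshold(v, lam * tau)
x_star = fista(grad_f, prox_g, L, np.zeros(5))
```

The article's contribution concerns variants of this scheme in which the proximal step is computed only approximately (with absolute or relative error criteria) and in which strong convexity of the objective can be exploited for faster rates.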
Main Authors: Barré, Mathieu; Taylor, Adrien; Bach, Francis
Format: Article
Language: English
Published: Université de Montpellier, 2022-01-01
Series: Open Journal of Mathematical Optimization
Online Access: https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.12/
Similar Items
- Feedback and formative assessment—looking backwards to move forward
  by: Nora McCarthy, et al.
  Published: (2025-02-01)
- An Accelerated Successive Convex Approximation Scheme With Exact Step Sizes for L1-Regression
  by: Lukas Schynol, et al.
  Published: (2025-01-01)
- A note on h-convex functions
  by: Mohammad W. Alomari
  Published: (2019-12-01)
- Characterizations of Stability of Error Bounds for Convex Inequality Constraint Systems
  by: Wei, Zhou, et al.
  Published: (2022-03-01)
- Absolutely on Music
  by: Murakami, Haruki
  Published: (2016)