An Accelerated Successive Convex Approximation Scheme With Exact Step Sizes for L1-Regression


Bibliographic Details
Main Authors: Lukas Schynol, Moritz Hemsing, Marius Pesavento
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Open Journal of Signal Processing
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10840211/
Description
Summary: We consider the minimization of <inline-formula><tex-math notation="LaTeX">$\ell _{1}$</tex-math></inline-formula>-regularized least-squares problems. A recent optimization approach uses successive convex approximations with an exact line search, which is highly competitive, especially in sparse problem instances. This work proposes an acceleration scheme for the successive convex approximation technique with negligible additional computational cost. We demonstrate this scheme by devising three related accelerated algorithms with provable convergence. The first introduces an additional descent step along the past optimization trajectory in the variable update, inspired by Nesterov's accelerated gradient method, and uses a closed-form step size. The second performs a simultaneous descent step along both the best response and the past trajectory, computing a two-dimensional step size, also in closed form. The third algorithm combines the previous two approaches. All algorithms are hyperparameter-free. Empirical results confirm that the acceleration approaches improve the convergence rate compared to benchmark algorithms, and that they retain the benefits of successive convex approximation even in non-sparse instances.
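To make the setting concrete, the following is a minimal sketch of a baseline successive-convex-approximation iteration with a closed-form exact step size for <inline-formula><tex-math notation="LaTeX">$\ell _{1}$</tex-math></inline-formula>-regularized least squares, in the STELA style that the abstract builds on. It is an illustrative assumption-laden sketch, not the paper's accelerated algorithms: each iteration solves a separable convex approximation in closed form (soft-thresholding) and then minimizes a convex upper bound of the objective along the update direction to obtain the step size.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: prox operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sca_l1_regression(A, b, lam, n_iter=200):
    """Sketch of SCA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Each iteration: (1) best response of a separable quadratic
    approximation, solvable by soft-thresholding; (2) closed-form
    "exact" step size minimizing a convex upper bound along the
    direction x_hat - x, clipped to [0, 1]. Illustrative only; the
    paper's accelerated variants add descent steps along the past
    optimization trajectory on top of this kind of update.
    """
    m, n = A.shape
    d = np.sum(A * A, axis=0)       # diag of A^T A (per-coordinate curvature)
    x = np.zeros(n)
    r = A @ x - b                   # residual, maintained incrementally
    for _ in range(n_iter):
        grad = A.T @ r
        # Best response of the separable approximation (closed form)
        x_hat = soft_threshold(x - grad / d, lam / d)
        dx = x_hat - x
        Adx = A @ dx
        denom = Adx @ Adx
        if denom == 0.0:            # no movement possible: stationary point
            break
        # Closed-form step size for the upper bound along dx
        gamma = -(grad @ dx
                  + lam * (np.linalg.norm(x_hat, 1) - np.linalg.norm(x, 1))) / denom
        gamma = min(max(gamma, 0.0), 1.0)
        x = x + gamma * dx
        r = r + gamma * Adx         # cheap residual update, avoids A @ x
    return x
```

The step-size numerator combines the directional derivative of the smooth term with the change in the (convexity-based) upper bound of the <inline-formula><tex-math notation="LaTeX">$\ell _{1}$</tex-math></inline-formula> term, which is what makes a closed-form expression possible despite the nonsmooth regularizer.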
ISSN:2644-1322