Adaptive proximal gradient methods are universal without approximation

Type: Preprint

Publication Date: 2024-02-09

Citations: 0

DOI: https://doi.org/10.48550/arxiv.2402.06271


Abstract

We show that adaptive proximal gradient methods for convex problems are not restricted to traditional Lipschitzian assumptions. Our analysis reveals that a class of linesearch-free methods is still convergent under mere local Hölder gradient continuity, covering in particular continuously differentiable semi-algebraic functions. To mitigate the lack of local Lipschitz continuity, popular approaches revolve around ε-oracles and/or linesearch procedures. In contrast, we exploit plain Hölder inequalities not entailing any approximation, all while retaining the linesearch-free nature of adaptive schemes. Furthermore, we prove full sequence convergence without prior knowledge of the local Hölder constants or of the order of Hölder continuity. In numerical experiments we present comparisons to baseline methods on diverse tasks from machine learning, covering both the locally and the globally Hölder setting.
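For readers who want a concrete picture of a linesearch-free adaptive proximal gradient iteration, the sketch below implements a generic scheme in the spirit of the adaptive methods listed under Similar Works (Malitsky-Mishchenko and Latafat et al.): the step size is estimated from observed gradient differences rather than from a known Lipschitz constant or a backtracking linesearch. This is an illustrative sketch only, not the stepsize rule analyzed in this preprint; the names grad_f, prox_g, gamma0 and the lasso example are assumptions made for the demonstration.

```python
# Illustrative sketch (not this paper's exact rule): a linesearch-free adaptive
# proximal gradient loop for min_x f(x) + g(x), with f smooth and g prox-friendly.
import numpy as np

def adaptive_proximal_gradient(grad_f, prox_g, x0, gamma0=1e-3, iters=1000):
    """Step sizes are estimated on the fly from gradient differences,
    with no global Lipschitz constant and no backtracking linesearch."""
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad_f(x_prev)
    gamma_prev = gamma0
    theta = np.inf  # ratio of consecutive step sizes; inf leaves the first growth cap inactive
    x = prox_g(x_prev - gamma_prev * g_prev, gamma_prev)
    for _ in range(iters):
        g = grad_f(x)
        dx = np.linalg.norm(x - x_prev)
        dg = np.linalg.norm(g - g_prev)
        # Local inverse-curvature estimate from the last two iterates.
        local = dx / (2.0 * dg) if dg > 0 else np.inf
        gamma = min(np.sqrt(1.0 + theta) * gamma_prev, local)
        if not np.isfinite(gamma):   # both estimates degenerate; keep the previous step size
            gamma = gamma_prev
        theta = gamma / gamma_prev
        x_prev, g_prev, gamma_prev = x, g, gamma
        x = prox_g(x - gamma * g, gamma)
    return x

# Example usage on a lasso problem: min_x 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((50, 20)), rng.standard_normal(50), 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda z, gamma: np.sign(z) * np.maximum(np.abs(z) - gamma * lam, 0.0)
x_hat = adaptive_proximal_gradient(grad_f, prox_g, np.zeros(20))
```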

Locations

  • arXiv (Cornell University)

Similar Works

  • Adaptive proximal algorithms for convex optimization under local Lipschitz continuity of the gradient (2024). Puya Latafat, Andreas Themelis, Lorenzo Stella, Panagiotis Patrinos
  • Adaptive proximal algorithms for convex optimization under local Lipschitz continuity of the gradient (2023). Puya Latafat, Andreas Themelis, Lorenzo Stella, Panagiotis Patrinos
  • Adaptive Proximal Gradient Method for Convex Optimization (2023). Yura Malitsky, Konstantin Mishchenko
  • Efficient implementation of incremental proximal-point methods (2022). Alex Shtoff
  • Adaptive Catalyst for Smooth Convex Optimization (2019). Anastasiya Ivanova, Dmitry Pasechnyuk, Dmitry Grishchenko, Egor Shulgin, Alexander Gasnikov, Vladislav Matyukhin
  • Adaptive Gradient Methods for Constrained Convex Optimization and Variational Inequalities (2021). Alina Ene, Huy L. Nguyễn, Adrian Vladu
  • AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods (2021). Zheng Shi, Abdurakhmon Sadiev, Nicolas Loizou, Peter Richtárik, Martin Takáč
  • Adaptive Gradient Methods for Constrained Convex Optimization and Variational Inequalities (2020). Alina Ene, Huy L. Nguyễn, Adrian Vladu
  • Proximal Gradient Algorithms Under Local Lipschitz Gradient Continuity (2022). Alberto De Marchi, Andreas Themelis
  • Safeguarding adaptive methods: global convergence of Barzilai-Borwein and other stepsize choices (2024). Ou Hongjia, Andreas Themelis
  • Fundamental Benefit of Alternating Updates in Minimax Optimization (2024). Jaewook Lee, Hanseul Cho, Chulhee Yun
  • Proximal Gradient Dynamics: Monotonicity, Exponential Convergence, and Applications (2024). Anand Gokhale, A. S. Davydov, Francesco Bullo
  • Escaping Saddle Points with Adaptive Gradient Methods (2019). Matthew Staib, Sashank J. Reddi, Satyen Kale, Sanjiv Kumar, Suvrit Sra
  • Acceleration Methods (2021). Alexandre d’Aspremont, Damien Scieur, Adrien Taylor
  • Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice (2017). Hongzhou Lin, Julien Mairal, Zaïd Harchaoui
  • On the convergence of adaptive first order methods: proximal gradient and alternating minimization algorithms (2023). Puya Latafat, Andreas Themelis, Panagiotis Patrinos
  • Adaptive and Universal Algorithms for Variational Inequalities with Optimal Convergence (2020). Alina Ene, Huy L. Nguyễn
  • Adaptive Gradient Methods for Constrained Convex Optimization (2020). Alina Ene, Huy L. Nguyễn, Adrian Vladu
  • Proximal Algorithms (2013). Neal Parikh, Stephen Boyd
  • Adaptive Gradient Descent without Descent (2019). Konstantin Mishchenko, Yura Malitsky

Cited by (0)

Citing (0)