A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications

Type: Article

Publication Date: 2016-11-11

Citations: 313

DOI: https://doi.org/10.1287/moor.2016.0817

Abstract

The proximal gradient method and its variants are among the most attractive first-order algorithms for minimizing the sum of two convex functions, one of which is nonsmooth. However, they require the differentiable part of the objective to have a Lipschitz continuous gradient, which precludes their use in many applications. In this paper we introduce a framework that circumvents the intricate question of Lipschitz continuity of gradients by using an elegant and easy-to-check convexity condition that captures the geometry of the constraints. This condition translates into a new descent lemma, which in turn leads to a natural derivation of the proximal-gradient scheme with Bregman distances. We then identify a new notion of asymmetry measure for Bregman distances, which is central in determining the relevant step size. These novelties allow us to prove a global sublinear rate of convergence and, as a by-product, global pointwise convergence. This opens a new path to a broad spectrum of problems arising in key applications that were, until now, considered out of reach for proximal gradient methods. We illustrate this potential by showing how our results can be applied to build new and simple schemes for Poisson inverse problems.
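As a minimal sketch of the objects behind the abstract (the notation below, with reference convex function h, smooth part g, nonsmooth part f, constant L > 0, and step size \lambda, is assumed standard rather than quoted from the paper), the convexity condition, the descent lemma it yields, and the resulting Bregman proximal-gradient step can be written as

\[
Lh - g \ \text{convex} \quad\Longrightarrow\quad
g(x) \le g(y) + \langle \nabla g(y),\, x - y \rangle + L\, D_h(x, y),
\]
\[
\text{where } D_h(x, y) = h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle
\ \text{is the Bregman distance generated by } h,
\]
\[
x^{k+1} \in \operatorname*{argmin}_{x} \Big\{ f(x) + \langle \nabla g(x^k),\, x - x^k \rangle + \tfrac{1}{\lambda}\, D_h(x, x^k) \Big\},
\qquad 0 < \lambda \le \tfrac{1}{L}.
\]

Taking h(x) = \tfrac{1}{2}\|x\|^2 recovers the classical descent lemma and proximal gradient step; for Poisson inverse problems a non-Euclidean kernel such as Burg's entropy h(x) = -\sum_i \log x_i is a natural choice, and the paper's step-size analysis further refines the admissible \lambda through the asymmetry measure of D_h.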

Locations

  • Toulouse 1 Capitole Publications (Université Toulouse I Capitole)
  • Mathematics of Operations Research

Similar Works

  • First Order Methods beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems (2017) - Jérôme Bolte, Shoham Sabach, Marc Teboulle, Yakov Vaisbourd
  • First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems (2018) - Jérôme Bolte, Shoham Sabach, Marc Teboulle, Yakov Vaisbourd
  • First-Order Methods for Convex Optimization (2021) - Pavel Dvurechensky, Mathias Staudigl, Shimrit Shtern
  • First-order noneuclidean splitting methods for large-scale optimization: deterministic and stochastic algorithms (2021) - Antonio Silveti-Falls
  • Error bounds, quadratic growth, and linear convergence of proximal methods (2016) - Dmitriy Drusvyatskiy, Adrian S. Lewis
  • Convergent Bregman Plug-and-Play Image Restoration for Poisson Inverse Problems (2023) - Samuel Hurault, Ulugbek S. Kamilov, Arthur Leclaire, Nicolas Papadakis
  • Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods (2018) - Dmitriy Drusvyatskiy, Adrian S. Lewis
  • Delay-tolerant distributed Bregman proximal algorithms (2024) - Sélim Chraibi, Franck Iutzeler, Jérôme Malick, Alexander Rogozin
  • BISTA: a Bregmanian proximal gradient method without the global Lipschitz continuity assumption (2018) - Daniel Reem, Simeon Reich, Alvaro R. De Pierro
  • Proximal Gradient Methods for Machine Learning and Imaging (2021) - Saverio Salzo, Silvia Villa
  • Proximal or gradient steps for cocoercive operators (2021) - Luis M. Briceño-Arias, Nelly Pustelnik
  • Monotone Lipschitz-Gradient Denoiser: Explainability of Operator Regularization Approaches and Convergence to Optimal Point (2024) - Masahiro Yukawa, Isao Yamada
  • Semi-proximal Mirror-Prox for Nonsmooth Composite Minimization (2015) - Niao He, Zaïd Harchaoui
  • Proximal Splitting Derivatives for Risk Estimation (2012) - Charles‐Alban Deledalle, Samuel Vaiter, Gabriel Peyré, Jalal Fadili, Ch. Dossal
  • Level-Set Subdifferential Error Bounds and Linear Convergence of Bregman Proximal Gradient Method (2021) - Daoli Zhu, Sien Deng, Minghua Li, Lei Zhao

Works That Cite This (277)

  • Accelerated Bregman proximal gradient methods for relatively smooth convex optimization (2021) - Filip Hanzely, Peter Richtárik, Lin Xiao
  • A Unified Convergence Analysis of Stochastic Bregman Proximal Gradient and Extragradient Methods (2021) - Xiantao Xiao
  • Stochastic Mirror Descent for Low-Rank Tensor Decomposition Under Non-Euclidean Losses (2022) - Wenqiang Pu, Shahana Ibrahim, Xiao Fu, Mingyi Hong
  • Bregman Proximal Gradient Algorithm with Extrapolation for a class of Nonconvex Nonsmooth Minimization Problems (2019) - Xiaoya Zhang, Roberto Barrio, M. A. Martínez, Hao Jiang, Lizhi Cheng
  • On inexact solution of auxiliary problems in tensor methods for convex optimization (2020) - Geovani Nunes Grapiglia, Yu. Nesterov
  • Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization (2019) - Masoud Ahookhosh, Le Thi Khanh Hien, Nicolas Gillis, Panagiotis Patrinos
  • Convex-Concave Backtracking for Inertial Bregman Proximal Gradient Algorithms in Non-Convex Optimization (2019) - Mahesh Chandra Mukkamala, Peter Ochs, Thomas Pock, Shoham Sabach
  • Nonlinear Forward-Backward Splitting with Projection Correction (2019) - Pontus Giselsson
  • Some Adaptive First-Order Methods for Variational Inequalities with Relatively Strongly Monotone Operators and Generalized Smoothness (2022) - Seydamet S. Ablaev, A. Titov, Fedor Stonyakin, Mohammad Alkousa, Alexander Gasnikov
  • Screening for a Reweighted Penalized Conditional Gradient Method (2022) - Yifan Sun, Francis Bach