First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems

Type: Article

Publication Date: 2018-01-01

Citations: 156

DOI: https://doi.org/10.1137/17m1138558

Abstract

We focus on nonconvex and nonsmooth minimization problems with a composite objective, where the differentiable part of the objective is freed from the usual and restrictive global Lipschitz gradient continuity assumption. This longstanding smoothness restriction is pervasive in first order methods and was recently circumvented for convex composite optimization by Bauschke, Bolte, and Teboulle through a simple framework that captures, all at once, the geometry of the function and of the feasible set. Building on this work, we tackle genuinely nonconvex problems. We first complement and extend their approach to derive an extended descent lemma by introducing the notion of smooth adaptable functions. We then consider a Bregman-based proximal gradient method for the nonconvex composite model with smooth adaptable functions, which is proven to globally converge to a critical point under natural assumptions on the problem's data, in particular for semialgebraic problems. To illustrate the potential of our general framework and results, we consider a broad class of quadratic inverse problems with sparsity constraints, which arises in many fundamental applications, and we apply our approach to derive new globally convergent schemes for this class.
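To make the scheme concrete, here is a minimal Python sketch of one such Bregman proximal gradient iteration, applied to a sparsity-regularized quadratic inverse problem of the form min_x theta*||x||_1 + (1/4) * sum_i ((a_i^T x)^2 - b_i)^2 with the kernel h(x) = (1/4)||x||^4 + (1/2)||x||^2. This is an illustrative reconstruction from the abstract, not the authors' reference code: the function names are ours, and the adaptability constant L below is a crude sufficient bound assumed from the standard Hessian estimates ||∇²g(x)|| <= sum_i (3||a_i||^2 ||x||^2 + |b_i|) ||a_i||^2 and ∇²h(x) ⪰ (||x||^2 + 1) I, not a constant quoted from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Componentwise soft-thresholding: the prox of tau * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def positive_cubic_root(c):
    """Unique root r >= 0 of r^3 + r = c for c >= 0, via Cardano's formula."""
    d = np.sqrt(c * c / 4.0 + 1.0 / 27.0)  # discriminant term; always positive
    return np.cbrt(c / 2.0 + d) + np.cbrt(c / 2.0 - d)

def bregman_proximal_gradient(A, b, theta, n_iter=500, x0=None):
    """Bregman proximal gradient sketch for
        min_x  theta * ||x||_1 + 0.25 * sum_i ((a_i^T x)^2 - b_i)^2
    with kernel h(x) = 0.25 * ||x||^4 + 0.5 * ||x||^2 (an assumption-laden
    illustration, not the paper's reference implementation).
    """
    m, n = A.shape
    if x0 is None:
        # Random start: the origin is a stationary point of this model,
        # so starting at zero would make the iteration stall there.
        x0 = np.random.default_rng(0).standard_normal(n) / np.sqrt(n)
    x = x0.astype(float).copy()
    # Assumed smooth-adaptability constant: L*h - g is convex whenever
    # L >= sum_i (3*||a_i||^2 + |b_i|) * ||a_i||^2 (crude Hessian bound).
    row_sq = np.sum(A * A, axis=1)
    L = np.sum((3.0 * row_sq + np.abs(b)) * row_sq)
    lam = 0.99 / L                           # step size with 0 < lam*L < 1
    for _ in range(n_iter):
        Ax = A @ x
        grad_g = A.T @ ((Ax ** 2 - b) * Ax)  # gradient of the quartic data term
        grad_h = (x @ x + 1.0) * x           # gradient of the kernel h
        v = grad_h - lam * grad_g            # mirror-descent style target point
        s = soft_threshold(v, lam * theta)   # handles the l1 term
        # Invert grad_h in closed form: x_next = s / (r^2 + 1), where
        # r = ||x_next|| solves the scalar cubic r^3 + r = ||s||.
        r = positive_cubic_root(np.linalg.norm(s))
        x = s / (r * r + 1.0)
    return x
```

Each iteration costs two matrix-vector products plus one scalar cubic solve, since the gradient of this particular kernel h can be inverted in closed form; no global Lipschitz constant or line search enters, which is the point of the smooth adaptable framework.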

Locations

  • arXiv (Cornell University)
  • Toulouse Capitole Publications (University Toulouse 1 Capitole)
  • SIAM Journal on Optimization

Similar Works

  • First Order Methods beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems (2017) by Jérôme Bolte, Shoham Sabach, Marc Teboulle, Yakov Vaisbourd
  • A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications (2016) by Heinz H. Bauschke, Jérôme Bolte, Marc Teboulle
  • First-Order Methods for Convex Optimization (2021) by Pavel Dvurechensky, Mathias Staudigl, Shimrit Shtern
  • Theoretical and numerical comparison of first order algorithms for cocoercive equations and smooth convex optimization (2022) by Luis M. Briceño-Arias, Nelly Pustelnik
  • First-Order Methods for Nonconvex Quadratic Minimization (2020) by Yair Carmon, John C. Duchi
  • Revisiting Linearized Bregman Iterations under Lipschitz-like Convexity Condition (2022) by Hui Zhang, Lu Zhang, Haoxing Yang
  • Alternating and Parallel Proximal Gradient Methods for Nonsmooth, Nonconvex Minimax: A Unified Convergence Analysis (2024) by Eyal Cohen, Marc Teboulle
  • Black-box Optimization Algorithms for Regularized Least-squares Problems (2024) by Y. Liu, K.C. Lam, Lindon Roberts
  • Proximal or gradient steps for cocoercive operators (2021) by Luis M. Briceño-Arias, Nelly Pustelnik
  • Bregman forward-backward splitting for nonconvex composite optimization: superlinear convergence to nonisolated critical points (2019) by Masoud Ahookhosh, Andreas Themelis, Panagiotis Patrinos
  • A Proximal Quasi-Newton Trust-Region Method for Nonsmooth Regularized Optimization (2021) by Aleksandr Y. Aravkin, Robert Baraldi, Dominique Orban
  • Proximal envelopes: Smooth optimization algorithms for nonsmooth problems (2017) by Lorenzo Stella
  • Linearization Algorithms for Fully Composite Optimization (2023) by Maria-Luiza Vladarean, Nikita Doikov, Martin Jaggi, Nicolas Flammarion
  • Global Convergence of Model Function Based Bregman Proximal Minimization Algorithms (2020) by Mahesh Chandra Mukkamala, Jalal Fadili, Peter Ochs
  • Strong Metric (Sub)regularity of KKT Mappings for Piecewise Linear-Quadratic Convex-Composite Optimization (2018) by James V. Burke, Abraham Engle

Works That Cite This (137)

  • Variable Metric Forward-Backward Algorithm for Composite Minimization Problems (2021) by Audrey Repetti, Yves Wiaux
  • A Unified Convergence Analysis of Stochastic Bregman Proximal Gradient and Extragradient Methods (2021) by Xiantao Xiao
  • Stochastic Mirror Descent for Low-Rank Tensor Decomposition Under Non-Euclidean Losses (2022) by Wenqiang Pu, Shahana Ibrahim, Xiao Fu, Mingyi Hong
  • Convergent Plug-and-Play with Proximal Denoiser and Unconstrained Regularization Parameter (2024) by Samuel Hurault, Antonin Chambolle, Arthur Leclaire, Nicolas Papadakis
  • A Bregman inertial forward-reflected-backward method for nonconvex minimization (2023) by Xianfu Wang, Ziyuan Wang
  • Bregman Proximal Gradient Algorithm with Extrapolation for a class of Nonconvex Nonsmooth Minimization Problems (2019) by Xiaoya Zhang, Roberto Barrio, M. A. Martínez, Hao Jiang, Lizhi Cheng
  • On inexact solution of auxiliary problems in tensor methods for convex optimization (2020) by Geovani Nunes Grapiglia, Yu. Nesterov
  • Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization (2019) by Masoud Ahookhosh, Le Thi Khanh Hien, Nicolas Gillis, Panagiotis Patrinos
  • Convex-Concave Backtracking for Inertial Bregman Proximal Gradient Algorithms in Non-Convex Optimization (2019) by Mahesh Chandra Mukkamala, Peter Ochs, Thomas Pock, Shoham Sabach
  • On $\ell_{0}$ Bregman-Relaxations for Kullback-Leibler Sparse Regression (2024) by M'hamed Essafri, Luca Calatroni, Emmanuel Soubies