A geometric alternative to Nesterov's accelerated gradient descent

Type: Preprint

Publication Date: 2015-06-26

Citations: 98

Locations

  • arXiv (Cornell University)
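
For context, the paper proposes a geometric method that matches the convergence rate of Nesterov's accelerated gradient descent on smooth, strongly convex functions. The sketch below is a minimal Python implementation of the classical Nesterov baseline (constant-momentum variant), not the paper's geometric method; the function name nesterov_agd, the quadratic test problem, and the fixed 1/L step size are assumptions made for illustration.

    import numpy as np

    def nesterov_agd(grad, x0, L, mu, iters=200):
        # Nesterov's accelerated gradient descent for an L-smooth,
        # mu-strongly convex objective (constant-momentum variant).
        # Illustrative sketch only; not the paper's geometric method.
        kappa = L / mu                                       # condition number
        beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)   # momentum weight
        x = x0.copy()
        y = x0.copy()
        for _ in range(iters):
            x_next = y - grad(y) / L          # gradient step at the extrapolated point
            y = x_next + beta * (x_next - x)  # momentum extrapolation
            x = x_next
        return x

    # Hypothetical test: f(x) = 0.5 * x'Ax with eigenvalues in [1, 100], so kappa = 100.
    rng = np.random.default_rng(0)
    Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
    A = Q @ np.diag(np.linspace(1.0, 100.0, 50)) @ Q.T
    x = nesterov_agd(lambda v: A @ v, rng.standard_normal(50), L=100.0, mu=1.0)
    print(np.linalg.norm(x))  # distance to the minimizer 0; should be near zero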

Similar Works

  • A geometric alternative to Nesterov's accelerated gradient descent (2015). Sébastien Bubeck, Yin Tat Lee, Mohit Singh.
  • Nesterov Acceleration for Riemannian Optimization (2022). Jungbin Kim, Insoon Yang.
  • Optimal Convergence Rates for Nesterov Acceleration (2019). Jean-François Aujol, Charles Dossal, Aude Rondepierre.
  • Optimal convergence rates for Nesterov acceleration (2018). Jean-François Aujol, Charles Dossal, Aude Rondepierre.
  • Accelerated Bregman proximal gradient methods for relatively smooth convex optimization (2021). Filip Hanzely, Peter Richtárik, Lin Xiao.
  • Accelerated Bregman Proximal Gradient Methods for Relatively Smooth Convex Optimization (2018). Filip Hanzely, Peter Richtárik, Lin Xiao.
  • Understanding Nesterov's Acceleration via Proximal Point Method (2020). Kwangjun Ahn, Suvrit Sra.
  • Nesterov Acceleration for Equality-Constrained Convex Optimization via Continuously Differentiable Penalty Functions (2020). Priyank Srivastava, Jorge Cortés.
  • A single potential governing convergence of conjugate gradient, accelerated gradient and geometric descent (2017). Sahar Karimi, Stephen A. Vavasis.
  • Towards Riemannian Accelerated Gradient Methods (2018). Hongyi Zhang, Suvrit Sra.
  • "Convex Until Proven Guilty": Dimension-Free Acceleration of Gradient Descent on Non-Convex Functions (2017). Yair Carmon, Oliver Hinder, John C. Duchi, Aaron Sidford.
+ "Convex Until Proven Guilty": Dimension-Free Acceleration of Gradient Descent on Non-Convex Functions 2017 Yair Carmon
Oliver Hinder
John C. Duchi
Aaron Sidford
  • Convex until proven guilty: dimension-free acceleration of gradient descent on non-convex functions (2017). Yair Carmon, John C. Duchi, Oliver Hinder, Aaron Sidford.
  • An accelerated minimal gradient method with momentum for strictly convex quadratic optimization (2021). Harry Oviedo, Oscar Dalmau, Rafael Herrera.
  • NEON+: Accelerated Gradient Methods for Extracting Negative Curvature for Non-Convex Optimization (2017). Yi Xu, Rong Jin, Tianbao Yang.
  • A novel, simple interpretation of Nesterov's accelerated method as a combination of gradient and mirror descent (2014). Zeyuan Allen-Zhu, Lorenzo Orecchia.
  • Accelerated optimization algorithms and ordinary differential equations: the convex non Euclidean case (2024). Paul Dobson, J. M. Sanz-Serna, Konstantinos C. Zygalakis.
  • A Generalized Accelerated Composite Gradient Method: Uniting Nesterov's Fast Gradient Method and FISTA (2020). Mihai I. Florea, Sergiy A. Vorobyov.

Cited by (86)

  • Efficient first-order methods for convex minimization: a constructive approach (2019). Yoel Drori, Adrien Taylor.
  • Continuous-Time Accelerated Methods via a Hybrid Control Lens (2019). Arman Sharifi Kolarijani, Peyman Mohajerin Esfahani, Tamás Keviczky.
  • A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip (2021). Mathieu Even, Raphaël Berthier, Francis Bach, Nicolas Flammarion, Pierre Gaillard, Hadrien Hendrikx, Laurent Massoulié, Adrien Taylor.
  • Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences (2021). Majid Jahani, Naga V. C. Gudapati, Chenxin Ma, Rachael Tappenden, Martin Takáč.
  • Variational PDEs for Acceleration on Manifolds and Application to Diffeomorphisms (2018). Ganesh Sundaramoorthi, Anthony Yezzi.
  • Even faster accelerated coordinate descent using non-uniform sampling (2016). Zeyuan Allen-Zhu, Zheng Qu, Peter Richtárik, Yuan Yang.
  • Acceleration in First Order Quasi-strongly Convex Optimization by ODE Discretization (2019). Jingzhao Zhang, Suvrit Sra, Ali Jadbabaie.
  • A Unifying Framework of Accelerated First-Order Approach to Strongly Monotone Variational Inequalities (2021). Kevin Huang, Shuzhong Zhang.
  • Chebyshev Center of the Intersection of Balls: Complexity, Relaxation and Approximation (2019). Yong Xia, Meijia Yang, Shu Wang.
  • A variable metric mini-batch proximal stochastic recursive gradient algorithm with diagonal Barzilai-Borwein stepsize (2020). Tengteng Yu, Xinwei Liu, Yu-Hong Dai, Jie Sun.
  • A Faster Cutting Plane Method and its Implications for Combinatorial and Convex Optimization (2015). Yin Tat Lee, Aaron Sidford, Sam Chiu-wai Wong.
  • Generalized Momentum-Based Methods: A Hamiltonian Perspective (2019). Jelena Diakonikolas, Michael I. Jordan.
  • A Dynamical Systems Perspective on Nesterov Acceleration (2019). Michael Muehlebach, Michael I. Jordan.
  • Asymptotic Analysis via Stochastic Differential Equations of Gradient Descent Algorithms in Statistical and Computational Paradigms (2017). Yazhen Wang.
  • Potential Function-based Framework for Making the Gradients Small in Convex and Min-Max Optimization (2021). Jelena Diakonikolas, Puqian Wang.
  • Proximal methods for sparse optimal scoring and discriminant analysis (2022). Summer Atkins, Guðmundur Einarsson, Line Katrine Harder Clemmensen, Brendan Ames.
  • Tensor optimal transport, distance between sets of measures and tensor scaling (2020). Shmuel Friedland.
  • Potential-based analyses of first-order methods for constrained and composite optimization (2019). Courtney Paquette, Stephen A. Vavasis.
  • The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods (2019). Jelena Diakonikolas, Lorenzo Orecchia.
  • The Fastest Known Globally Convergent First-Order Method for Minimizing Strongly Convex Functions (2017). Bryan Van Scoy, Randy A. Freeman, Kevin Lynch.
  • The condition number of a function relative to a set (2019). David H. Gutman, Javier Peña.
  • Convergence of first-order methods via the convex conjugate (2017). Javier Peña.
  • Convergence rates of proximal gradient methods via the convex conjugate (2018). David H. Gutman, Javier Peña.
  • Perturbed Fenchel duality and first-order methods (2018). David H. Gutman, Javier Peña.
  • The condition of a function relative to a polytope (2018). David H. Gutman, Javier Peña.
  • Direct Runge-Kutta Discretization Achieves Acceleration (2018). Jingzhao Zhang, Aryan Mokhtari, Suvrit Sra, Ali Jadbabaie.
  • Accelerated Optimization in the PDE Framework: Formulations for the Manifold of Diffeomorphisms (2022). Ganesh Sundaramoorthi, Anthony Yezzi, Minas Benyamin.
  • Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization (2017). Adrien Taylor, Julien M. Hendrickx, François Glineur.
  • A New Class of Composite Objective Multi-step Estimating-sequence Techniques (COMET) (2021). Endrit Dosti, Sergiy A. Vorobyov, Themistoklis Charalambous.
  • Accelerated Methods for Non-Convex Optimization (2016). Yair Carmon, John C. Duchi, Oliver Hinder, Aaron Sidford.
  • Accelerated Gradient Descent Escapes Saddle Points Faster than Gradient Descent (2017). Chi Jin, Praneeth Netrapalli, Michael I. Jordan.
  • An Optimal High-Order Tensor Method for Convex Optimization (2018). Bo Jiang, Haoyue Wang, Shuzhong Zhang.
  • Dissipativity Theory for Nesterov's Accelerated Method (2017). Bin Hu, Laurent Lessard.
  • A Variational Perspective on Accelerated Methods in Optimization (2016). Andre Wibisono, Ashia C. Wilson, Michael I. Jordan.
  • The Common-directions Method for Regularized Empirical Risk Minimization (2019). Po-Wei Wang, Ching-pei Lee, Chih-Jen Lin.
  • Near-Optimal Methods for Minimizing Star-Convex Functions and Beyond (2019). Oliver Hinder, Aaron Sidford, Nimit S. Sohoni.
  • Fast and Safe: Accelerated gradient methods with optimality certificates and underestimate sequences (2017). Majid Jahani, Naga V. C. Gudapati, Chenxin Ma, Rachael Tappenden, Martin Takáč.
  • An Explicit Convergence Rate for Nesterov's Method from SDP (2018). Sam Safavi, Bikash Joshi, Guilherme França, José Bento.
  • An optimal first order method based on optimal quadratic averaging (2016). Dmitriy Drusvyatskiy, Maryam Fazel, Scott Roy.
  • Acceleration Methods (2021). Alexandre d'Aspremont, Damien Scieur, Adrien Taylor.