A differential equation for modeling Nesterov's accelerated gradient method: theory and insights
Weijie Su, Stephen Boyd, Emmanuel J. Candès
Type: Article
Publication Date: 2016-01-01
Citations: 533
Location: Journal of Machine Learning Research
Similar Works
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights (2015). Weijie Su, Stephen Boyd, Emmanuel J. Candès.
- A Differential Equation for Modeling Nesterov's Accelerated Gradient Method: Theory and Insights (2014). Weijie Su, Stephen Boyd, Emmanuel J. Candès.
- A differential equation for modeling Nesterov's accelerated gradient method (2016). Weijie Su, Stephen Boyd, Emmanuel J. Candès.
- Understanding the Acceleration Phenomenon via High-Resolution Differential Equations (2018). Bin Shi, Simon S. Du, Michael I. Jordan, Weijie Su.
- Understanding the acceleration phenomenon via high-resolution differential equations (2021). Bin Shi, Simon S. Du, Michael I. Jordan, Weijie Su.
- Generalized Continuous-Time Models for Nesterov's Accelerated Gradient Methods (2024). Chanwoong Park, Youngchae Cho, Insoon Yang.
- Nesterov's method with decreasing learning rate leads to accelerated stochastic gradient descent (2019). Maxime Laborde, Adam M. Oberman.
- Comparative analysis of accelerated gradient algorithms for convex optimization: high and super resolution ODE approach (2024). Samir Adly, Hédy Attouch, Jalal Fadili.
- Unifying Nesterov's Accelerated Gradient Methods for Convex and Strongly Convex Objective Functions: From Continuous-Time Dynamics to Discrete-Time Algorithms (2023). Jungbin Kim, Insoon Yang.
- Accelerated optimization algorithms and ordinary differential equations: the convex non-Euclidean case (2024). Paul Dobson, J. M. Sanz‐Serna, Konstantinos C. Zygalakis.
- On Accelerated Methods in Optimization (2015). Andre Wibisono, Ashia C. Wilson.
- Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method (2017). Jelena Diakonikolas, Lorenzo Orecchia.
- Understanding Accelerated Gradient Methods: Lyapunov Analyses and Hamiltonian Assisted Interpretations (2023). Penghui Fu, Zhiqiang Tan.
- ODE Discretization Schemes as Optimization Algorithms (2022). Orlando Romero, Mouhacine Benosman, George J. Pappas.
- A Continuous-time Perspective for Modeling Acceleration in Riemannian Optimization (2019). Foivos Alimisis, Antonio Orvieto, Gary Bécigneul, Aurélien Lucchi.
- A Continuous-time Perspective for Modeling Acceleration in Riemannian Optimization (2020). Foivos Alimisis, Antonio Orvieto, Gary Bécigneul, Aurélien Lucchi.
- Gradient Norm Minimization of Nesterov Acceleration: $o(1/k^3)$ (2022). Shuo Chen, Bin Shi, Ya-xiang Yuan.
- From the Ravine method to the Nesterov method and vice versa: a dynamical system perspective (2022). H. Attouch, J. Fadili.
- A Dynamical Systems Perspective on Nesterov Acceleration (2019). Michael Muehlebach, Michael I. Jordan.
Works That Cite This (463)
- Efficient first-order methods for convex minimization: a constructive approach (2019). Yoel Drori, Adrien Taylor.
- Lagrangian Penalization Scheme with Parallel Forward–Backward Splitting (2018). Cesare Molinari, Juan Peypouquet.
- Generalized affine scaling algorithms for linear programming problems (2019). Md Sarowar Morshed, Md. Noor‐E‐Alam.
- Continuous-Time Accelerated Methods via a Hybrid Control Lens (2019). Arman Sharifi Kolarijani, Peyman Mohajerin Esfahani, Tamás Keviczky.
- Asymptotic analysis of a structure-preserving integrator for damped Hamiltonian systems (2020). Adrian Viorel, Cristian Daniel Alecsa, Titus Pinţa.
- Inertial Newton Algorithms Avoiding Strict Saddle Points (2023). Camille Castera.
- Continuous-time Lower Bounds for Gradient-based Algorithms (2020). Michael Muehlebach, Michael I. Jordan.
- A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip (2021). Mathieu Even, Raphaël Berthier, Francis Bach, Nicolas Flammarion, Pierre Gaillard, Hadrien Hendrikx, Laurent Massoulié, Adrien Taylor.
- Convex-Concave Backtracking for Inertial Bregman Proximal Gradient Algorithms in Non-Convex Optimization (2019). Mahesh Chandra Mukkamala, Peter Ochs, Thomas Pock, Shoham Sabach.
- FISTA: achieving a rate of convergence proportional to $k^{-3}$ for small/medium values of $k$ (2019). Gustavo Silva, Paul Rodríguez.
Works Cited by This (16)
- SLOPE—Adaptive variable selection via convex optimization (2015). Małgorzata Bogdan, E. van den Berg, Chiara Sabatti, Weijie Su, Emmanuel J. Candès.
- A smooth vector field for quadratic programming (2012). Hans-Bernd Dürr, Erkin Saka, Christian Ebenbauer.
- On a Class of Smooth Optimization Algorithms with Applications in Control (2012). Hans-Bernd Dürr, Christian Ebenbauer.
- Gradient methods for minimizing composite functions (2012). Yu. Nesterov.
- Optimization and Dynamical Systems (1994). Uwe Helmke, J. B. Moore.
- A dynamical systems approach to constrained minimization (2000). Johannes Schropp, Ivan Singer.
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems (2009). Amir Beck, Marc Teboulle.
- Approximation accuracy, gradient methods, and error bound for structured convex optimization (2010). Paul Tseng.
- Adaptive Restart for Accelerated Gradient Schemes (2013). Brendan O'Donoghue, Emmanuel J. Candès.
- Smooth minimization of non-smooth functions (2004). Yu. Nesterov.