Accelerated Optimization with Orthogonality Constraints

Type: Article

Publication Date: 2020-11-04

Citations: 16

DOI: https://doi.org/10.4208/jcm.1911-m2018-0242

Abstract

We develop a generalization of Nesterov's accelerated gradient descent method designed to handle orthogonality constraints. To demonstrate its effectiveness, we perform numerical experiments showing that the number of iterations scales with the square root of the condition number, and we compare against existing state-of-the-art quasi-Newton methods on the Stiefel manifold. Our experiments show that our method outperforms these quasi-Newton methods on some large, ill-conditioned problems.
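To make the setting concrete, the following is a minimal sketch (not the paper's algorithm) of how Nesterov-style momentum can be combined with a QR retraction so that iterates stay on the Stiefel manifold St(n, p) = {X : XᵀX = I}. The test objective f(X) = tr(XᵀAX), whose minimizers span eigenvectors of the smallest eigenvalues of A, and all step-size and momentum values are illustrative assumptions.

```python
import numpy as np

def qr_retract(Y):
    """Map an n x p matrix back onto the Stiefel manifold via thin QR."""
    Q, R = np.linalg.qr(Y)
    # Fix column signs (sign of diag(R), with 0 mapped to +1) so the
    # retraction is uniquely defined.
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)

def nesterov_stiefel(A, p, steps=500, lr=1e-3, momentum=0.9, seed=0):
    """Illustrative momentum descent for f(X) = tr(X^T A X) on St(n, p)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X = qr_retract(rng.standard_normal((n, p)))
    V = np.zeros_like(X)                          # momentum buffer
    for _ in range(steps):
        Y = qr_retract(X + momentum * V)          # Nesterov look-ahead point
        G = 2.0 * A @ Y                           # Euclidean gradient at Y
        G = G - Y @ ((Y.T @ G + G.T @ Y) / 2.0)   # project to tangent space
        V = momentum * V - lr * G
        X = qr_retract(X + V)                     # retract back to St(n, p)
    return X

rng = np.random.default_rng(1)
M = rng.standard_normal((50, 50))
A = M @ M.T                                       # symmetric PSD test matrix
X = nesterov_stiefel(A, p=3)
print(np.linalg.norm(X.T @ X - np.eye(3)))        # orthogonality error, ~0
```

The QR retraction is one standard choice among several (polar decomposition and Cayley transforms are common alternatives); the paper's actual accelerated scheme and parameter choices differ from this toy version.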

Locations

  • Journal of Computational Mathematics
  • arXiv (Cornell University)
  • DataCite API

Similar Works

  • Structured Quasi-Newton Methods for Optimization with Orthogonality Constraints (2018). Jiang Hu, Bo Jiang, Lin Lin, Zaiwen Wen, Ya-xiang Yuan
  • A Projection Method for Optimization Problems on the Stiefel Manifold (2017). Oscar Dalmau, Harry Oviedo
  • Multipliers Correction Methods for Optimization Problems over the Stiefel Manifold (2020). Lei Wang, Bin Gao, Xin Liu
  • Multipliers Correction Methods for Optimization Problems over the Stiefel Manifold (2021). Lei Wang, Bin Gao, Xin Liu
  • A Non-monotone Linear Search Method with Mixed Direction on Stiefel Manifold (2017). Harry Oviedo, Hugo Lara, Oscar Dalmau
  • On the transient growth of Nesterov’s accelerated method for strongly convex optimization problems (2020). Samantha Samuelson, Hesameddin Mohammadi, Mihailo R. Jovanović
  • Objective acceleration for unconstrained optimization (2018). Asbjørn Nilsen Riseth
  • Online Regularized Nonlinear Acceleration (2018). Damien Scieur, Edouard Oyallon, Alexandre d’Aspremont, Francis Bach
  • An accelerated minimal gradient method with momentum for strictly convex quadratic optimization (2021). Harry Oviedo, Oscar Dalmau, Rafael Herrera
  • A non-monotone linear search algorithm with mixed direction on Stiefel manifold (2018). Harry Oviedo, Hugo Lara, Oscar Dalmau
  • On Adapting Nesterov's Scheme to Accelerate Iterative Methods for Linear Problems (2021). Tao Hong, Irad Yavneh
  • Action constrained quasi-Newton methods (2014). Robert M. Gower, Jacek Gondzio
  • A Scaled Gradient Projection Method for Minimization over the Stiefel Manifold (2019). Harry Oviedo, Oscar Dalmau
  • A Fast Anderson-Chebyshev Acceleration for Nonlinear Optimization (2018). Zhize Li, Jian Li
  • Acceleration Methods (2021). Alexandre d’Aspremont, Damien Scieur, Adrien Taylor
  • Structured Quasi-Newton Methods for Optimization with Orthogonality Constraints (2019). Jiang Hu, Bo Jiang, Lin Lin, Zaiwen Wen, Ya-xiang Yuan
  • Accelerated conjugate direction methods for unconstrained optimization (1978). Melanie L. Lenard
  • On adapting Nesterov's scheme to accelerate iterative methods for linear problems (2021). Tao Hong, Irad Yavneh