A sub-sampled tensor method for nonconvex optimization

Type: Article

Publication Date: 2022-08-24

Citations: 1

DOI: https://doi.org/10.1093/imanum/drac057

Abstract

A significant theoretical advantage of high-order optimization methods is their superior convergence guarantees. For instance, third-order regularized methods reach an $(\epsilon_1, \epsilon_2, \epsilon_3)$-third-order critical point in at most ${\mathcal{O}}\big(\max\big(\epsilon_1^{-4/3}, \epsilon_2^{-2}, \epsilon_3^{-4}\big)\big)$ iterations. However, the cost of computing high-order derivatives is prohibitively expensive in practice, including in many real-world machine learning tasks. To address this problem, we present a sub-sampled optimization method that uses a third-order regularized model to find local minima of smooth and potentially nonconvex objective functions with a finite-sum structure. This algorithm uses sub-sampled derivatives instead of exact quantities and is guaranteed to converge to a third-order critical point. Our analysis relies on a novel tensor concentration inequality for sums of tensors of any order that makes explicit use of the finite-sum structure of the objective function.
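For context, a minimal sketch of the kind of regularized third-order model that such methods minimize at each iteration, assuming the standard finite-sum setting $f(x) = \frac{1}{n}\sum_{i=1}^{n} f_i(x)$ (the exact sub-sampled model and constants used in the paper may differ), is

$$m(s) \;=\; f(x) + g_S^{\top} s + \tfrac{1}{2}\, s^{\top} B_S\, s + \tfrac{1}{6}\, T_S[s,s,s] + \tfrac{\sigma}{4}\,\|s\|^{4},$$

where $g_S$, $B_S$ and $T_S$ denote gradient, Hessian and third-derivative estimates formed from sub-sampled index sets rather than from all $n$ component functions, and $\sigma > 0$ is a regularization parameter (conventions for the constant in front of $\|s\|^{4}$ vary).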

Locations

  • IMA Journal of Numerical Analysis
  • arXiv (Cornell University)

Similar Works

  • A Sub-sampled Tensor Method for Non-convex Optimization (2019). Aurélien Lucchi, Jonas Köhler
  • A Stochastic Tensor Method for Non-convex Optimization (2019). Aurélien Lucchi, Jonas Köhler
  • Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives (2020). Geovani Nunes Grapiglia, Yu. Nesterov
  • The global rate of convergence for optimal tensor methods in smooth convex optimization (2018). Alexander Gasnikov, Pavel Dvurechensky, Eduard Gorbunov, Evgeniya Vorontsova, Daniil Selikhanovych, César A. Uribe
  • Optimal Tensor Methods in Smooth Convex and Uniformly Convex Optimization (2019). Alexander Gasnikov, Pavel Dvurechensky, Eduard Gorbunov, Evgeniya Vorontsova, Daniil Selikhanovych, César A. Uribe
  • Higher-order tensor methods for minimizing difference of convex functions (2024). Ion Necoara
  • Sub-sampled Cubic Regularization for Non-convex Optimization (2017). Jonas Köhler, Aurélien Lucchi
  • Cubic-quartic regularization models for solving polynomial subproblems in third-order tensor methods (2023). Wenqi Zhu, Coralia Cartis
  • Second-order methods for quartically-regularised cubic polynomials, with applications to high-order tensor methods (2023). Coralia Cartis, Wenqi Zhu
  • A Second Order Method for Nonconvex Optimization (2017). Santiago Paternain, Aryan Mokhtari, Alejandro Ribeiro
  • Tensor Methods for Minimizing Functions with Hölder Continuous Higher-Order Derivatives (2019). Geovani Nunes Grapiglia, Yurii Nesterov
  • Subgradient methods for convex minimization (2002). Angelia Nedić
  • A family of second-order methods for convex $\ell_1$-regularized optimization (2015). Richard H. Byrd, Gillian M. Chin, Jorge Nocedal, Figen Öztoprak
  • OPTAMI: Global Superlinear Convergence of High-order Methods (2024). Dmitry Kamzolov, Dmitry Pasechnyuk, Artem Agafonov, Alexander Gasnikov, Martin Takáč
  • A Subsampling Line-Search Method with Second-Order Results (2018). El-houcine Bergou, Youssef Diouane, Vladimír Kunc, Vyacheslav Kungurtsev, Clément W. Royer
  • Inexact Tensor Methods with Dynamic Accuracies (2020). Nikita Doikov, Yurii Nesterov