Coordinate descent with arbitrary sampling I: algorithms and complexity

Type: Article

Publication Date: 2016-07-05

Citations: 80

DOI: https://doi.org/10.1080/10556788.2016.1190360

Abstract

We study the problem of minimizing the sum of a smooth convex function and a convex block-separable regularizer, and propose a new randomized coordinate descent method, which we call ALPHA. At every iteration, our method updates a random subset of coordinates drawn from an arbitrary distribution. No coordinate descent method capable of handling an arbitrary sampling has previously been studied in the literature for this problem. ALPHA is a very flexible algorithm: in special cases it reduces to deterministic and randomized methods such as gradient descent, coordinate descent, parallel coordinate descent and distributed coordinate descent, in both nonaccelerated and accelerated variants. The variants with arbitrary (or importance) sampling are new. We provide a complexity analysis of ALPHA, from which we deduce, as direct corollaries, complexity bounds for its many variants, all matching or improving on the best known bounds.
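The abstract describes a generic update pattern: at each iteration a random subset of coordinates is drawn from a user-chosen distribution, and only those coordinates are updated by a proximal step. The sketch below illustrates that pattern on a toy composite problem (least squares plus an l1 regularizer). It is a minimal illustration under assumed choices (an independent per-coordinate sampling, column-norm step sizes, a soft-thresholding prox); it is not the ALPHA method itself nor its complexity-optimal parameters.

```python
# Minimal sketch of proximal coordinate descent with an "arbitrary" sampling
# of coordinates. All problem data, step sizes and the sampling scheme are
# illustrative assumptions, not the paper's ALPHA algorithm.
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t*|x| (the l1 regularizer used in this toy problem)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def coordinate_descent_arbitrary_sampling(A, b, lam, probs, v, iters=1000, seed=0):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by updating, at every iteration,
    a random subset of coordinates whose marginal selection probabilities are
    given by `probs`. `v[i]` is a per-coordinate curvature constant; here we use
    the squared column norms of A, a standard choice for least squares."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(iters):
        # Independent (Bernoulli-type) sampling: coordinate i is selected with
        # probability probs[i]; other sampling distributions are possible.
        S = np.flatnonzero(rng.random(n) < probs)
        if S.size == 0:
            continue
        # For simplicity we form the full gradient of the smooth part; a real
        # implementation would compute only the entries indexed by S.
        grad = A.T @ (A @ x - b)
        for i in S:  # update only the sampled coordinates
            step = x[i] - grad[i] / v[i]
            x[i] = soft_threshold(step, lam / v[i])
    return x

# Illustrative usage on random data (all values are assumptions):
A = np.random.default_rng(1).standard_normal((50, 20))
b = np.random.default_rng(2).standard_normal(50)
v = (A ** 2).sum(axis=0)      # per-coordinate Lipschitz constants for least squares
probs = np.full(20, 0.25)     # uniform marginals; a non-uniform (importance) choice also works
x_hat = coordinate_descent_arbitrary_sampling(A, b, lam=0.1, probs=probs, v=v)
```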

Locations

  • Optimization methods & software
  • arXiv (Cornell University)
  • Edinburgh Research Explorer (University of Edinburgh)

Similar Works

  • Coordinate Descent with Arbitrary Sampling I: Algorithms and Complexity (2014), Zheng Qu, Peter Richtárik
  • Importance sampling strategy for non-convex randomized block-coordinate descent (2015), Rémi Flamary, Alain Rakotomamonjy, Gilles Gasso
  • Randomized Dual Coordinate Ascent with Arbitrary Sampling (2014), Zheng Qu, Peter Richtárik, Tong Zhang
  • Iteration Complexity of Randomized Block-Coordinate Descent Methods for Minimizing a Composite Function (2011), Peter Richtárik, Martin Takáč
  • Accelerated, Parallel, and Proximal Coordinate Descent (2015), Olivier Fercoq, Peter Richtárik
  • Coordinate Descent with Arbitrary Sampling II: Expected Separable Overapproximation (2014), Zheng Qu, Peter Richtárik
  • Coordinate descent with arbitrary sampling II: expected separable overapproximation (2016), Zheng Qu, Peter Richtárik
  • Randomized Block Coordinate Descent for Online and Stochastic Optimization (2014), Huahua Wang, Arindam Banerjee
  • Efficient Greedy Coordinate Descent for Composite Problems (2018), Sai Praneeth Karimireddy, Anastasia Koloskova, Sebastian U. Stich, Martin Jaggi
  • An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization (2015), Qihang Lin, Zhaosong Lu, Lin Xiao
  • Large-scale randomized-coordinate descent methods with non-separable linear constraints (2014), Sashank J. Reddi, Ahmed Hefny, Carlton Downey, Avinava Dubey, Suvrit Sra
  • Inexact Variable Metric Stochastic Block-Coordinate Descent for Regularized Optimization (2020), Ching-pei Lee, Stephen J. Wright
  • Coordinate Descent with Arbitrary Sampling II: Expected Separable Overapproximation (2017), Zheng Qu, Peter Richtárik
  • An Accelerated Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization (2014), Qihang Lin, Zhaosong Lu, Lin Xiao
  • Unifying Framework for Accelerated Randomized Methods in Convex Optimization (2023), Pavel Dvurechensky, Alexander Gasnikov, Alexander Tyurin, Vladimir Zholobov
  • Unifying Framework for Accelerated Randomized Methods in Convex Optimization (2017), Pavel Dvurechensky, Alexander Gasnikov, Alexander Tiurin, Vladimir Zholobov
  • Inexact Variable Metric Stochastic Block-Coordinate Descent for Regularized Optimization (2018), Ching-pei Lee, Stephen J. Wright
  • The Randomized Block Coordinate Descent Method in the Hölder Smooth Setting (2024), Leandro Farias Maia, David H. Gutman