The Power of Convex Relaxation: Near-Optimal Matrix Completion

Type: Article

Publication Date: 2010-04-26

Citations: 2141

DOI: https://doi.org/10.1109/tit.2010.2044061

Abstract

This paper is concerned with the problem of recovering an unknown matrix from a small fraction of its entries. This is known as the matrix completion problem, and it comes up in a great number of applications, including the famous Netflix Prize and other similar questions in collaborative filtering. In general, accurate recovery of a matrix from a small number of entries is impossible, but the knowledge that the unknown matrix has low rank radically changes this premise, making the search for solutions meaningful. This paper presents optimality results quantifying the minimum number of entries needed to recover a matrix of rank r exactly by any method whatsoever (the information-theoretic limit). More importantly, the paper shows that, under certain incoherence assumptions on the singular vectors of the matrix, recovery is possible by solving a convenient convex program as soon as the number of entries is on the order of the information-theoretic limit (up to logarithmic factors). This convex program simply finds, among all matrices consistent with the observed entries, the one with minimum nuclear norm.
As an example, we show that on the order of nr log(n) samples are needed to recover a random n x n matrix of rank r by any method, and, to be sure, nuclear norm minimization succeeds as soon as the number of entries is of the form nr polylog(n).
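To make the convex program concrete, here is a minimal numerical sketch of nuclear norm minimization for completion. It is not the authors' algorithm: it runs proximal gradient descent on a regularized surrogate, 0.5 * ||P_Omega(X - M)||_F^2 + tau * ||X||_*, whose proximal step is singular value soft-thresholding; with a small tau this approximates the equality-constrained minimum-nuclear-norm program in the abstract. The names `svd_shrink`, `complete`, and the parameter choices are illustrative assumptions, not from the paper.

```python
import numpy as np

def svd_shrink(X, tau):
    """Soft-threshold the singular values of X by tau: the proximal
    operator of tau * (nuclear norm)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Scale the columns of U by the shrunken singular values.
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def complete(M_obs, mask, tau=0.01, n_iters=2000):
    """Approximate nuclear-norm matrix completion by proximal gradient
    descent; mask is 1 on observed entries, 0 elsewhere, and M_obs is
    zero off the observed set."""
    X = np.zeros_like(M_obs)
    for _ in range(n_iters):
        # Gradient step on the observed entries, then singular value
        # soft-thresholding (step size 1 is safe: the gradient map is
        # 1-Lipschitz).
        X = svd_shrink(X - mask * (X - M_obs), tau)
    return X

# Demo: recover a random 20 x 20 rank-2 matrix from ~80% of its entries,
# comfortably above the nr log(n) sample scaling discussed above.
rng = np.random.default_rng(0)
n, r = 20, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
mask = (rng.random((n, n)) < 0.8).astype(float)
X_hat = complete(M * mask, mask)
rel_err = np.linalg.norm(X_hat - M) / np.linalg.norm(M)
```

In this well-sampled, incoherent regime the relative recovery error is small, illustrating the paper's point that far fewer than n^2 entries suffice when the matrix has low rank.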

Locations

  • IEEE Transactions on Information Theory - View
  • CaltechAUTHORS (California Institute of Technology) - View - PDF
  • arXiv (Cornell University) - View - PDF

Similar Works

  • The Power of Convex Relaxation: Near-Optimal Matrix Completion (2009). Emmanuel J. Candès, Terence Tao
  • Exact Matrix Completion via Convex Optimization (2008). Emmanuel J. Candès, Benjamin Recht
  • Exact matrix completion via convex optimization (2012). Emmanuel J. Candès, Benjamin Recht
  • Matrix Completion With Column Manipulation: Near-Optimal Sample-Robustness-Rank Tradeoffs (2015). Yudong Chen, Huan Xu, Constantine Caramanis, Sujay Sanghavi
  • Matrix Completion and Related Problems via Strong Duality (2017). Maria-Florina Balcan, Yingyu Liang, David P. Woodruff, Hongyang Zhang
  • Matrix completion with column manipulation: Near-optimal sample-robustness-rank tradeoffs (2011). Yudong Chen, Huan Xu, Constantine Caramanis, Sujay Sanghavi
  • Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization (2019). Yuxin Chen, Yuejie Chi, Jianqing Fan, Cong Ma, Yuling Yan
  • Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization (2020). Yuxin Chen, Yuejie Chi, Jianqing Fan, Cong Ma, Yuling Yan
  • Robust Matrix Completion (2016). Olga Klopp, Karim Lounici, Alexandre B. Tsybakov
  • Large-Scale Convex Minimization with a Low-Rank Constraint (2011). Shai Shalev-Shwartz, Alon Gonen, Ohad Shamir
  • A Non-Convex Relaxation for Fixed-Rank Approximation (2017). Carl Olsson, Marcus Carlsson, Erik Bylow
  • Incoherence-Optimal Matrix Completion (2015). Yudong Chen
  • Interpretable Matrix Completion: A Discrete Optimization Approach (2018). Dimitris Bertsimas, Michael Lingzhi Li
  • A Scalable, Adaptive and Sound Nonconvex Regularizer for Low-rank Matrix Completion (2020). Quanming Yao, Yaqing Wang, James T. Kwok
  • Noisy matrix decomposition via convex relaxation: Optimal rates in high dimensions (2012). Alekh Agarwal, Sahand Negahban, Martin J. Wainwright
  • Matrix Completion via Nonconvex Factorization: Algorithms and Theory (2015). Ruoyu Sun
  • Low Permutation-rank Matrices: Structural Properties and Noisy Completion (2019). Nihar B. Shah, Sivaraman Balakrishnan, Martin J. Wainwright