Matrix Completion With Column Manipulation: Near-Optimal Sample-Robustness-Rank Tradeoffs

Type: Article

Publication Date: 2015-11-10

Citations: 20

DOI: https://doi.org/10.1109/tit.2015.2499247

Abstract

This paper considers the problem of matrix completion when some number of the columns are completely and arbitrarily corrupted, potentially by a malicious adversary. It is well known that standard algorithms for matrix completion can return arbitrarily poor results if even a single column is corrupted. One direct application is robust collaborative filtering, where some number of users are so-called manipulators who try to skew the predictions of the algorithm by calibrating their inputs to the system. In this paper, we develop an efficient algorithm for this problem based on a combination of a trimming procedure and a convex program that minimizes the nuclear norm and the ℓ1,2 norm. Our theoretical results show that, given a vanishing fraction of observed entries, it is nevertheless possible to complete the underlying matrix even when the number of corrupted columns grows. Significantly, our results hold without any assumptions on the locations or values of the observed entries of the manipulated columns. Moreover, we show by an information-theoretic argument that our guarantees are nearly optimal in terms of the fraction of sampled entries on the authentic columns, the fraction of corrupted columns, and the rank of the underlying matrix. Our results therefore sharply characterize the tradeoffs between sampling, robustness, and rank in matrix completion.
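
The convex program described in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical cvxpy formulation, not the authors' implementation: it splits the partially observed matrix into a low-rank component penalized by the nuclear norm and a column-sparse component penalized by the ℓ1,2 norm (the sum of column-wise ℓ2 norms), while matching the data on observed entries. The trimming procedure mentioned in the abstract is omitted, and the function name and weight lam are illustrative assumptions.

# Minimal sketch (assumption: cvxpy-based illustration, not the authors' code)
# of a nuclear-norm + l_{1,2}-norm convex program for matrix completion with
# corrupted columns. The trimming step described in the abstract is omitted.
import numpy as np
import cvxpy as cp

def complete_with_corrupted_columns(M_obs, mask, lam=0.5):
    """M_obs: matrix with observed entries filled in;
    mask: boolean array, True where an entry was observed;
    lam: illustrative trade-off weight (an assumption, not from the paper)."""
    m, n = M_obs.shape
    L = cp.Variable((m, n))   # low-rank part (the authentic matrix to recover)
    C = cp.Variable((m, n))   # column-sparse part (absorbs manipulated columns)
    # l_{1,2} norm of C = sum of the l2 norms of its columns
    l12 = cp.sum(cp.norm(C, 2, axis=0))
    objective = cp.Minimize(cp.normNuc(L) + lam * l12)
    # agree with the data on the observed entries only
    mask_f = mask.astype(float)
    constraints = [cp.multiply(mask_f, L + C) == mask_f * M_obs]
    cp.Problem(objective, constraints).solve()
    return L.value, C.value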

Locations

  • arXiv (Cornell University)
  • IEEE Transactions on Information Theory

Similar Works

  • Matrix completion with column manipulation: Near-optimal sample-robustness-rank tradeoffs (2011), Yudong Chen, Huan Xu, Constantine Caramanis, Sujay Sanghavi
  • Robust Matrix Completion with Corrupted Columns (2011), Yudong Chen, Huan Xu, Constantine Caramanis, Sujay Sanghavi
  • The Power of Convex Relaxation: Near-Optimal Matrix Completion (2009), Emmanuel J. Candès, Terence Tao
  • Matrix Completion With Noise (2010), Emmanuel J. Candès, Yaniv Plan
  • Matrix Completion With Noise (2009), Emmanuel J. Candès, Yaniv Plan
  • Robust Matrix Completion (2016), Olga Klopp, Karim Lounici, Alexandre B. Tsybakov
  • The Power of Convex Relaxation: Near-Optimal Matrix Completion (2010), Emmanuel J. Candès, Terence Tao
  • Exact Matrix Completion via Convex Optimization (2008), Emmanuel J. Candès, Benjamin Recht
  • Robust Matrix Completion (2014), Olga Klopp, Karim Lounici, Alexandre B. Tsybakov
  • Robust Matrix Completion (2017), Olga Klopp, Karim Lounici, Alexandre B. Tsybakov
  • Robust Matrix Completion with Heavy-tailed Noise (2024), Bingyan Wang, Jianqing Fan
  • Low Permutation-rank Matrices: Structural Properties and Noisy Completion (2019), Nihar B. Shah, Sivaraman Balakrishnan, Martin J. Wainwright
  • Low Permutation-rank Matrices: Structural Properties and Noisy Completion (2017), Nihar B. Shah, Sivaraman Balakrishnan, Martin J. Wainwright
  • Nearly-optimal Robust Matrix Completion (2016), Yeshwanth Cherapanamjeri, Kartik Gupta, Prateek Jain
  • Completing Any Low-rank Matrix, Provably (2013), Yudong Chen, Srinadh Bhojanapalli, Sujay Sanghavi, Rachel Ward
  • Robust Matrix Completion with Deterministic Sampling via Convex Optimization (2024), Yinjian Wang