Approximately low-rank recovery from noisy and local measurements by convex program

Type: Article

Publication Date: 2023-04-27

Citations: 0

DOI: https://doi.org/10.1093/imaiai/iaad013

Abstract

Low-rank matrix models have been universally useful for numerous applications, from classical system identification to more modern matrix completion in signal processing and statistics. The nuclear norm has been employed as a convex surrogate for low-rankness since it induces a low-rank solution to inverse problems. While the nuclear norm for low-rankness has an excellent analogy with the $\ell_1$ norm for sparsity through the singular value decomposition, other matrix norms also induce low-rankness. In particular, as one interprets a matrix as a linear operator between Banach spaces, various tensor product norms generalize the role of the nuclear norm. We provide a tensor-norm-constrained estimator for the recovery of approximately low-rank matrices from local measurements corrupted with noise. A tensor-norm regularizer is designed to adapt to the local structure. We derive a statistical analysis of the estimator for matrix completion and decentralized sketching by applying Maurey's empirical method to tensor products of Banach spaces. The estimator provides a near-optimal error bound in a minimax sense and admits a polynomial-time algorithm for these applications.
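For concreteness, the sketch below illustrates the classical nuclear-norm relaxation that the abstract uses as its starting point: matrix completion with a nuclear-norm regularizer, solved by proximal gradient descent with singular value soft-thresholding. It is not the paper's tensor-norm-constrained estimator, and the function names (`svt`, `complete`), step size, regularization weight and toy data are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's tensor-norm estimator):
# nuclear-norm-regularized matrix completion via proximal gradient descent.
import numpy as np

def svt(Z, tau):
    """Proximal operator of tau * nuclear norm: soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def complete(Y, mask, lam=0.5, step=1.0, iters=500):
    """Minimize 0.5 * ||mask * (X - Y)||_F^2 + lam * ||X||_* by proximal gradient."""
    X = np.zeros_like(Y)
    for _ in range(iters):
        grad = mask * (X - Y)               # gradient of the data-fit term
        X = svt(X - step * grad, step * lam)  # proximal step on the nuclear norm
    return X

# Toy example: rank-2 ground truth observed on roughly half the entries with noise.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
mask = rng.random(M.shape) < 0.5
Y = mask * (M + 0.1 * rng.standard_normal(M.shape))
X_hat = complete(Y, mask)
print(np.linalg.norm(X_hat - M) / np.linalg.norm(M))  # relative recovery error
```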

Locations

  • Information and Inference: A Journal of the IMA
  • arXiv (Cornell University)

Similar Works

  • Approximately low-rank recovery from noisy and local measurements by convex program (2021) - Kiryung Lee, Rakshith Sharma Srinivasa, Marius Junge, Justin Romberg
  • Robust Matrix Completion (2016) - Olga Klopp, Karim Lounici, Alexandre B. Tsybakov
  • Accurate low-rank matrix recovery from a small number of linear measurements (2009) - Emmanuel J. Candès, Yaniv Plan
  • Near-optimal sample complexity for convex tensor completion (2017) - Navid Ghadermarzy, Yaniv Plan, Özgür Yılmaz
  • Robust Matrix Completion (2014) - Olga Klopp, Karim Lounici, Alexandre B. Tsybakov
  • Robust Matrix Completion (2017) - Olga Klopp, Karim Lounici, Alexandre B. Tsybakov
  • Robust Low-Tubal-Rank Tensor Completion via Convex Optimization (2019) - Qiang Jiang, Michael K. Ng
  • Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery (2013) - Cun Mu, Bo Huang, John Wright, Donald Goldfarb
  • Near-optimal sample complexity for convex tensor completion (2018) - Navid Ghadermarzy, Yaniv Plan, Özgür Yılmaz
  • Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization (2019) - Yuxin Chen, Yuejie Chi, Jianqing Fan, Cong Ma, Yuling Yan
  • Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization (2020) - Yuxin Chen, Yuejie Chi, Jianqing Fan, Cong Ma, Yuling Yan
  • Harnessing Structures in Big Data via Guaranteed Low-Rank Matrix Estimation (2018) - Yudong Chen, Yuejie Chi
  • Statistically Optimal and Computationally Efficient Low Rank Tensor Completion from Noisy Entries (2017) - Dong Xia, Ming Yuan, Cun-Hui Zhang
  • Statistically optimal and computationally efficient low rank tensor completion from noisy entries (2021) - Dong Xia, Ming Yuan, Cun-Hui Zhang
  • Exact matrix completion via convex optimization (2012) - Emmanuel J. Candès, Benjamin Recht
  • Sketching low-rank matrices with a shared column space by convex programming (2022) - Rakshith S. Srinivasa, Seonho Kim, Kiryung Lee
  • Sketching Low-Rank Matrices With a Shared Column Space by Convex Programming (2023) - Rakshith S. Srinivasa, Seonho Kim, Kiryung Lee
  • Weighted Matrix Completion and Recovery with Prior Subspace Information (2016) - Armin Eftekhari, Dehui Yang, Michael B. Wakin
Works That Cite This (0)
