Simultaneously Structured Models With Application to Sparse and Low-Rank Matrices

Type: Article

Publication Date: 2015-02-09

Citations: 274

DOI: https://doi.org/10.1109/tit.2015.2401574

Abstract

Recovering structured models (e.g., sparse or group-sparse vectors, low-rank matrices) from a few linear observations has been well studied recently. In various applications in signal processing and machine learning, the model of interest is structured in several ways at once, for example, a matrix that is simultaneously sparse and low rank. Often norms that promote the individual structures are known, and allow for recovery using an order-wise optimal number of measurements (e.g., the ℓ1 norm for sparsity, the nuclear norm for matrix rank). Hence, it is reasonable to minimize a combination of such norms. We show that, surprisingly, using multiobjective optimization with these norms can do no better, order-wise, than exploiting only one of the structures, thus revealing a fundamental limitation in sample complexity. This result suggests that to fully exploit the multiple structures, we need an entirely new convex relaxation. Further, specializing our results to the case of sparse and low-rank matrices, we show that a nonconvex formulation recovers the model from very few measurements (on the order of the degrees of freedom), whereas the convex problem combining the ℓ1 and nuclear norms requires many more measurements, illustrating a gap between the performance of the convex and nonconvex recovery problems. Our framework applies to arbitrary structure-inducing norms as well as to a wide range of measurement ensembles. This allows us to give sample complexity bounds for problems such as sparse phase retrieval and low-rank tensor completion.
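The abstract contrasts the degrees of freedom of a simultaneously sparse and low-rank matrix with the combined ℓ1-plus-nuclear-norm surrogate. The following minimal numpy sketch (not from the paper; all variable names and the trade-off weight `lam` are illustrative choices) builds such a matrix as a sparse rank-one outer product and evaluates the two norms and their combination:

```python
import numpy as np

# Illustrative sketch: a matrix that is simultaneously sparse and low rank,
# and the convex surrogate combining the l1 norm (promotes entrywise sparsity)
# with the nuclear norm (promotes low rank).
rng = np.random.default_rng(0)

n = 10   # ambient dimension: X is n x n
k = 3    # sparsity of each rank-one factor

a = np.zeros(n); a[:k] = rng.standard_normal(k)
b = np.zeros(n); b[:k] = rng.standard_normal(k)
X = np.outer(a, b)   # rank one, with only k*k nonzero entries

l1 = np.abs(X).sum()             # l1 norm of the entries
nuc = np.linalg.norm(X, "nuc")   # nuclear norm (sum of singular values)

lam = 0.5                        # arbitrary trade-off weight for illustration
combined = l1 + lam * nuc        # the combined objective discussed above

# Degrees of freedom of a k-sparse rank-one pair vs. ambient dimension:
dof = 2 * k      # order of measurements a nonconvex approach can get by with
ambient = n * n  # number of entries of X
print(combined, dof, ambient)
```

For a rank-one outer product the nuclear norm factors as the product of the Euclidean norms of the factors, and the ℓ1 norm factors likewise, which makes the sketch easy to sanity-check by hand.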

Locations

  • arXiv (Cornell University)
  • CiteSeer X (The Pennsylvania State University)
  • CaltechAUTHORS (California Institute of Technology)
  • IEEE Transactions on Information Theory

Similar Works

  • Simultaneously Structured Models with Application to Sparse and Low-rank Matrices (2012): Samet Oymak, Amin Jalali, Maryam Fazel, Yonina C. Eldar, Babak Hassibi
  • Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery (2013): Cun Mu, Bo Huang, John Wright, Donald Goldfarb
  • Efficient Structured Matrix Rank Minimization (2014): Adams Wei Yu, Wanli Ma, Yaoliang Yu, Jaime Carbonell, Suvrit Sra
  • Efficient Structured Matrix Rank Minimization (2015): Adams Wei Yu, Wanli Ma, Yaoliang Yu, Jaime G. Carbonell, Suvrit Sra
  • Noisy estimation of simultaneously structured models: Limitations of convex relaxation (2013): Samet Oymak, Amin Jalali, Maryam Fazel, Babak Hassibi
  • Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery (2014): Cun Mu, Bo Huang, John Wright, Donald Goldfarb
  • Harnessing Structures in Big Data via Guaranteed Low-Rank Matrix Estimation (2018): Yudong Chen, Yuejie Chi
  • Exploiting the structure effectively and efficiently in low rank matrix recovery (2018): Jian‐Feng Cai, Ke Wei
  • Recovering Structured Low-rank Operators Using Nuclear Norms (2017): John Jacob Bruer
  • Simultaneous Structures in Convex Signal Recovery—Revisiting the Convex Combination of Norms (2019): Martin Kliesch, Stanisław J. Szarek, Peter Jung
  • STARK: Structured Dictionary Learning Through Rank-one Tensor Recovery (2017): Mohsen Ghassemi, Zahra Shakeri, Anand D. Sarwate, Waheed U. Bajwa
  • Accurate low-rank matrix recovery from a small number of linear measurements (2009): Emmanuel J. Candès, Yaniv Plan
  • Rank-Sparsity Incoherence for Matrix Decomposition (2011): Venkat Chandrasekaran, Sujay Sanghavi, Pablo A. Parrilo, Alan S. Willsky
  • Recovering Simultaneously Structured Data via Non-Convex Iteratively Reweighted Least Squares (2023): Christian Kümmerle, Johannes Maly
  • Convex Tensor Decomposition via Structured Schatten Norm Regularization (2013): Ryota Tomioka, Taiji Suzuki