Sparse reconstruction by convex relaxation: Fourier and Gaussian measurements

Type: Article

Publication Date: 2006-03-01

Citations: 235

DOI: https://doi.org/10.1109/ciss.2006.286463


Abstract

This paper proves the best known guarantees for exact reconstruction of a sparse signal f from few non-adaptive, universal linear measurements. We consider Fourier measurements (a random sample of frequencies of f) and random Gaussian measurements. The reconstruction method that has recently gained momentum in sparse approximation theory is to relax this highly non-convex problem to a convex one and then solve it as a linear program. It is an open question what the best guarantees are for the reconstruction problem to be equivalent to its convex relaxation. Recent work shows that the number of measurements k(r,n) needed to exactly reconstruct any r-sparse signal f of length n from its linear measurements via convex relaxation is usually O(r polylog(n)). However, the known guarantees involve huge constants, despite the very good performance of the algorithms in practice. In an attempt to reconcile theory with practice, we prove the first guarantees for universal measurements (i.e., measurements that work for all sparse functions) with reasonable constants. For Gaussian measurements, k(r,n) ≲ 11.7 r [1.5 + log(n/r)], which is optimal up to constants. For Fourier measurements, we prove the best known bound k(r,n) = O(r log(n) · log²(r) · log(r log n)), which is optimal within the log log n and log³ r factors. Our arguments are based on techniques of geometric functional analysis and probability in Banach spaces.
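The convex relaxation described in the abstract (basis pursuit: minimize the ℓ1 norm subject to the measurement constraints) can be posed as a linear program by splitting the signal into its positive and negative parts. The following is a minimal sketch using SciPy; the dimensions n, r, k are illustrative choices for a small demo and are not tied to the paper's constants.

```python
import numpy as np
from scipy.optimize import linprog

# Basis pursuit: recover an r-sparse signal x0 from k Gaussian
# measurements b = A x0 by solving  min ||x||_1  s.t.  A x = b.
# The LP formulation splits x = x_pos - x_neg with x_pos, x_neg >= 0.
rng = np.random.default_rng(0)
n, r, k = 128, 3, 40          # signal length, sparsity, measurements

# Build an r-sparse signal with random support, signs, and magnitudes
x0 = np.zeros(n)
support = rng.choice(n, size=r, replace=False)
x0[support] = rng.choice([-1.0, 1.0], size=r) * (1 + rng.random(r))

A = rng.standard_normal((k, n)) / np.sqrt(k)   # Gaussian measurement matrix
b = A @ x0

# LP: minimize 1^T (x_pos + x_neg)  subject to  A x_pos - A x_neg = b
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b,
              bounds=[(0, None)] * (2 * n), method="highs")
x_hat = res.x[:n] - res.x[n:]

print("recovery error:", np.linalg.norm(x_hat - x0, np.inf))
```

With k well above the sparsity level, the LP typically recovers x0 exactly (up to solver tolerance), illustrating the equivalence between the sparse reconstruction problem and its convex relaxation that the paper quantifies.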

Locations

  • arXiv (Cornell University) - View - PDF

Similar Works

  • Sparse reconstruction by convex relaxation: Fourier and Gaussian measurements (2006) - Mark Rudelson, Roman Vershynin
  • On sparse reconstruction from Fourier and Gaussian measurements (2007) - Mark Rudelson, Roman Vershynin
  • On Exact and Robust Recovery for Plug-and-Play Compressed Sensing (2022) - Ruturaj G. Gavaskar, Chirayu D. Athalye, Kunal N. Chaudhury
  • Modern Problems in Mathematical Signal Processing: Quantized Compressed Sensing and Randomized Neural Networks (2019) - Aaron A. Nelson
  • Guaranteed Sparse Recovery under Linear Transformation (2013) - Ji Liu, Lei Yuan, Jieping Ye
  • Almost Optimal Unrestricted Fast Johnson-Lindenstrauss Transform (2010) - Nir Ailon, Edo Liberty
  • Convex Sparse Blind Deconvolution (2021) - Qingyun Sun, David L. Donoho
  • Do log factors matter? On optimal wavelet approximation and the foundations of compressed sensing (2019) - Ben Adcock, Simone Brugiapaglia, Matthew King–Roskamp
  • Exact Reconstruction of Sparse Signals via Nonconvex Minimization (2007) - Rick Chartrand
  • Near Optimal Signal Recovery From Random Projections: Universal Encoding Strategies? (2004) - Emmanuel J. Candès, Terence Tao
  • Simple bounds for recovering low-complexity models (2012) - Emmanuel J. Candès, Benjamin Recht
  • Sparse recovery using the preservability of the null space property under random measurements (2017) - Peter G. Casazza, Xuemei Chen, Richard G. Lynch
  • Universality in Learning from Linear Measurements (2019) - Ehsan Abbasi, Fariborz Salehi, Babak Hassibi
  • Simple Bounds for Recovering Low-complexity Models (2011) - Emmanuel J. Candès, Benjamin Recht
  • Shannon Theoretic Limits on Noisy Compressive Sampling (2007) - Mehmet Akçakaya, Vahid Tarokh
  • A Unified Recovery of Structured Signals Using Atomic Norm (2022) - Xuemei Chen
  • An Almost Optimal Unrestricted Fast Johnson-Lindenstrauss Transform (2013) - Nir Ailon, Edo Liberty
  • Practical Signal Recovery from Random Projections (2005) - Justin Romberg