Minimum $\ell_1$-norm interpolation via basis pursuit is robust to errors
Geoffrey Chinot, Matthias Löffler, Sara A. van de Geer
Type: Preprint
Publication Date: 2020-12-01
Citations: 6
Locations: arXiv (Cornell University)
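For context, a minimal sketch of what "minimum $\ell_1$-norm interpolation via basis pursuit" refers to: among all coefficient vectors that fit the observed responses exactly, pick the one with smallest $\ell_1$-norm, i.e. solve min ||beta||_1 subject to X beta = y. The example below is not code from the paper; the dimensions, sparsity pattern, and noise level are arbitrary choices for the demo, and the reformulation as a linear program solved with SciPy's linprog is just one standard way to compute the basis pursuit solution.

```python
# Illustrative sketch: minimum l1-norm interpolation (basis pursuit) via an LP.
# All data-generating choices below are assumptions made for this demo only.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p = 20, 100                                     # overparameterized regime: p > n
X = rng.standard_normal((n, p))
beta_star = np.zeros(p)
beta_star[:3] = [2.0, -1.5, 1.0]                   # sparse ground truth (assumed)
y = X @ beta_star + 0.1 * rng.standard_normal(n)   # noisy responses

# Split beta = beta_plus - beta_minus with both parts nonnegative, so that
# ||beta||_1 = sum(beta_plus) + sum(beta_minus) and X beta = y becomes
# [X, -X] [beta_plus; beta_minus] = y.
c = np.ones(2 * p)
A_eq = np.hstack([X, -X])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * p), method="highs")
beta_hat = res.x[:p] - res.x[p:]

print("interpolates the data:", np.allclose(X @ beta_hat, y, atol=1e-6))
print("l1 norm of interpolator:", np.abs(beta_hat).sum())
print("distance to ground truth:", np.linalg.norm(beta_hat - beta_star))
```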
Similar Works
- Tight bounds for minimum l1-norm interpolation of noisy data (2021). Guillaume Wang, Konstantin Donhauser, Fanny Yang
- Overfitting Can Be Harmless for Basis Pursuit, But Only to a Degree (2020). Peizhong Ju, Xiaojun Lin, Jia Liu
- Least Sparsity of $p$-Norm Based Optimization Problems with $p>1$ (2018). Jinglai Shen, Seyedahmad Mousavi
- A First-Order Augmented Lagrangian Method for Compressed Sensing (2012). Necdet Serhat Aybat, Garud Iyengar
- Iteratively Reweighted Least Squares for Basis Pursuit with Global Linear Convergence Rate (2021). Christian Kümmerle, Claudio Mayrink Verdun, Dominik Stöger
- Foolish Crowds Support Benign Overfitting (2021). Niladri S. Chatterji, Philip M. Long
- Accuracy Guarantees for $\ell_1$-Recovery (2011). Anatoli Juditsky, Arkadi Nemirovski
- Uniform Convergence of Interpolators: Gaussian Width, Norm Bounds, and Benign Overfitting (2021). Frederic Koehler, Lijia Zhou, Danica J. Sutherland, Nathan Srebro
- Sharp Convergence Rates for Matching Pursuit (2023). Jason M. Klusowski, Jonathan W. Siegel
- Robust non-linear regression analysis: A greedy approach employing kernels and application to image denoising (2016). Georgios K. Papageorgiou, Pantelis Bouboulis, Sergios Theodoridis
- Fast, Provably convergent IRLS Algorithm for p-norm Linear Regression (2019). Deeksha Adil, Richard Peng, Sushant Sachdeva
- Beyond $\ell_1$-norm minimization for sparse signal recovery (2012). Hassan Mansour
Works That Cite This (6)
- Tight bounds for minimum l1-norm interpolation of noisy data (2021). Guillaume Wang, Konstantin Donhauser, Fanny Yang
- Foolish Crowds Support Benign Overfitting (2021). Niladri S. Chatterji, Philip M. Long
- A Farewell to the Bias-Variance Tradeoff? An Overview of the Theory of Overparameterized Machine Learning (2021). Yehuda Dar, Vidya Muthukumar, Richard G. Baraniuk
- The Interplay Between Implicit Bias and Benign Overfitting in Two-Layer Linear Networks (2021). Niladri S. Chatterji, Philip M. Long, Peter L. Bartlett
- Minimum $\ell_{1}$-norm interpolators: Precise asymptotics and multiple descent (2021). Yue Li, Yuting Wei
- Minimum complexity interpolation in random features models (2021). Michael Celentano, Theodor Misiakiewicz, Andrea Montanari
Works Cited by This (0)