Randomized pick-freeze for sparse Sobol indices estimation in high dimension

Type: Article

Publication Date: 2015-01-01

Citations: 6

DOI: https://doi.org/10.1051/ps/2015013

Abstract

This article investigates variable selection in high dimension from a non-parametric regression model. In many concrete situations, we are concerned with estimating a non-parametric regression function f that may depend on a large number p of input variables. Unlike standard procedures, we do not assume that f belongs to a class of regular functions (Hölder, Sobolev, ...); we only assume that f is square-integrable with respect to a known product measure. Furthermore, in many situations only a small number s of the coordinates actually affect f, and they do so in an additive manner. In this context, we prove that, with only 𝒪(s log p) random evaluations of f, one can identify the relevant input variables with overwhelming probability. Our proposed method is an unconstrained ℓ1-minimization procedure based on Sobol's method. One step of this procedure relies on support recovery using ℓ1-minimization and thresholding; more precisely, we use a thresholded LASSO to faithfully uncover the significant input variables. In this framework, we prove that one can relax the mutual incoherence property (known to require 𝒪(s^2 log p) observations) and still ensure faithful recovery from 𝒪(s^α log p) observations for any 1 ≤ α ≤ 2.
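
For background, the first-order Sobol index of coordinate j is S_j = Var(E[f(X) | X_j]) / Var(f(X)), and the standard pick-freeze method estimates it from the covariance between f(X) and f evaluated at a copy of X in which coordinate j is kept ("frozen") while the other coordinates are resampled independently. The snippet below is only a minimal sketch of the generic thresholded-LASSO support-recovery step mentioned in the abstract, run on a synthetic sparse linear model with 𝒪(s log p) observations; it is not the paper's randomized pick-freeze construction (where the unknown coefficients would correspond to Sobol indices), and the design matrix, noise level, tuning parameter, and threshold are illustrative assumptions.

    # Minimal sketch of thresholded-LASSO support recovery (illustrative only;
    # not the paper's randomized pick-freeze design).
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)

    p, s = 1000, 5                          # ambient dimension, sparsity
    n = int(4 * s * np.log(p))              # O(s log p) observations, as in the abstract
    support = rng.choice(p, size=s, replace=False)
    beta = np.zeros(p)
    beta[support] = rng.uniform(1.0, 2.0, size=s)   # well-separated true signal (assumption)

    X = rng.standard_normal((n, p))         # generic Gaussian random design (assumption)
    sigma = 0.05
    y = X @ beta + sigma * rng.standard_normal(n)

    # Step 1: unconstrained l1-penalized least squares (LASSO).
    lam = 2 * sigma * np.sqrt(np.log(p) / n)        # usual order of the tuning parameter
    lasso = Lasso(alpha=lam, fit_intercept=False, max_iter=50_000).fit(X, y)

    # Step 2: threshold the LASSO estimate to remove small spurious coefficients.
    tau = 0.5 * np.abs(beta[support]).min()         # illustrative threshold (uses the true scale)
    recovered = np.flatnonzero(np.abs(lasso.coef_) > tau)

    print("true support     :", np.sort(support))
    print("recovered support:", np.sort(recovered))

Replacing the synthetic responses with randomized pick-freeze evaluations of f would turn this two-step recovery into a selector of the coordinates with nonzero first-order Sobol indices, which is the spirit of the paper's procedure.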

Locations

  • arXiv (Cornell University)
  • Springer Link (Chiba Institute of Technology)
  • French digital mathematics library (Numdam)
  • HAL (Le Centre pour la Communication Scientifique Directe)
  • ESAIM Probability and Statistics

Similar Works

  • Randomized pick-freeze for sparse Sobol indices estimation in high dimension (2014) - Yohann de Castro, Alexandre Janon
  • The out-of-sample prediction error of the square-root-LASSO and related estimators (2022) - José Luis Montiel Olea, Cynthia Rush, Amilcar Velez, Johannes Wiesel
  • A Brief Survey of Modern Optimization for Statisticians (2014) - Kenneth Lange, Eric C. Chi, Hua Zhou
  • Model selection with lasso-zero: adding straw to the haystack to better find needles (2018) - Pascaline Descloux, Sylvain Sardy
  • Model Selection With Lasso-Zero: Adding Straw to the Haystack to Better Find Needles (2021) - Pascaline Descloux, Sylvain Sardy
  • Simultaneous support recovery in high dimensions: Benefits and perils of block ℓ1/ℓ∞-regularization (2009) - Sahand Negahban, Martin J. Wainwright
  • Numerical Characterization of Support Recovery in Sparse Regression with Correlated Design (2021) - Ankit Kumar, Sharmodeep Bhattacharyya, Kristofer E. Bouchard
  • Sparse Recovery With Unknown Variance: A LASSO-Type Approach (2014) - Stéphane Chrétien, Sébastien Darses
  • Efficient Smoothed Concomitant Lasso Estimation for High Dimensional Regression (2017) - Eugène Ndiaye, Olivier Fercoq, Alexandre Gramfort, Vincent Leclère, Joseph Salmon
  • Joint and Post-Selection Confidence Sets for High-Dimensional Regression (2020) - Kun Zhou
  • Sparse Regression: Scalable Algorithms and Empirical Performance (2020) - Dimitris Bertsimas, Jean Pauphilet, Bart Van Parys
  • Discussion: The Dantzig selector: Statistical estimation when p is much larger than n (2007) - T. Tony Cai, Jinchi Lv
  • Numerical characterization of support recovery in sparse regression with correlated design (2022) - Ankit Kumar, Sharmodeep Bhattacharyya, Kristofer E. Bouchard
  • χ²-confidence sets in high-dimensional regression (2015) - Sara van de Geer, Benjamin Stucky
  • Subset Selection with Shrinkage: Sparse Linear Modeling when the SNR is low (2017) - Rahul Mazumder, Peter Radchenko, Antoine Dedieu
  • Structured Regularizers for High-Dimensional Problems: Statistical and Computational Issues (2014) - Martin J. Wainwright
  • The Smooth-Lasso and other ℓ1+ℓ2-penalized methods (2010) - Mohamed Hebiri, Sara A. van de Geer
  • Nearly Optimal Sample Size in Hypothesis Testing for High-Dimensional Regression (2013) - Adel Javanmard, Andrea Montanari