On Selection Criteria for the Tuning Parameter in Robust Divergence

Type: Article

Publication Date: 2021-09-01

Citations: 13

DOI: https://doi.org/10.3390/e23091147

Abstract

While robust divergences such as the density power divergence and the $\gamma$-divergence are helpful for robust statistical inference in the presence of outliers, the tuning parameter that controls the degree of robustness is typically chosen by a rule of thumb, which may lead to inefficient inference. We propose a selection criterion based on an asymptotic approximation of the Hyvärinen score applied to an unnormalized model defined by the robust divergence. The proposed criterion requires only the first- and second-order partial derivatives of the assumed density function with respect to the observations, which can be computed easily regardless of the number of parameters. We demonstrate the usefulness of the proposed method via numerical studies using normal distributions and regularized linear regression.
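As a concrete illustration of the recipe described in the abstract, the sketch below fits a univariate normal model by minimizing the density power divergence (DPD) loss over a grid of tuning parameters $\gamma$, and scores each fit with the empirical Hyvärinen score of the unnormalized model $q_\gamma(x) \propto \exp\{f(x;\hat\theta_\gamma)^\gamma/\gamma\}$. This is a minimal sketch under stated assumptions, not the paper's exact criterion: the function names are hypothetical, and the raw empirical H-score is used in place of the paper's asymptotic approximation, whose correction terms are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

def dpd_loss(params, x, gamma):
    """Density power divergence loss for a univariate normal model.

    gamma > 0 controls robustness; gamma -> 0 recovers maximum likelihood.
    """
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    f = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
    # The integral term \int f^{1+gamma} dx has a closed form for the normal.
    integral = (2 * np.pi * sigma**2) ** (-gamma / 2) / np.sqrt(1 + gamma)
    return np.mean(-f**gamma / gamma + integral / (1 + gamma))

def fit_dpd(x, gamma):
    """Minimum-DPD estimate of (mu, sigma) for a given tuning parameter."""
    res = minimize(dpd_loss, x0=[np.median(x), np.log(np.std(x))],
                   args=(x, gamma), method="Nelder-Mead")
    mu, log_sigma = res.x
    return mu, np.exp(log_sigma)

def h_score(x, mu, sigma, gamma):
    """Empirical Hyvarinen score of q(x) ~ exp(f(x; theta_hat)^gamma / gamma).

    Only first- and second-order derivatives of log f with respect to the
    observations are needed, matching the abstract's description.
    """
    f = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
    s = -(x - mu) / sigma**2                        # d/dx log f
    d1 = f**gamma * s                               # d/dx log q
    d2 = f**gamma * (gamma * s**2 - 1 / sigma**2)   # d^2/dx^2 log q
    return np.sum(2 * d2 + d1**2)

# Contaminated sample: 10% of points drawn from an outlying component.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 180), rng.normal(8, 1, 20)])

grid = [0.1, 0.2, 0.3, 0.5, 0.7, 1.0]
scores = [h_score(x, *fit_dpd(x, g), g) for g in grid]
best = grid[int(np.argmin(scores))]
print("selected gamma:", best, "estimate:", fit_dpd(x, best))
```

Because the score depends on the data only through derivatives of the assumed log-density with respect to each observation, the same grid-search loop applies to models with many parameters without additional derivative computations in the parameter space.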

Locations

  • Entropy
  • PubMed Central
  • arXiv (Cornell University)
  • Europe PMC (PubMed Central)
  • DOAJ (Directory of Open Access Journals)
  • PubMed
  • DataCite API

Similar Works

  • Adaptation of the Tuning Parameter in General Bayesian Inference with Robust Divergence (2022). Shouto Yonekura, Shonosuke Sugasawa
  • Adaptation of the Tuning Parameter in General Bayesian Inference with Robust Divergence (2021). Shouto Yonekura, Shonosuke Sugasawa
  • Adaptation of the tuning parameter in general Bayesian inference with robust divergence (2023). Shouto Yonekura, Shonosuke Sugasawa
  • Minimizing robust density power-based divergences for general parametric density models (2023). Akifumi Okuno
  • Robust estimation for non-homogeneous data and the selection of the optimal tuning parameter: the density power divergence approach (2015). Abhik Ghosh, Ayanendranath Basu
  • Asymptotic Breakdown Point Analysis for a General Class of Minimum Divergence Estimators (2023). Subhrajyoty Roy, Abir De Sarkar, Abhik Ghosh, Ayanendranath Basu
  • On the Minimum $\mathcal{K}$-Divergence Estimator (2022). Yair Sorek, Koby Todros
  • Model Selection in a Composite Likelihood Framework Based on Density Power Divergence (2020). Elena Castilla, Nirian Martín, Leandro Pardo, K. Zografos
  • Restricted distance-type Gaussian estimators based on density power divergence and their applications in hypothesis testing (2023). Ángel Felipe, María Jaenada, Pedro Miranda, Leandro Pardo
  • A generalized divergence for statistical inference (2017). Abhik Ghosh, Ian R. Harris, Avijit Maji, Ayanendranath Basu, Leandro Pardo
  • A Geometric Unification of Distributionally Robust Covariance Estimators: Shrinking the Spectrum by Inflating the Ambiguity Set (2024). Man-Chung Yue, Yves Rychener, Daniel Kühn, Viet Anh Nguyen
  • On the consistency and the robustness in model selection criteria (2019). Sumito Kurata, Etsuo Hamada
  • Robust Estimation in Generalized Linear Model Using Density Power Divergence (2019). Sangjin Lee, Changkon Hong
  • Robust Parameter Estimation Based on the K-Divergence (2022). Yair Sorek, Koby Todros
  • Restricted Distance-Type Gaussian Estimators Based on Density Power Divergence and Their Applications in Hypothesis Testing (2023). Ángel Felipe, María Jaenada, Pedro Miranda, Leandro Pardo
  • The extended Bregman divergence and parametric estimation (2022). Sancharee Basak, Ayanendranath Basu
  • Robust Estimation under Linear Mixed Models: The Minimum Density Power Divergence Approach (2020). Giovanni Saraceno, Abhik Ghosh, Ayanendranath Basu, Claudio Agostinelli
  • Robust and Sparse Regression via γ-Divergence (2017). Takayuki Kawashima, Hironori Fujisawa
  • Robust estimation of fixed effect parameters and variances of linear mixed models: the minimum density power divergence approach (2023). Giovanni Saraceno, Abhik Ghosh, Ayanendranath Basu, Claudio Agostinelli
  • Robust Bayes estimation using the density power divergence (2014). Abhik Ghosh, Ayanendranath Basu