Asymptotic Theory of Outlier Detection Algorithms for Linear Time Series Regression Models
Abstract: Outlier detection algorithms are intimately connected with robust statistics that down-weight some observations to zero. We define a number of outlier detection algorithms related to the Huber-skip and least trimmed squares estimators, including the one-step Huber-skip estimator and the forward search. Next, we review a recently developed asymptotic theory …
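As a rough illustration of the kind of algorithm the abstract refers to, the sketch below implements a one-step Huber-skip-style update in the simplest form: fit an initial least-squares estimate, give weight zero to observations whose absolute residuals exceed a cutoff in robust scale units, and refit on the retained observations. The function name, the MAD-based scale estimate, and the cutoff value are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np

def one_step_huber_skip(X, y, c=2.5):
    """Sketch of a one-step Huber-skip update: hard-reject large
    residuals from an initial fit, then refit by least squares.

    X : (n, p) design matrix; y : (n,) response; c : cutoff in robust
    scale units. An illustrative simplification, not the paper's
    exact estimator.
    """
    # Initial (non-robust) least-squares fit.
    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta0
    # Robust scale via the normalized median absolute deviation.
    sigma = np.median(np.abs(resid - np.median(resid))) / 0.6745
    # "Skip" step: observations with large residuals get weight zero.
    keep = np.abs(resid) <= c * sigma
    # Refit on the retained observations only.
    beta1, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    return beta1, keep

# Usage: a line y = 1 + 2x contaminated by a few gross outliers.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, 100)
y[:5] += 50.0  # five gross outliers
beta, keep = one_step_huber_skip(X, y)
```

Because the rejection is all-or-nothing rather than a smooth down-weighting, the estimator behaves like trimmed least squares with a data-driven trimming set, which is the connection to robust statistics the abstract draws.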