The Gaussian hare and the Laplacian tortoise: computability of squared-error versus absolute-error estimators
Since the time of Gauss, it has been generally accepted that $\ell_2$-methods of combining observations by minimizing sums of squared errors have significant computational advantages over the earlier $\ell_1$-methods, based on minimizing sums of absolute errors, advocated by Boscovich, Laplace, and others. However, $\ell_1$-methods are known to have significant robustness advantages over …
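The computational contrast the abstract describes is visible even in the simplest one-parameter location problem: the $\ell_2$ minimizer is the sample mean (a closed-form sum), while the $\ell_1$ minimizer is the sample median. A minimal sketch, using NumPy and illustrative data not from the paper, also hints at the robustness advantage of $\ell_1$:

```python
import numpy as np

rng = np.random.default_rng(0)
# 99 well-behaved observations near 10, plus one gross outlier.
x = np.concatenate([rng.normal(loc=10.0, size=99), [1000.0]])

# ell_2: minimizing the sum of squared errors over a constant
# gives the sample mean, available in closed form.
l2_est = x.mean()

# ell_1: minimizing the sum of absolute errors over a constant
# gives the sample median (an order statistic, not a closed-form sum).
l1_est = np.median(x)

print(l2_est, l1_est)
```

Here the single outlier drags the mean far from 10 while the median stays close, which is the robustness property the abstract attributes to $\ell_1$-methods; in regression, the analogous $\ell_1$ fit requires linear programming rather than solving normal equations.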