Robust matrix estimations meet Frank–Wolfe algorithm

Type: Article

Publication Date: 2023-04-05

Citations: 2

DOI: https://doi.org/10.1007/s10994-023-06325-w

Abstract

We consider estimating matrix-valued model parameters with a dedicated focus on their robustness. Our setting concerns large-scale structured data, so a regularization on the matrix's rank becomes indispensable. Although robust loss functions are expected to be effective, they are known to be difficult to implement in practice because of the non-smooth criterion functions encountered in the optimization. To meet this challenge, we develop a highly efficient computing scheme that takes advantage of projection-free Frank–Wolfe algorithms, which require only the first-order derivative of the criterion function. Our methodological framework is broad, accommodating a wide range of robust loss functions in conjunction with penalty functions for matrix estimation problems. We establish non-asymptotic error bounds for matrix estimation with the Huber loss and nuclear norm penalty in two concrete cases: matrix completion with partial and noisy observations, and reduced-rank regression. Our theory demonstrates the merits of robust loss functions: matrix-valued estimators with good properties are obtained even when heavy-tailed distributions are involved. We illustrate the promising performance of our methods with extensive numerical examples and data analysis.
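To make the computing scheme concrete, here is a minimal Python sketch (not the authors' implementation) of projection-free Frank–Wolfe for Huber-loss matrix completion over a nuclear-norm ball of radius tau. The function names `frank_wolfe_completion` and `huber_grad`, the dense-SVD oracle, and all parameter defaults are illustrative assumptions.

```python
import numpy as np

def huber_grad(r, delta):
    """Elementwise derivative of the Huber loss at residuals r
    (quadratic within [-delta, delta], linear outside)."""
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def frank_wolfe_completion(Y, mask, tau, delta=1.0, n_iters=200):
    """Approximately minimize the Huber loss over observed entries of Y
    subject to ||X||_* <= tau, using projection-free Frank-Wolfe."""
    X = np.zeros_like(Y, dtype=float)
    for k in range(n_iters):
        # Only first-order information is needed: the Huber gradient,
        # supported on the observed entries (mask is a 0/1 array).
        G = mask * huber_grad(X - Y, delta)
        # Linear minimization over the nuclear-norm ball reduces to the
        # top singular vector pair of -G, yielding a rank-one atom.
        u, _, vt = np.linalg.svd(-G, full_matrices=False)
        S = tau * np.outer(u[:, 0], vt[0])
        # Classic diminishing step size; no projection step is required.
        gamma = 2.0 / (k + 2.0)
        X = (1.0 - gamma) * X + gamma * S
    return X
```

At scale one would swap the dense SVD for a leading-pair routine such as `scipy.sparse.linalg.svds(-G, k=1)`, since Frank–Wolfe only ever needs the top singular vectors; this is what keeps each iteration cheap compared with projecting onto the nuclear-norm ball.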

Similar Works

Max-Norm Optimization for Robust Matrix Recovery (2016). Ethan X. Fang, Han Liu, Kim-Chuan Toh, Wen-Xin Zhou.
Matrix Completion Models with Fixed Basis Coefficients and Rank Regularized Problems with Hard Constraints (2013). Weimin Miao.
Robust Matrix Completion with Heavy-tailed Noise (2022). Bingyan Wang, Jianqing Fan.
Sparse Reduced Rank Huber Regression in High Dimensions (2022). Kean Ming Tan, Qiang Sun, Daniela Witten.
Fast global convergence of gradient methods for high-dimensional statistical recovery (2012). Alekh Agarwal, Sahand Negahban, Martin J. Wainwright.
Towards Faster Rates and Oracle Property for Low-Rank Matrix Estimation (2015). Huan Gui, Quanquan Gu.
Low-rank matrix estimation in multi-response regression with measurement errors: Statistical and computational guarantees (2020). Xin Li, Dongya Wu.
Harnessing Structures in Big Data via Guaranteed Low-Rank Matrix Estimation (2018). Yudong Chen, Yuejie Chi.
A rank-corrected procedure for matrix completion with fixed basis coefficients (2015). Weimin Miao, Shaohua Pan, Defeng Sun.
Robust low-rank matrix estimation (2018). Andreas Elsener, Sara van de Geer.
Robust Sparse Reduced Rank Regression in High Dimensions (2018). Kean Ming Tan, Qiang Sun, Daniela Witten.
Low-rank matrix estimation via nonconvex spectral regularized methods in errors-in-variables matrix regression (2025). Xin Li, Dongya Wu.
Low-rank matrix estimation via nonconvex spectral regularized methods in errors-in-variables matrix regression (2024). Xin Li, Dongya Wu.
Estimation of (near) low-rank matrices with noise and high-dimensional scaling (2011). Sahand Negahban, Martin J. Wainwright.
Robust Low-Rank Matrix Estimation (2016). Andreas Elsener, Sara van de Geer.
A Unified Computational and Statistical Framework for Nonconvex Low-Rank Matrix Estimation (2016). Lingxiao Wang, Xiao Zhang, Quanquan Gu.
Computationally Efficient and Statistically Optimal Robust High-Dimensional Linear Regression (2023). Yinan Shen, Jingyang Li, Jian‐Feng Cai, Dong Xia.