Type: Article
Publication Date: 2019-01-01
Citations: 20
DOI: https://doi.org/10.1137/18m1212586
Kernel methods are widespread in machine learning, but they are limited by the quadratic complexity of constructing, applying, and storing kernel matrices. Low-rank matrix approximation algorithms are widely used to address this problem and to reduce the arithmetic and storage cost. However, we observed that for some datasets with wide intraclass variability, the optimal kernel parameter for the smaller classes yields a matrix that is poorly approximated by low-rank methods. In this paper, we propose an efficient structured low-rank approximation method, the block basis factorization (BBF), together with a fast construction algorithm for approximating radial basis function (RBF) kernel matrices. Our approach has linear memory cost and requires a linear number of floating-point operations for many machine learning kernels. BBF works for a wide range of kernel bandwidth parameters and significantly extends the domain of applicability of low-rank approximation methods. Our empirical results demonstrate the stability of our method and its superiority over state-of-the-art kernel approximation algorithms.
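To make the block structure concrete, below is a minimal sketch in Python (NumPy and scikit-learn, which are not part of the paper) of a BBF-style approximation K ≈ U M U^T: points are grouped by k-means, each cluster receives a small basis U_i, and only the small coupling blocks M_ij are stored. The basis construction here uses a plain truncated SVD rather than the paper's fast randomized sampling scheme, so this illustrates only the factorization structure, not the paper's linear-cost construction algorithm; the function names and parameters are illustrative.

```python
# A minimal sketch of a block-basis-style kernel approximation (hypothetical
# helper names; not the paper's implementation). Assumes an RBF kernel and
# k-means clustering; bases come from a truncated SVD per cluster block.
import numpy as np
from sklearn.cluster import KMeans

def rbf(X, Y, gamma):
    """Dense RBF kernel block k(x, y) = exp(-gamma * ||x - y||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def bbf_sketch(X, gamma, n_clusters=4, rank=10, seed=0):
    """Return (bases, M, groups): per-cluster bases U_i, the grid of small
    coupling blocks M_ij, and the index groups defining the clusters."""
    labels = KMeans(n_clusters, n_init=10, random_state=seed).fit_predict(X)
    groups = [np.flatnonzero(labels == c) for c in range(n_clusters)]

    # Basis per cluster: leading left singular vectors of the cluster's
    # kernel rows against all points (a stand-in for sampled construction).
    bases = []
    for idx in groups:
        Ui, _, _ = np.linalg.svd(rbf(X[idx], X, gamma), full_matrices=False)
        bases.append(Ui[:, :min(rank, Ui.shape[1])])

    # Coupling blocks M_ij = U_i^T K_ij U_j; only these rank-by-rank
    # blocks (plus the bases) are stored.
    M = [[bases[i].T @ rbf(X[gi], X[gj], gamma) @ bases[j]
          for j, gj in enumerate(groups)] for i, gi in enumerate(groups)]
    return bases, M, groups

def bbf_matvec(bases, M, groups, v):
    """Apply the approximation to a vector without forming the full matrix."""
    out = np.zeros(sum(len(g) for g in groups))
    coeffs = [U.T @ v[g] for U, g in zip(bases, groups)]
    for i, (Ui, gi) in enumerate(zip(bases, groups)):
        out[gi] = Ui @ sum(M[i][j] @ coeffs[j] for j in range(len(groups)))
    return out

# Usage: compare against the exact dense kernel on a small random problem.
rng = np.random.default_rng(0)
X = rng.standard_normal((300, 5))
bases, M, groups = bbf_sketch(X, gamma=0.5)
v = rng.standard_normal(300)
err = np.linalg.norm(bbf_matvec(bases, M, groups, v) - rbf(X, X, 0.5) @ v)
print("relative matvec error:", err / np.linalg.norm(rbf(X, X, 0.5) @ v))
```

With p clusters of size n/p and rank-r bases, this stores p bases of size (n/p) x r plus p^2 coupling blocks of size r x r, i.e. O(nr + p^2 r^2) entries rather than O(n^2), which is the memory saving the abstract refers to.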