Block Basis Factorization for Scalable Kernel Evaluation

Type: Article

Publication Date: 2019-01-01

Citations: 20

DOI: https://doi.org/10.1137/18m1212586

Abstract

Kernel methods are widespread in machine learning; however, they are limited by the quadratic complexity of constructing, applying, and storing kernel matrices. Low-rank matrix approximation algorithms are widely used to address this problem and reduce the arithmetic and storage cost. However, we observe that for some datasets with wide intraclass variability, the optimal kernel parameter for smaller classes yields a matrix that is less well approximated by low-rank methods. In this paper, we propose an efficient structured low-rank approximation method, the block basis factorization (BBF), together with a fast construction algorithm for approximating radial basis function (RBF) kernel matrices. Our approach has linear memory cost and a linear floating-point operation count for many machine learning kernels. BBF works for a wide range of kernel bandwidth parameters and significantly extends the domain of applicability of low-rank approximation methods. Our empirical results demonstrate the stability of the method and its superiority over state-of-the-art kernel approximation algorithms.
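As a rough illustration of the block-structured low-rank idea described in the abstract, the Python sketch below clusters the data, builds a per-cluster column basis from sampled landmark columns, and couples the bases through small interaction blocks, giving an approximation of the form K ≈ U C Uᵀ with block-diagonal U. This is a minimal sketch, not the paper's BBF construction: the function names, the k-means clustering, the random landmark selection, and the dense formation of off-diagonal blocks when computing the coupling matrices are all simplifying assumptions; the paper's fast construction algorithm avoids the dense blocks to reach linear cost.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def rbf(X, Y, gamma):
    """Gaussian RBF kernel block K(X, Y)."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def bbf_like_factor(X, gamma, n_clusters=4, rank=20, seed=0):
    """Build K ~ U C U^T with block-diagonal U (one basis per cluster).
    Hypothetical helper for illustration; not the paper's algorithm."""
    rng = np.random.default_rng(seed)
    _, labels = kmeans2(X, n_clusters, minit='++', seed=seed)
    blocks = [np.where(labels == i)[0] for i in range(n_clusters)]
    blocks = [idx for idx in blocks if len(idx) > 0]  # drop empty clusters

    bases = []
    for idx in blocks:
        m = min(rank, len(idx))
        landmarks = rng.choice(idx, size=m, replace=False)
        # Orthonormal basis for the sampled landmark columns of this block.
        Q, _ = np.linalg.qr(rbf(X[idx], X[landmarks], gamma))
        bases.append(Q)  # shape (n_i, m_i)

    # Coupling blocks C[i][j] = U_i^T K_ij U_j.  Formed from dense K_ij here
    # purely for clarity; a linear-cost construction would avoid this step.
    k = len(blocks)
    C = [[bases[i].T @ rbf(X[blocks[i]], X[blocks[j]], gamma) @ bases[j]
          for j in range(k)] for i in range(k)]
    return blocks, bases, C

def bbf_matvec(blocks, bases, C, v):
    """Apply the factorization to a vector without forming the n-by-n kernel."""
    w = [bases[i].T @ v[idx] for i, idx in enumerate(blocks)]  # project per block
    out = np.zeros(v.shape[0])
    for i, idx in enumerate(blocks):
        out[idx] = bases[i] @ sum(C[i][j] @ w[j] for j in range(len(blocks)))
    return out

# Tiny usage example on synthetic data.
if __name__ == "__main__":
    X = np.random.default_rng(1).normal(size=(500, 5))
    blocks, bases, C = bbf_like_factor(X, gamma=0.5, n_clusters=5, rank=30)
    v = np.ones(500)
    approx = bbf_matvec(blocks, bases, C, v)
    exact = rbf(X, X, 0.5) @ v
    print("relative error:", np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```

Once the factors are built, `bbf_matvec` applies the approximation as U (C (Uᵀ v)) block by block, which is where the storage and application savings come from: only the per-cluster bases and the small coupling blocks are kept in memory.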

Locations

  • SIAM Journal on Matrix Analysis and Applications
  • arXiv (Cornell University)
  • DataCite API

Similar Works

  • Structured Block Basis Factorization for Scalable Kernel Matrix Evaluation (2015). Ruoxi Wang, Yingzhou Li, Michael W. Mahoney, Eric Darve
  • Low-rank decomposition meets kernel learning: A generalized Nyström method (2017). Liang Lan, Kai Zhang, Hancheng Ge, Wei Cheng, Jun Liu, Andreas Rauber, Xiaoli Li, Jun Wang, Hongyuan Zha
  • Hierarchically Compositional Kernels for Scalable Nonparametric Learning (2016). Jie Chen, Haim Avron, Vikas Sindhwani
  • Scalable Nonparametric Low-Rank Kernel Learning Using Block Coordinate Descent (2014). Enliang Hu, James T. Kwok
  • On the Numerical Rank of Radial Basis Function Kernels in High Dimensions (2018). Ruoxi Wang, Yingzhou Li, Eric Darve
  • Predictive low-rank decomposition for kernel methods (2005). Francis Bach, Michael I. Jordan
  • Spectrum-Revealing Cholesky Factorization for Kernel Methods (2016). Jianwei Xiao, Ming Gu
  • Learning low-rank kernel matrices (2006). Brian Kulis, Mátyás A. Sustik, Inderjit S. Dhillon
  • Inductive Kernel Low-rank Decomposition with Priors: A Generalized Nystrom Method (2012). Kai Zhang, Liang Lan, Jun Liu, Andreas Rauber, Fabian Moerchen
  • An $N \log N$ Parallel Fast Direct Solver for Kernel Matrices (2017). Chenhan D. Yu, William B. March, George Biros
  • Fastfood: Approximate Kernel Expansions in Loglinear Time (2014). Quoc V. Le, Tamás Sarlós, Alexander J. Smola
  • Learning in high-dimensional feature spaces using ANOVA-based fast matrix-vector multiplication (2022). Franziska Nestler, Martin Stoll, Theresa Wagner
  • Nonlinear Matrix Approximation with Radial Basis Function Components (2021). Elizaveta Rebrova, Yu-Hang Tang
  • Large Scale Kernel Learning using Block Coordinate Descent (2016). Stephen Tu, Rebecca Roelofs, Shivaram Venkataraman, Benjamin Recht
  • Asymptotic error bounds for kernel-based Nyström low-rank approximation matrices (2013). Lo‐Bin Chang, Zhidong Bai, Su-Yun Huang, Chii-Ruey Hwang
  • Approximate l-Fold Cross-Validation with Least Squares SVM and Kernel Ridge Regression (2013). Richard E. Edwards, Hao Zhang, Lynne E. Parker, Joshua New