Low-Precision Random Fourier Features for Memory-Constrained Kernel Approximation

  • Jian Zhang
  • Avner May
  • Tri Dao
  • Christopher Ré

Research output: Contribution to journal › Conference article › peer-review

3 Scopus citations

Abstract

We investigate how to train kernel approximation methods that generalize well under a memory budget. Building on recent theoretical work, we define a measure of kernel approximation error which we find to be more predictive of the empirical generalization performance of kernel approximation methods than conventional metrics. An important consequence of this definition is that a kernel approximation matrix must be high rank to attain close approximation. Because storing a high-rank approximation is memory intensive, we propose using a low-precision quantization of random Fourier features (LP-RFFs) to build a high-rank approximation under a memory budget. Theoretically, we show quantization has a negligible effect on generalization performance in important settings. Empirically, we demonstrate across four benchmark datasets that LP-RFFs can match the performance of full-precision RFFs and the Nyström method, with 3x-10x and 50x-460x less memory, respectively.
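The core idea in the abstract, approximating a kernel with random Fourier features and then quantizing each feature to a few bits, can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' released implementation: the function names (`rff_features`, `quantize`), the choice of 4 bits, and the toy data are all assumptions; the stochastic-rounding scheme follows the general LP-RFF recipe of rounding features to a low-bit grid.

```python
import numpy as np

def rff_features(X, n_features, sigma=1.0, rng=None):
    """Standard random Fourier features for the Gaussian (RBF) kernel
    k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    W = rng.normal(0.0, 1.0 / sigma, size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def quantize(Z, n_bits, rng=None):
    """Stochastically round each feature to a (2^n_bits)-level grid.

    Features lie in [-r, r] with r = sqrt(2 / D); stochastic rounding
    keeps the quantized features unbiased, so the inner products still
    approximate the kernel."""
    rng = np.random.default_rng(rng)
    r = np.sqrt(2.0 / Z.shape[1])
    levels = 2 ** n_bits - 1
    scaled = (Z + r) / (2 * r) * levels        # map [-r, r] -> [0, levels]
    low = np.floor(scaled)
    p_up = scaled - low                        # probability of rounding up
    q = low + (rng.random(Z.shape) < p_up)     # stochastic round to integer grid
    return q / levels * (2 * r) - r            # map back to [-r, r]

# Compare kernel approximations on toy data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
sq_dists = np.sum((X[:, None] - X[None]) ** 2, axis=-1)
K_exact = np.exp(-0.5 * sq_dists)

Z = rff_features(X, n_features=2000, rng=1)    # full-precision RFFs
Zq = quantize(Z, n_bits=4, rng=2)              # 4-bit LP-RFFs

print("full-precision err:", np.abs(Z @ Z.T - K_exact).mean())
print("4-bit err:        ", np.abs(Zq @ Zq.T - K_exact).mean())
```

In a real memory-constrained deployment the quantized features would be packed into b-bit integers rather than kept as floats as done here; the simulation only shows that the low-precision inner products track the exact kernel nearly as well as the full-precision ones, which is the trade-off the paper quantifies.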

Original language: English (US)
Pages (from-to): 1264-1274
Number of pages: 11
Journal: Proceedings of Machine Learning Research
Volume: 89
State: Published - 2019
Externally published: Yes
Event: 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019 - Naha, Japan
Duration: Apr 16 2019 - Apr 18 2019

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
