Kernel subspace learning for pattern classification

Yinan Yu, Konstantinos Diamantaras, Tomas McKelvey, S. Y. Kung

Research output: Chapter in Book/Report/Conference proceeding › Chapter

1 Scopus citation

Abstract

Kernel methods are nonparametric feature extraction techniques that attempt to boost the learning capability of machine learning algorithms using nonlinear transformations. However, one major challenge in their basic form is that the computational complexity and the memory requirement do not scale well with the training set size. Kernel approximation is commonly employed to resolve this issue. Essentially, kernel approximation is equivalent to learning an approximated subspace in the high-dimensional feature vector space induced and characterized by the kernel function. With streaming data acquisition, approximated subspaces can be constructed adaptively. Explicit feature vectors are then extracted by a transformation onto the approximated subspace, and linear learning techniques can subsequently be applied. From a computational point of view, operations in kernel methods can easily be parallelized, and modern infrastructures can be utilized to achieve efficient computing. Moreover, the extracted explicit feature vectors can easily be interfaced with other learning techniques.
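
The pipeline the abstract describes can be made concrete with the Nyström method listed among the keywords. Below is a minimal, self-contained sketch, not the authors' implementation: the function names rbf_kernel and nystrom_features, the RBF kernel choice, and the NumPy-based linear learner are all assumptions for illustration. A set of m landmark points spans the approximated subspace, explicit feature vectors are the projections onto it, and a plain linear learner is then trained on those features.

```python
# Hypothetical sketch of Nystrom kernel approximation followed by a
# linear learner (illustrative only; not the chapter's implementation).
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def nystrom_features(X, landmarks, gamma=1.0):
    # K_mm: kernel among the m landmarks; K_nm: kernel between data and landmarks.
    K_mm = rbf_kernel(landmarks, landmarks, gamma)
    K_nm = rbf_kernel(X, landmarks, gamma)
    # Eigendecompose K_mm and project onto the m-dimensional subspace:
    # phi(x) = Lambda^{-1/2} U^T k_m(x), so that phi(x)·phi(y) ≈ k(x, y).
    vals, vecs = np.linalg.eigh(K_mm)
    keep = vals > 1e-10  # discard numerically null directions
    return K_nm @ vecs[:, keep] / np.sqrt(vals[keep])

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = (X[:, 0] * X[:, 1] > 0).astype(float)  # a nonlinearly separable toy label
landmarks = X[rng.choice(500, size=50, replace=False)]
Z = nystrom_features(X, landmarks, gamma=0.5)
# Any linear learner can consume Z; here, regularized least squares.
w = np.linalg.solve(Z.T @ Z + 1e-3 * np.eye(Z.shape[1]), Z.T @ y)
print("train accuracy:", ((Z @ w > 0.5) == (y > 0.5)).mean())
```

In a streaming setting, the landmark set could be grown or updated adaptively as new samples arrive, and since the method reduces to dense matrix products, it parallelizes naturally on GPU (CUDA) or cluster (Spark) infrastructures of the kind named in the keywords.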

Original language: English (US)
Title of host publication: Adaptive Learning Methods for Nonlinear System Modeling
Publisher: Elsevier
Pages: 127-147
Number of pages: 21
ISBN (Electronic): 9780128129760
ISBN (Print): 9780128129777
DOIs
State: Published - Jan 1 2018

All Science Journal Classification (ASJC) codes

  • General Engineering

Keywords

  • CUDA
  • Classification
  • GPU
  • Kernel approximation
  • Nyström
  • Spark
  • Subspace learning
