Statistical learning theory: A tutorial

Sanjeev R. Kulkarni, Gilbert Harman

Research output: Contribution to journal › Article › peer-review



In this article, we provide a tutorial overview of some aspects of statistical learning theory, which also goes by other names such as statistical pattern recognition, nonparametric classification and estimation, and supervised learning. We focus on the problem of two-class pattern classification for various reasons. This problem is rich enough to capture many of the interesting aspects that are present in the cases of more than two classes and in the problem of estimation, and many of the results can be extended to these cases. Focusing on two-class pattern classification simplifies our discussion, and yet it is directly applicable to a wide range of practical settings. We begin with a description of the two-class pattern recognition problem. We then discuss various classical and state-of-the-art approaches to this problem, with a focus on fundamental formulations, algorithms, and theoretical results. In particular, we describe nearest neighbor methods, kernel methods, multilayer perceptrons, Vapnik-Chervonenkis theory, support vector machines, and boosting.

WIREs Comp Stat 2011, 3, 543-556. DOI: 10.1002/wics.179. For further resources related to this article, please visit the WIREs website.
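The two-class setting the abstract describes can be illustrated with the simplest of the methods it names, the nearest neighbor rule: classify a query point with the label of its closest training point. The sketch below is our own toy illustration (the data and function name are assumptions, not from the article):

```python
# Minimal 1-nearest-neighbor classifier for two-class pattern
# classification, using only the standard library (toy data is hypothetical).
import math

def nearest_neighbor_classify(train, query):
    """Return the label of the training point closest to `query`.

    `train` is a list of ((x, y), label) pairs; labels are 0 or 1.
    """
    def dist(p, q):
        # Euclidean distance between two 2-D points.
        return math.hypot(p[0] - q[0], p[1] - q[1])
    closest = min(train, key=lambda pair: dist(pair[0], query))
    return closest[1]

# Toy data: class 0 clusters near the origin, class 1 near (5, 5).
train = [((0.0, 0.0), 0), ((1.0, 0.5), 0), ((5.0, 5.0), 1), ((4.5, 5.5), 1)]
print(nearest_neighbor_classify(train, (0.2, 0.3)))  # query near class 0
```

Kernel methods, support vector machines, and boosting, also surveyed in the article, replace this simple distance rule with richer decision functions, but the input/output structure of the two-class problem is the same.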

Original language: English (US)
Pages (from-to): 543-556
Number of pages: 14
Journal: Wiley Interdisciplinary Reviews: Computational Statistics
Volume: 3
Issue number: 6
State: Published - Nov 2011

All Science Journal Classification (ASJC) codes

  • Statistics and Probability


Keywords

  • Classification
  • Kernel methods
  • Pattern recognition
  • Statistical learning
  • Supervised learning


