Abstract
In this article, we provide a tutorial overview of some aspects of statistical learning theory, which also goes by other names such as statistical pattern recognition, nonparametric classification and estimation, and supervised learning. We focus on the problem of two-class pattern classification for several reasons: it is rich enough to capture many of the interesting aspects that arise with more than two classes and in the problem of estimation, and many of the results extend to those cases. Focusing on two classes also simplifies the discussion, yet the treatment remains directly applicable to a wide range of practical settings. We begin with a description of the two-class pattern recognition problem. We then discuss various classical and state-of-the-art approaches to this problem, with a focus on fundamental formulations, algorithms, and theoretical results. In particular, we describe nearest neighbor methods, kernel methods, multilayer perceptrons, Vapnik-Chervonenkis theory, support vector machines, and boosting.
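To make the two-class setup concrete, the following is a minimal sketch, not taken from the article, of one of the methods the abstract names: a 1-nearest-neighbor classifier. Given labeled training pairs (x_i, y_i) with y_i in {0, 1}, it predicts the label of a new point as that of its closest training point. The data, function names, and parameters here are illustrative assumptions, and only NumPy is assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set (illustrative only): two Gaussian classes
# in the plane, labeled 0 and 1.
n = 100
X0 = rng.normal(loc=-1.0, scale=1.0, size=(n, 2))  # class 0 samples
X1 = rng.normal(loc=+1.0, scale=1.0, size=(n, 2))  # class 1 samples
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

def nn_classify(x, X_train, y_train):
    """Predict the label of x as that of its nearest training point."""
    dists = np.linalg.norm(X_train - x, axis=1)
    return y_train[np.argmin(dists)]

# Classify a few fresh points drawn near the class-1 mean.
X_test = rng.normal(loc=1.0, scale=1.0, size=(10, 2))
preds = [nn_classify(x, X, y) for x in X_test]
print("predicted labels:", preds)
```

The survey's other methods (kernel rules, support vector machines, boosting) replace the nearest-point rule above with different decision functions fit to the same kind of labeled data.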
Original language | English (US) |
---|---|
Pages (from-to) | 543-556 |
Number of pages | 14 |
Journal | Wiley Interdisciplinary Reviews: Computational Statistics |
Volume | 3 |
Issue number | 6 |
DOIs | 10.1002/wics.179 |
State | Published - Nov 2011 |
All Science Journal Classification (ASJC) codes
- Statistics and Probability
Keywords
- Classification
- Kernel methods
- Pattern recognition
- Statistical learning
- Supervised learning