Statistical learning theory is the basic theory behind contemporary machine learning and pattern recognition, and it provides an excellent framework for the philosophy of induction. There are various paradigmatic approaches to specifying the problem of induction. One paradigm, the Bayesian one, assumes that one has an initial known subjective probability distribution satisfying certain more or less weak conditions, together with a method for updating one's probabilities (e.g., by conditionalization), and proves theorems about the results of such a method. Statistical learning theory represents another paradigm: it assumes there is an unknown objective probability distribution that characterizes the data and the new cases about which inferences are to be made, the goal being to do as well as possible in characterizing the new cases in terms of that unknown objective distribution. The basic theory attempts to specify what can be proved about various methods for using data to reach conclusions about new cases.
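The statistical-learning setup described above can be illustrated with a minimal sketch (not taken from the chapter; the distribution, noise rate, and threshold hypothesis class are all assumptions made here for illustration): a learner sees labeled samples drawn from an objective distribution it does not know, picks the hypothesis with the lowest error on the data (empirical risk minimization), and is then judged by its error rate on new cases drawn from the same distribution.

```python
import random

random.seed(0)

def draw_sample(n):
    """Stand-in for the unknown objective distribution: x is uniform on
    [0, 1], the label is 1 iff x > 0.6, and 10% of labels are flipped
    (noise). The learner never sees this definition, only the samples."""
    data = []
    for _ in range(n):
        x = random.random()
        y = 1 if x > 0.6 else 0
        if random.random() < 0.1:  # label noise
            y = 1 - y
        data.append((x, y))
    return data

def empirical_risk(threshold, data):
    """Fraction of examples misclassified by the rule 'predict 1 iff x > threshold'."""
    return sum(((1 if x > threshold else 0) != y) for x, y in data) / len(data)

# Hypothesis class: threshold rules. Pick the one minimizing training error.
train = draw_sample(200)
candidates = [i / 100 for i in range(101)]
best = min(candidates, key=lambda t: empirical_risk(t, train))

# The learner is judged on fresh cases from the same unknown distribution;
# no rule can beat the 10% noise rate, so a risk near 0.1 is close to optimal.
test = draw_sample(2000)
print(best, empirical_risk(best, test))
```

The point of the sketch is the division of roles: the objective distribution fixes what "doing well on new cases" means, while the learner only ever optimizes against the sample it was given.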
Original language: English (US)
Title of host publication: Philosophy of Statistics
Subtitle of host publication: Volume 7 in Handbook of the Philosophy of Science
Number of pages: 15
State: Published - Jan 1 2011