Statistical Learning Theory as a Framework for the Philosophy of Induction

Gilbert Harman, Sanjeev Kulkarni

Research output: Chapter in Book/Report/Conference proceeding › Chapter


Abstract

Statistical learning theory is the basic theory behind contemporary machine learning and pattern recognition. This chapter suggests that the theory provides an excellent framework for the philosophy of induction. There are various paradigmatic approaches to specifying the problem of induction. One such paradigm, the Bayesian approach, assumes one has an initially known subjective probability distribution satisfying certain more or less weak conditions, together with a method for updating one's probabilities, e.g. by conditionalization, and proves theorems about the results of such a method. Statistical learning theory represents another paradigm: it assumes there is an unknown objective probability distribution that characterizes both the data and the new cases about which inferences are to be made, the goal being to do as well as possible at characterizing the new cases in terms of that unknown objective distribution. The basic theory attempts to specify what can be proved about various methods for using data to reach conclusions about new cases.
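The contrast described above can be made concrete with a minimal sketch (not from the chapter; the hidden rule, hypothesis class, and sample sizes are all illustrative assumptions). The learner never sees the underlying distribution, only samples drawn from it, and is judged by its error rate on fresh cases from that same distribution. The learning method shown is empirical risk minimization over a simple class of threshold classifiers.

```python
import random

random.seed(0)

def sample(n):
    """Draw n labeled points from a distribution the learner never sees directly."""
    data = []
    for _ in range(n):
        x = random.uniform(0, 1)
        y = 1 if x > 0.5 else 0  # hidden labeling rule, unknown to the learner
        data.append((x, y))
    return data

def learn_threshold(train):
    """Empirical risk minimization over threshold classifiers
    h_t(x) = 1 if x > t else 0, for t on a coarse grid."""
    candidates = [i / 100 for i in range(101)]
    def emp_risk(t):
        return sum(1 for x, y in train if (1 if x > t else 0) != y) / len(train)
    return min(candidates, key=emp_risk)

train, test = sample(200), sample(1000)
t = learn_threshold(train)
# Error on new cases drawn from the same unknown distribution:
test_error = sum(1 for x, y in test if (1 if x > t else 0) != y) / len(test)
```

The point of the sketch is that the guarantee one wants is about `test_error` under the unknown distribution, not about the learner's subjective beliefs; statistical learning theory asks what can be proved about such methods given only the samples.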

Original language: English (US)
Title of host publication: Philosophy of Statistics
Publisher: Elsevier
Pages: 833-847
Number of pages: 15
ISBN (Print): 9780444518620
DOIs
State: Published - 2011

All Science Journal Classification (ASJC) codes

  • General Mathematics

