Rigorous learning curve bounds from statistical mechanics

David Haussler, Hyunjune Sebastian Seung, Michael Kearns, Naftali Tishby

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

32 Scopus citations

Abstract

In this paper we introduce and investigate a mathematically rigorous theory of learning curves that is based on ideas from statistical mechanics. The advantage of our theory over the well-established Vapnik-Chervonenkis theory is that our bounds can be considerably tighter in many cases, and are also more reflective of the true behavior (functional form) of learning curves. This behavior can often exhibit dramatic properties such as phase transitions, as well as power law asymptotics not explained by the VC theory. The disadvantages of our theory are that its application requires knowledge of the input distribution, and it is limited so far to finite cardinality function classes. We illustrate our results with many concrete examples of learning curve bounds derived from our theory.
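As context for the abstract's claim that the new bounds can be tighter than VC-style bounds for finite function classes, the classical baseline can be sketched: for a learner that outputs any hypothesis consistent with m examples drawn from a finite class F, with probability at least 1 - δ the true error is at most (ln|F| + ln(1/δ))/m. This is the standard textbook finite-class bound, not the statistical-mechanics bound derived in the paper; the function name and parameter values below are illustrative.

```python
import math

def finite_class_bound(m, class_size, delta):
    """Classical PAC bound for a consistent learner over a finite
    function class F: with probability >= 1 - delta, any hypothesis
    consistent with m i.i.d. examples has true error at most
    (ln|F| + ln(1/delta)) / m."""
    return (math.log(class_size) + math.log(1.0 / delta)) / m

# The baseline bound decays as a 1/m power law in the sample size m;
# the paper's point is that tighter, distribution-dependent bounds
# can capture additional structure such as phase transitions.
curve = [finite_class_bound(m, class_size=1024, delta=0.05)
         for m in (10, 100, 1000)]
```

Note that this baseline depends only on the cardinality of the class, whereas the bounds in the paper also use knowledge of the input distribution.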

Original language: English (US)
Title of host publication: Proceedings of the 7th Annual Conference on Computational Learning Theory, COLT 1994
Publisher: Association for Computing Machinery
Pages: 76-87
Number of pages: 12
ISBN (Electronic): 0897916557
DOIs
State: Published - Jul 16 1994
Event: 7th Annual Conference on Computational Learning Theory, COLT 1994 - New Brunswick, United States
Duration: Jul 12 1994 - Jul 15 1994

Publication series

Name: Proceedings of the Annual ACM Conference on Computational Learning Theory
Volume: Part F129415


All Science Journal Classification (ASJC) codes

  • Software
  • Theoretical Computer Science
  • Artificial Intelligence

