A Metric Entropy Bound is Not Sufficient for Learnability

R. M. Dudley, S. R. Kulkarni, T. Richardson, O. Zeitouni

Research output: Contribution to journal › Article › peer-review

Abstract

We prove by means of a counterexample that it is not sufficient, for probably approximately correct (PAC) learning under a class of distributions, to have a uniform bound on the metric entropy of the class of concepts to be learned. This settles a conjecture of Benedek and Itai.
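For context, a brief sketch of the quantities named in the abstract may help; the definitions below follow the usual conventions of the Benedek-Itai framework and are not quoted from the paper itself.

% Sketch of the standard definitions (conventional notation; an assumption, not quoted from the paper).
% For a probability measure P on a domain X and concepts A, B ⊆ X, the distance
% between concepts is the probability of their symmetric difference:
\[
  d_P(A,B) \;=\; P(A \,\triangle\, B).
\]
% Under one common convention, the epsilon-covering number of a concept class
% \mathcal{C} with respect to d_P is
\[
  N(\varepsilon,\mathcal{C},d_P) \;=\; \min\bigl\{\,|\mathcal{F}| \;:\;
    \forall\, C\in\mathcal{C}\ \exists\, F\in\mathcal{F}\ \ d_P(C,F)\le\varepsilon \,\bigr\},
\]
% and \log N(\varepsilon,\mathcal{C},d_P) is the metric entropy of \mathcal{C} under P.
% The uniform bound referred to in the abstract is the condition
\[
  \sup_{P\in\mathcal{P}} \log N(\varepsilon,\mathcal{C},d_P) \;<\; \infty
  \qquad \text{for every } \varepsilon>0,
\]
% where \mathcal{P} is the class of distributions; the counterexample shows that this
% condition alone does not guarantee PAC learnability of \mathcal{C} under \mathcal{P}.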

Original language: English (US)
Pages (from-to): 883-885
Number of pages: 3
Journal: IEEE Transactions on Information Theory
Volume: 40
Issue number: 3
DOIs
State: Published - May 1994

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

Keywords

  • Learning
  • PAC
  • class of distributions
  • estimation
  • metric entropy
