A Metric Entropy Bound is Not Sufficient for Learnability

R. M. Dudley, S. R. Kulkarni, T. Richardson, O. Zeitouni

Research output: Contribution to journal › Article › peer-review

8 Scopus citations


We prove by means of a counterexample that it is not sufficient, for probably approximately correct (PAC) learning under a class of distributions, to have a uniform bound on the metric entropy of the class of concepts to be learned. This settles a conjecture of Benedek and Itai.
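For context, the notion of metric entropy at issue can be sketched as follows; the notation here is the standard one for covering numbers and is assumed rather than quoted from the paper. For a concept class \(\mathcal{C}\) and a distribution \(P\), one measures distance between concepts by the probability of their symmetric difference, and the metric entropy is the logarithm of the minimal number of \(\varepsilon\)-balls needed to cover \(\mathcal{C}\):

```latex
% Pseudometric on the concept class \mathcal{C} induced by a distribution P
% (standard covering-number notation; assumed, not taken verbatim from the paper).
\[
  d_P(C_1, C_2) = P(C_1 \,\triangle\, C_2), \qquad C_1, C_2 \in \mathcal{C},
\]
% Covering number: fewest \varepsilon-balls (in d_P) needed to cover \mathcal{C};
% the metric entropy is \log N(\varepsilon, \mathcal{C}, d_P).
\[
  N(\varepsilon, \mathcal{C}, d_P)
    = \min\Bigl\{\, n : \exists\, C_1, \dots, C_n \in \mathcal{C}
      \text{ with } \mathcal{C} \subseteq \bigcup_{i=1}^{n}
      \{ C : d_P(C, C_i) \le \varepsilon \} \,\Bigr\}.
\]
```

A uniform bound over a class of distributions \(\mathcal{P}\) then means \(\sup_{P \in \mathcal{P}} N(\varepsilon, \mathcal{C}, d_P) < \infty\) for every \(\varepsilon > 0\); the counterexample shows that this condition alone does not imply PAC learnability under \(\mathcal{P}\).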

Original language: English (US)
Pages (from-to): 883-885
Number of pages: 3
Journal: IEEE Transactions on Information Theory
Issue number: 3
State: Published - May 1994

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences


Keywords

  • Learning
  • PAC
  • class of distributions
  • estimation
  • metric entropy


