Learning decision rules for pattern classification under a family of probability measures

S. R. Kulkarni, M. Vidyasagar

Research output: Contribution to conferencePaper

Abstract

In this paper, the PAC learnability of decision rules for pattern classification under a family of probability measures is investigated. It is shown that uniform boundedness of the metric entropy of the class of decision rules is both necessary and sufficient for learnability, provided the family of probability measures is either compact, or contains an interior point, with respect to the total variation metric. It is then shown that learnability is preserved under finite unions of families of probability measures, and that learnability with respect to each of a finite number of measures implies learnability with respect to the convex hull of the families of 'commensurate' probability measures.
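For context, the standard notion of metric entropy referred to in the abstract can be sketched as follows (these are the usual textbook definitions, not necessarily the paper's exact formulation):

```latex
% Let \mathcal{C} be a class of decision rules and P a probability measure.
% The pseudometric induced by P on \mathcal{C} is typically
%   d_P(f, g) = P\{x : f(x) \neq g(x)\}.
% The \epsilon-covering number N(\epsilon, \mathcal{C}, d_P) is the smallest
% number of d_P-balls of radius \epsilon needed to cover \mathcal{C}, and the
% metric entropy is its logarithm:
%   H(\epsilon, \mathcal{C}, d_P) = \log N(\epsilon, \mathcal{C}, d_P).
% Uniform boundedness over a family \mathcal{P} of measures then reads:
\sup_{P \in \mathcal{P}} H(\epsilon, \mathcal{C}, d_P) < \infty
\quad \text{for every } \epsilon > 0.
```

Under the conditions stated in the abstract (compactness, or an interior point, of \(\mathcal{P}\) in total variation), this uniform finiteness is what the paper shows to be equivalent to PAC learnability of \(\mathcal{C}\).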

Original language: English (US)
State: Published - Dec 1 1994
Event: 1994 IEEE International Symposium on Information Theory - Trondheim, Norway
Duration: Jun 27 1994 - Jul 1 1994

Other

Other: Proceedings of the 1994 IEEE International Symposium on Information Theory
City: Trondheim, Norway
Period: 6/27/94 - 7/1/94

All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering

  • Cite this

    Kulkarni, S. R., & Vidyasagar, M. (1994). Learning decision rules for pattern classification under a family of probability measures. Paper presented at the 1994 IEEE International Symposium on Information Theory, Trondheim, Norway.