TY - JOUR
T1 - Learning decision rules for pattern classification under a family of probability measures
AU - Kulkarni, Sanjeev R.
AU - Vidyasagar, Mathukumalli
N1 - Funding Information:
Manuscript received November 1, 1993; revised June 10, 1996. The work of S. R. Kulkarni was supported in part by the National Science Foundation under Grant IRI-9209577 and NYI Award IRI-9457645 and by the U.S. Army Research Office under Grant DAAL03-92-G-0320. S. R. Kulkarni is with the Department of Electrical Engineering, Princeton University, Princeton, NJ 08544 USA. M. Vidyasagar is with the Centre for Artificial Intelligence and Robotics, Bangalore 560 001, India. Publisher Item Identifier S 0018-9448(97)00162-4.
PY - 1997
Y1 - 1997
AB - In this paper, uniformly consistent estimation (learnability) of decision rules for pattern classification under a family of probability measures is investigated. It is shown that uniform boundedness of the metric entropy of the class of decision rules is both necessary and sufficient for learnability under each of two conditions: i) the family of probability measures is totally bounded with respect to the total variation metric, and ii) the family of probability measures contains an interior point in the same metric. In particular, this shows that, insofar as uniform consistency is concerned, when the family of distributions contains a total variation neighborhood, nothing is gained by the knowledge that the distribution belongs to this family. Two sufficient conditions for learnability are then presented. Specifically, it is shown that learnability with respect to each of a finite collection of families of probability measures implies learnability with respect to their union; also, learnability with respect to each of a finite number of measures implies learnability with respect to the convex hull of the corresponding families of uniformly absolutely continuous probability measures.
KW - Class of distributions
KW - Decision rules
KW - Estimation
KW - Metric entropy
KW - PAC learning
KW - Pattern classification
KW - Uniform consistency
KW - VC dimension
UR - http://www.scopus.com/inward/record.url?scp=0030736447&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0030736447&partnerID=8YFLogxK
U2 - 10.1109/18.567668
DO - 10.1109/18.567668
M3 - Article
AN - SCOPUS:0030736447
SN - 0018-9448
VL - 43
SP - 154
EP - 166
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 1
ER -