TY - GEN

T1 - A note on the use of probabilities by mechanical learners

AU - Martin, Eric

AU - Osherson, Daniel

PY - 1995/1/1

Y1 - 1995/1/1

N2 - We raise the following problem: in a probabilistic context, is it always fruitful for a machine to compute probabilities? The question is made precise in a paradigm of the limit-identification kind, where a learner must discover almost surely whether an infinite sequence of heads and tails belongs to an effective subset S of the Cantor space. In this context, a successful strategy for an ineffective learner is to compute, at each stage, the conditional probability that he is faced with an element of S, given the data received so far. We show that an effective learner should not proceed this way in all circumstances. Indeed, even if he gets an optimal description of a set S, and even if some machine can always compute the conditional probability for S given any data, an effective learner optimizes his inductive competence only if he does not compute the relevant probabilities. We conclude that the advice "compute probabilities whenever you can" should sometimes be received with caution in the context of machine learning.

AB - We raise the following problem: in a probabilistic context, is it always fruitful for a machine to compute probabilities? The question is made precise in a paradigm of the limit-identification kind, where a learner must discover almost surely whether an infinite sequence of heads and tails belongs to an effective subset S of the Cantor space. In this context, a successful strategy for an ineffective learner is to compute, at each stage, the conditional probability that he is faced with an element of S, given the data received so far. We show that an effective learner should not proceed this way in all circumstances. Indeed, even if he gets an optimal description of a set S, and even if some machine can always compute the conditional probability for S given any data, an effective learner optimizes his inductive competence only if he does not compute the relevant probabilities. We conclude that the advice "compute probabilities whenever you can" should sometimes be received with caution in the context of machine learning.

UR - http://www.scopus.com/inward/record.url?scp=84955569647&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84955569647&partnerID=8YFLogxK

U2 - 10.1007/3-540-59119-2_183

DO - 10.1007/3-540-59119-2_183

M3 - Conference contribution

AN - SCOPUS:84955569647

SN - 9783540591191

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 261

EP - 271

BT - Computational Learning Theory - 2nd European Conference, EuroCOLT 1995, Proceedings

A2 - Vitanyi, Paul

PB - Springer Verlag

T2 - 2nd European Conference on Computational Learning Theory, EuroCOLT 1995

Y2 - 13 March 1995 through 15 March 1995

ER -