## Abstract

This paper presents a polynomial-time algorithm for inferring a probabilistic generalization of the class of read-once Boolean formulas over the usual basis {AND, OR, NOT}. The algorithm effectively infers a good approximation of the target formula when provided with random examples chosen according to any product distribution, i.e., any distribution in which the setting of each input bit is chosen independently of the settings of the other bits. Since the class of formulas considered includes ordinary read-once Boolean formulas, our result shows that such formulas are PAC learnable (in the sense of Valiant) against any product distribution (for instance, against the uniform distribution). Furthermore, this class of probabilistic formulas includes read-once formulas whose behavior has been corrupted by large amounts of random noise. Such noise may affect the formula's output ("misclassification noise"), the input bits ("attribute noise"), or the behavior of individual gates of the formula. Thus, in this setting, we show that read-once formulas can be inferred (approximately) despite large amounts of noise affecting the formula's behavior.
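To make the learning setting concrete, the sketch below illustrates the three notions the abstract relies on: a read-once formula (each input bit is read exactly once), examples drawn from a product distribution (each bit set independently), and misclassification noise (the label is flipped with some fixed probability). The particular formula and parameter names here are illustrative, not taken from the paper.

```python
import random

def sample_product_distribution(probs, rng):
    """Draw one input vector from a product distribution:
    bit i is set to 1 with probability probs[i], independently
    of all other bits."""
    return [1 if rng.random() < p else 0 for p in probs]

def formula(x):
    """An example read-once formula over {AND, OR, NOT}:
    (x0 AND NOT x1) OR (x2 AND x3).
    Each input bit appears exactly once."""
    return int((x[0] and not x[1]) or (x[2] and x[3]))

def noisy_example(probs, eta, rng):
    """Return an example (x, label) where x is drawn from the
    product distribution and the formula's output is flipped
    with probability eta (misclassification noise)."""
    x = sample_product_distribution(probs, rng)
    y = formula(x)
    if rng.random() < eta:
        y = 1 - y
    return x, y
```

A learner in this model sees only a stream of such `(x, y)` pairs and must output a hypothesis close to `formula` under the same distribution; the paper's result is that this is achievable in polynomial time for any choice of `probs`.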

Field | Value
---|---
Original language | English (US)
Pages (from-to) | 47-81
Number of pages | 35
Journal | Machine Learning
Volume | 14
Issue number | 1
DOIs | |
State | Published - Jan 1994
Externally published | Yes

## All Science Journal Classification (ASJC) codes

- Software
- Artificial Intelligence

## Keywords

- computational learning theory
- learning with noise
- PAC-learning
- product distributions
- read-once formulas