Abstract
The effect of the structure of the input distribution on the complexity of learning a pattern classification task is investigated. Using statistical mechanics, we study the performance of a winner-take-all machine at learning to classify points generated by a mixture of K Gaussian distributions ("clusters") in R^N with intercluster distance u (relative to the cluster width). In the limit of large separation, u ≫ 1, the number of examples required for learning scales as NKu^(-p), where the exponent p is 2 for zero-temperature Gibbs learning and 4 for the Hebb rule.
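The setting described above can be sketched numerically. The snippet below is an illustration of the problem setup, not the paper's analysis: it draws points from K unit-width Gaussian clusters in R^N whose centers are separated by roughly u, builds Hebb-rule prototypes (class means of the training examples), and classifies by winner-take-all (largest overlap with a prototype). All parameter values and the center-placement scheme are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed demo parameters: dimension N, number of clusters K,
# intercluster distance u (in units of the cluster width), P examples.
N, K, u = 20, 3, 6.0
P = 300

# Random Gaussian centers, scaled so that the expected pairwise
# center-to-center distance is approximately u (an assumption of this sketch).
centers = rng.standard_normal((K, N)) * (u / np.sqrt(2 * N))

# Training set: pick a cluster label, then add unit-width Gaussian noise.
labels = rng.integers(0, K, size=P)
X = centers[labels] + rng.standard_normal((P, N))

# Hebb-rule prototypes: each weight vector is the mean of its class's examples.
W = np.stack([X[labels == k].mean(axis=0) for k in range(K)])

# Winner-take-all: assign each point to the prototype with the largest overlap.
pred = np.argmax(X @ W.T, axis=1)
accuracy = (pred == labels).mean()
print(f"training accuracy at u = {u}: {accuracy:.2f}")
```

With u well above 1, as in the paper's separation limit, the clusters barely overlap and the Hebb-rule prototypes classify the training points almost perfectly; shrinking u toward 1 makes the task markedly harder, consistent with the NKu^(-p) scaling of the required number of examples.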
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 3167-3170 |
| Number of pages | 4 |
| Journal | Physical Review Letters |
| Volume | 70 |
| Issue number | 20 |
| DOIs | |
| State | Published - 1993 |
| Externally published | Yes |
All Science Journal Classification (ASJC) codes
- General Physics and Astronomy