Abstract
One approach to invariant object recognition employs a recurrent neural network as an associative memory. In the standard depiction of the network's state space, memories of objects are stored as attractive fixed points of the dynamics. I argue for a modification of this picture: if an object has a continuous family of instantiations, it should be represented by a continuous attractor. This idea is illustrated with a network that learns to complete patterns. To perform the task of filling in missing information, the network develops a continuous attractor that models the manifold from which the patterns are drawn. From a statistical viewpoint, the pattern completion task allows a formulation of unsupervised learning in terms of regression rather than density estimation.
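The continuous-attractor idea in the abstract can be sketched with a toy network. This is a hypothetical illustration, not the paper's actual model or learning rule: patterns are cosine-tuned points on a ring, the weight matrix is a fixed projection onto the two-dimensional subspace containing that ring (so the whole ring is a continuum of fixed points), and pattern completion fills in missing units by relaxing to the attractor while clamping the observed units.

```python
import numpy as np

# A ring of patterns x_i(theta) = cos(theta - phi_i), where phi_i is
# the preferred angle of unit i. All names/parameters are illustrative.
N = 64
phi = 2 * np.pi * np.arange(N) / N
c, s = np.cos(phi), np.sin(phi)

# Weights: projection onto span{c, s}, the plane containing the ring.
# Every pattern on the ring is then a fixed point of x -> W x,
# giving a continuous attractor rather than isolated memories.
W = (2.0 / N) * (np.outer(c, c) + np.outer(s, s))

theta = 1.0
target = np.cos(theta - phi)            # the full pattern to recover

observed = np.arange(N) >= N // 2       # only half the units are observed
x = np.where(observed, target, 0.0)     # missing entries start at zero

# Completion: clamp observed entries, relax the rest toward the attractor.
for _ in range(100):
    x = np.where(observed, target, W @ x)

completion_error = np.max(np.abs(x - target))
```

Because the clamped entries are consistent with a point on the ring, the relaxation converges to that point, and `completion_error` shrinks to numerical zero; the network has "filled in" the missing half of the pattern from the manifold it encodes.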
| Original language | English (US) |
| --- | --- |
| Title of host publication | Advances in Neural Information Processing Systems 10 - Proceedings of the 1997 Conference, NIPS 1997 |
| Publisher | Neural Information Processing Systems Foundation |
| Pages | 654-660 |
| Number of pages | 7 |
| ISBN (Print) | 0262100762, 9780262100762 |
| State | Published - Jan 1 1998 |
| Externally published | Yes |
| Event | 11th Annual Conference on Neural Information Processing Systems, NIPS 1997 - Denver, CO, United States |
| Duration | Dec 1 1997 → Dec 6 1997 |
Other

| Other | 11th Annual Conference on Neural Information Processing Systems, NIPS 1997 |
| --- | --- |
| Country/Territory | United States |
| City | Denver, CO |
| Period | 12/1/97 → 12/6/97 |
All Science Journal Classification (ASJC) codes
- Computer Networks and Communications
- Information Systems
- Signal Processing