Abstract
Information processing capabilities of embedded systems presently lack the robustness and rich complexity found in biological systems. Endowing artificial systems with the ability to adapt to changing conditions requires algorithms that can rapidly learn from examples. We demonstrate the application of one such learning algorithm on an inexpensive robot constructed to perform simple sensorimotor tasks. The robot learns to track a particular object by discovering the salient visual and auditory cues unique to that object. The system uses a convolutional neural network to combine color, luminance, motion, and auditory information. The weights of the network are adjusted using feedback from a teacher to reflect the reliability of the various input channels in the surrounding environment. We also discuss how unsupervised learning can discover features in data without external interaction. An unsupervised algorithm based upon nonnegative matrix factorization is able to automatically learn the different parts of objects. Such a parts-based representation of data is crucial for robust object recognition.
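The abstract describes adjusting per-channel weights from teacher feedback so that reliable cues (color, luminance, motion, audio) come to dominate. The paper's actual update rule is not given here, so the following is only an illustrative sketch using a standard delta rule; the function name and learning rate are hypothetical.

```python
import numpy as np

def update_cue_weights(w, cues, target, lr=0.1):
    """One delta-rule step: raise the weight of channels whose evidence
    agreed with the teacher's feedback, lower the others.
    `cues` holds one detection score per channel (e.g. color, luminance,
    motion, audio); `target` is the teacher's 0/1 label."""
    pred = float(w @ cues)              # combined tracking response
    w = w + lr * (target - pred) * cues # error-driven weight change
    return np.clip(w, 0.0, None)        # keep channel reliabilities nonnegative

# Start with equal trust in all four channels.
w = np.full(4, 0.25)
cues = np.array([0.9, 0.2, 0.8, 0.1])   # color and motion fire strongly
w = update_cue_weights(w, cues, target=1.0)
```

After this step the strongly responding channels (color, motion) gain weight relative to the weak ones, which is the qualitative behavior the abstract attributes to the teacher signal.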
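The unsupervised component is nonnegative matrix factorization (NMF), which learns a parts-based representation by factoring a nonnegative data matrix into nonnegative factors. A minimal sketch using the well-known Lee–Seung multiplicative updates (the toy data and iteration count are assumptions, not taken from the paper):

```python
import numpy as np

def nmf(V, r, n_iter=200, eps=1e-9):
    """Factor a nonnegative matrix V (m x n) into W (m x r) @ H (r x n)
    with multiplicative updates; columns of W act as learned 'parts'."""
    rng = np.random.default_rng(0)
    m, n = V.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        # Multiplicative updates preserve nonnegativity of W and H.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy data: 30 nonnegative "images", each a column of 20 pixel values.
V = np.random.default_rng(1).random((20, 30))
W, H = nmf(V, r=5)
err = np.linalg.norm(V - W @ H)  # reconstruction error after fitting
```

Because both factors stay nonnegative, the representation is additive: each data column is built up from parts rather than canceled combinations, which is what makes NMF discover object parts.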
Original language | English (US)
---|---
State | Published - 1999
Externally published | Yes
Event | USENIX Workshop on Embedded Systems 1999, Cambridge, United States (Mar 29 1999 → Mar 31 1999)
Conference
Conference | USENIX Workshop on Embedded Systems 1999
---|---
Country/Territory | United States
City | Cambridge
Period | 3/29/99 → 3/31/99
All Science Journal Classification (ASJC) codes
- Software
- Electrical and Electronic Engineering
- Hardware and Architecture