Abstract
Two critical issues in back-propagation (BP) learning are the discrimination capability afforded by a given number of hidden units and the speed of convergence in learning. The number of hidden units must be sufficient to provide the discriminating capability required by the given application. On the other hand, training an excessively large number of synaptic weights may be computationally costly and unreliable. This makes it desirable to have an a priori estimate of the optimal number of hidden neurons. A closely related issue is the learning rate of the BP scheme. In general, fast learning is desirable, but not so fast as to incur instability in the iterative computation. Based on this principle, an optimal learning rate can be determined. An algebraic projection analysis (APA) is proposed that provides an analytical solution to both problems. The simulation results support the optimal hidden-unit size and learning rate theoretically predicted by the APA.
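The abstract's stability principle, that a learning rate can be too large for the iterative computation to converge, can be illustrated with a generic gradient-descent sketch. This is not the APA derivation from the paper; the quadratic loss, the curvature constant `c`, and the threshold `2/c` are standard textbook illustrations chosen here as assumptions.

```python
# Illustration of the learning-rate stability principle noted in the abstract:
# for a quadratic loss L(w) = 0.5 * c * w**2, the gradient-descent update
# w <- w - lr * c * w converges only when lr < 2 / c. The constant c and the
# starting point are arbitrary illustrative choices, not values from the paper.

def gradient_descent(lr, c=4.0, w0=1.0, steps=50):
    """Run plain gradient descent on L(w) = 0.5*c*w**2; return final |w|."""
    w = w0
    for _ in range(steps):
        w -= lr * c * w  # gradient of 0.5*c*w**2 is c*w
    return abs(w)

# Below the threshold 2/c = 0.5 the iterates shrink toward 0;
# above it, each step multiplies |w| by a factor larger than 1.
stable = gradient_descent(lr=0.4)
unstable = gradient_descent(lr=0.6)
print(stable, unstable)
```

With `lr=0.4` each step scales `w` by `1 - 1.6 = -0.6`, so the magnitude decays; with `lr=0.6` the factor is `-1.4` and the iterates oscillate with growing magnitude, which is the instability the abstract warns against.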
| Original language | English (US) |
|---|---|
| Pages (from-to) | 547 |
| Number of pages | 1 |
| Journal | Neural Networks |
| Volume | 1 |
| Issue number | 1 SUPPL |
| DOIs | |
| State | Published - 1988 |
| Event | International Neural Network Society 1988 First Annual Meeting, Boston, MA, USA, Sep 6 1988 → Sep 10 1988 |
All Science Journal Classification (ASJC) codes
- Cognitive Neuroscience
- Artificial Intelligence