Algebraic projection analysis for back-propagation learning

S. Y. Kung, J. N. Hwang

Research output: Contribution to journal › Conference article › peer-review

Abstract

Two critical issues in back-propagation (BP) learning are the discrimination capability for a given number of hidden units and the speed of convergence in learning. The number of hidden units must be sufficient to provide the discriminating capability required by the given application. On the other hand, training an excessively large number of synaptic weights may be computationally costly and unreliable. This makes it desirable to have an a priori estimate of the optimal number of hidden neurons. A closely related issue is the learning rate of the BP scheme. In general, fast learning is desirable, but not so fast as to incur instability in the iterative computation. Based on this principle, an optimal learning rate should be determined. An algebraic projection analysis (APA) is proposed that provides an analytical solution to both of these problems. The simulation results support the optimal hidden-unit count and learning rate theoretically predicted by the APA.
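The stability principle described above, that the learning rate must stay below a threshold beyond which iterative gradient descent diverges, can be illustrated with a minimal sketch. This is not the paper's APA derivation; it is a hypothetical one-parameter example using the quadratic loss L(w) = (a/2)·w², where the update w ← (1 − lr·a)·w converges exactly when lr < 2/a.

```python
# Hedged illustration (not the paper's APA method): why the learning rate
# in gradient-descent learning must stay below a stability threshold.
# For L(w) = (a/2) * w**2 the update is w <- (1 - lr * a) * w,
# which converges iff |1 - lr * a| < 1, i.e. lr < 2 / a.

def gradient_descent(lr, a=4.0, w0=1.0, steps=50):
    """Run gradient descent on L(w) = (a/2) w^2 and return final |w|."""
    w = w0
    for _ in range(steps):
        w -= lr * a * w  # gradient of L with respect to w is a * w
    return abs(w)

# Stability threshold here is 2 / a = 0.5.
print(gradient_descent(lr=0.1))  # below threshold: |w| shrinks toward 0
print(gradient_descent(lr=0.6))  # above threshold: |w| blows up
```

The same trade-off appears in full BP networks, where the admissible rate depends on the curvature of the error surface rather than a single constant a.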

Original language: English (US)
Pages (from-to): 547
Number of pages: 1
Journal: Neural Networks
Volume: 1
Issue number: 1 SUPPL
DOIs
State: Published - 1988
Event: International Neural Network Society 1988 First Annual Meeting - Boston, MA, USA
Duration: Sep 6 1988 – Sep 10 1988

All Science Journal Classification (ASJC) codes

  • Cognitive Neuroscience
  • Artificial Intelligence
