Algebraic projection analysis for optimal hidden units size and learning rates in back-propagation learning

S. Y. Kung, J. N. Hwang

Research output: Contribution to conference › Paper

103 Scopus citations

Abstract

Two critical issues in back-propagation (BP) learning are the discrimination capability given a number of hidden units and the speed of convergence in learning. The number of hidden units must be sufficient to provide the discriminating capability required by the given application. On the other hand, training an excessively large number of synaptic weights may be computationally costly and unreliable. This makes it desirable to have an a priori estimate of the optimal number of hidden neurons. Another closely related issue is the learning rate of the BP rule. In general, it is desirable to have fast learning, but not so fast that it brings about instability of the iterative computation. An algebraic projection (AP) analysis method is proposed that provides an analytical solution to both of these problems. If the training patterns are completely irregular, then the predicted optimal number of hidden neurons is the same as the number of training patterns. For patterns with embedded regularity, the number of hidden neurons depends on the type of regularity inherent in the patterns. The optimal learning rate parameter is found to be inversely proportional to the number of hidden neurons.
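The abstract's two conclusions can be illustrated with a minimal sketch. This is not the AP derivation itself: the use of matrix rank as a proxy for the number of "irregular" (independent) patterns and the proportionality constant `c` are assumptions made for illustration only.

```python
import numpy as np

def estimate_hidden_units(patterns):
    """Heuristic echoing the AP result: for completely irregular
    training patterns the optimal hidden-layer size equals the
    number of patterns; regularity (modeled here, as a stand-in,
    by linear dependence) reduces that count."""
    X = np.asarray(patterns, dtype=float)
    # Rank of the pattern matrix counts the linearly independent
    # patterns -- an illustrative proxy, not the paper's analysis.
    return int(np.linalg.matrix_rank(X))

def learning_rate(num_hidden, c=1.0):
    """Learning rate inversely proportional to the number of
    hidden neurons; the constant c is an assumed placeholder."""
    return c / num_hidden

# Four completely irregular (linearly independent) patterns:
irregular = np.eye(4)
m = estimate_hidden_units(irregular)   # 4 hidden units
eta = learning_rate(m)                 # 0.25
```

Under these assumptions, doubling the number of hidden neurons halves the suggested learning rate, matching the inverse-proportionality claim.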

Original language: English (US)
Pages: 363-370
Number of pages: 8
DOIs
State: Published - Jan 1 1988

All Science Journal Classification (ASJC) codes

  • Engineering(all)

