Two critical issues in back-propagation (BP) learning are the discrimination capability given a number of hidden units and the speed of convergence in learning. The number of hidden units must be sufficient to provide the discriminating capability required by the given application; on the other hand, training an excessively large number of synaptic weights may be computationally costly and unreliable. This makes it desirable to have an a priori estimate of the optimal number of hidden neurons. A closely related issue is the learning rate of the BP rule: fast learning is desirable, but not so fast that it brings about instability of the iterative computation. An algebraic projection (AP) analysis method is proposed that provides an analytical solution to both problems. If the training patterns are completely irregular, the predicted optimal number of hidden neurons equals the number of training patterns. For regularity-embedded patterns, the number of hidden neurons depends on the type of regularity inherent in the patterns. The optimal learning-rate parameter is found to be inversely proportional to the number of hidden neurons.
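The two quantitative results stated in the abstract (a hidden-unit count equal to the number of completely irregular training patterns, and a learning rate inversely proportional to that count) can be sketched as a simple heuristic. This is only an illustrative sketch: the function name and the proportionality constant `base_rate` are assumptions, not taken from the paper.

```python
def suggest_bp_hyperparameters(training_patterns, base_rate=1.0):
    """Sketch of the AP-analysis predictions from the abstract.

    Assumptions (not from the paper): the patterns are treated as
    completely irregular, and `base_rate` is an illustrative
    proportionality constant for the learning rate.
    """
    # Irregular-pattern case: one hidden neuron per training pattern.
    n_hidden = len(training_patterns)
    # Optimal learning rate scales as 1 / (number of hidden neurons).
    learning_rate = base_rate / n_hidden
    return n_hidden, learning_rate

# Example: 8 irregular training patterns of dimension 3.
patterns = [[0.0, 1.0, 0.0]] * 8
n_hidden, eta = suggest_bp_hyperparameters(patterns)
# n_hidden == 8, eta == 0.125 (with the assumed base_rate of 1.0)
```

For regularity-embedded patterns the paper's analysis would predict a smaller hidden-layer size, so the irregular-pattern count above serves only as an upper-bound sketch.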
Original language: English (US)
Number of pages: 8
State: Published - Jan 1 1988