Abstract
In this paper we describe a neural network model (APEX) for multiple principal component extraction. All the synaptic weights of the model are trained with the normalized Hebbian learning rule. The network structure features a hierarchical set of lateral connections among the output units which serve the purpose of weight orthogonalization. This structure also allows the size of the model to grow or shrink without retraining the old units. We formally prove the exponential convergence of the network and demonstrate significant performance improvement over previous methods. By establishing an important connection with the recursive least squares algorithm, we are able to provide the optimal value of the learning step-size parameter, which leads to a significant improvement in convergence speed; this is in contrast to previous neural PCA models, which lack such numerical advantages. The APEX algorithm is also parallelizable, allowing the concurrent extraction of multiple principal components. Furthermore, APEX is shown to be applicable to the constrained PCA problem, where the signal variance is maximized under external orthogonality constraints. We then study various principal component analysis (PCA) applications that might benefit from the adaptive solution offered by APEX. In particular, we discuss applications in spectral estimation, signal detection, and image compression and filtering; other application domains are also briefly outlined.
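The abstract does not reproduce the update equations, but a minimal sketch of APEX-style sequential extraction may help illustrate the idea. It assumes the normalized Hebbian feedforward rule and the anti-Hebbian lateral rule as they are usually stated for APEX in the PCA literature; the function name `apex_extract`, the fixed step size `beta`, and the epoch count are illustrative choices, not values from the paper (which instead derives an optimal, RLS-based step size).

```python
import numpy as np

def apex_extract(X, num_components, beta=0.002, epochs=10, seed=0):
    """Sequentially extract principal components with APEX-style updates.

    X is zero-mean data of shape (n_samples, n_features). Each new unit j
    is trained with a normalized Hebbian rule on its feedforward weights w,
    while anti-Hebbian lateral weights c decorrelate its output from the
    j-1 units trained before it; the older units are never retrained.
    """
    rng = np.random.default_rng(seed)
    _, d = X.shape
    W = np.zeros((num_components, d))      # rows: already-extracted components
    for j in range(num_components):
        w = rng.standard_normal(d)
        w /= np.linalg.norm(w)
        c = np.zeros(j)                    # lateral weights to units 0..j-1
        for _ in range(epochs):
            for x in X:
                y_prev = W[:j] @ x         # outputs of the trained units
                y = w @ x - c @ y_prev     # current unit's output
                w += beta * (y * x - y * y * w)        # normalized Hebbian
                c += beta * (y * y_prev - y * y * c)   # anti-Hebbian lateral
        W[j] = w / np.linalg.norm(w)
    return W

# Illustrative check: recover the top two axes of a known covariance.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))   # random orthonormal basis
stds = np.array([3.0, 2.0, 1.0, 0.5, 0.2])
X = (rng.standard_normal((4000, 5)) * stds) @ Q.T  # zero-mean, known PCs
W = apex_extract(X, num_components=2)
print(np.abs(W @ Q[:, :2]))   # close to the 2x2 identity if training converged
```

Note that the fixed `beta` stands in for one of the paper's main numerical contributions: the connection with recursive least squares yields an optimal, data-driven step size, which this sketch does not implement.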
Field | Value |
---|---|
Original language | English (US) |
Pages (from-to) | 1202-1217 |
Number of pages | 16 |
Journal | IEEE Transactions on Signal Processing |
Volume | 42 |
Issue number | 5 |
State | Published - May 1994 |
All Science Journal Classification (ASJC) codes
- Signal Processing
- Electrical and Electronic Engineering