Abstract
Back-propagation (BP) is a gradient-descent method for searching for optimal parameter settings in a neural network model. The learning process can be split into three stages according to the behavior of the errors produced. Analysis of the error behavior in these stages reveals the existence of poorly trained patterns, which strongly influence the performance of the BP model. The benefit of accounting for the relative effectiveness of training patterns is investigated with two modified BP training procedures, both of which exploit information about the relative importance of individual patterns. The results show that learning speed and generalization ability are improved.
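To make the idea concrete, the sketch below shows standard back-propagation on a one-hidden-layer network in which each pattern's gradient contribution is scaled by a hypothetical "relative importance" weight derived from its current error, so that poorly trained patterns receive more emphasis. This is an illustrative assumption, not the authors' exact procedure; all names, the XOR data, and the weighting rule are assumptions for demonstration only.

```python
# Minimal sketch (not the paper's exact method): back-propagation with
# a hypothetical per-pattern importance weighting based on current error.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: XOR patterns (illustrative only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Network parameters: 2 inputs, 4 hidden units, 1 output.
W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros(1)

lr = 0.5
for epoch in range(5000):
    # Forward pass over all patterns.
    H = sigmoid(X @ W1 + b1)           # hidden activations
    Y = sigmoid(H @ W2 + b2)           # network outputs
    err = T - Y                        # per-pattern errors

    # Hypothetical relative-importance weights: patterns with larger
    # current error (poorly trained patterns) contribute more.
    imp = np.abs(err) / (np.abs(err).sum() + 1e-12)

    # Back-propagate the importance-weighted error signals.
    delta_out = imp * err * Y * (1 - Y)
    delta_hid = (delta_out @ W2.T) * H * (1 - H)

    # Gradient-descent parameter updates.
    W2 += lr * H.T @ delta_out
    b2 += lr * delta_out.sum(axis=0)
    W1 += lr * X.T @ delta_hid
    b1 += lr * delta_hid.sum(axis=0)

print("final outputs:", Y.ravel().round(3))
```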
| Original language | English (US) |
| --- | --- |
| Pages | 673-678 |
| Number of pages | 6 |
| State | Published - 1990 |
| Event | 1990 International Joint Conference on Neural Networks - IJCNN 90 - San Diego, CA, USA; Duration: Jun 17 1990 → Jun 21 1990 |
Other

| Other | 1990 International Joint Conference on Neural Networks - IJCNN 90 |
| --- | --- |
| City | San Diego, CA, USA |
| Period | 6/17/90 → 6/21/90 |
All Science Journal Classification (ASJC) codes
- General Engineering