Relative effectiveness of training set patterns for back propagation

Raymond K.M. Cheung, Irving Lustig, Alain L. Kornhauser

Research output: Contribution to conference › Paper › peer-review

17 Scopus citations

Abstract

Back-propagation (BP) is a gradient-descent method for finding optimal parameter settings in a neural network model. The learning process can be split into three stages according to the behavior of the errors produced. Analysis of the error behavior in these stages reveals the existence of poorly trained patterns that strongly influence the performance of the BP model. The benefit of accounting for the relative effectiveness of training patterns is investigated using two modified BP training procedures, both based on information about the relative importance of individual patterns. The results show that both learning speed and generalization ability are improved.
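The abstract does not specify the exact modified procedures. As a minimal sketch of the general idea, the following assumes one plausible rule: present poorly trained patterns more often by sampling each pattern with probability proportional to its current error. The network size, learning rate, sampling rule, and all names are illustrative assumptions, not the authors' method.

```python
# Sketch: error-weighted pattern presentation in back-propagation.
# Assumption (not from the paper): patterns are sampled with probability
# proportional to their current squared error, so poorly trained patterns
# are trained on more frequently.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny 2-4-1 network on XOR, trained one pattern at a time.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros(1)
lr = 0.5

def forward(x):
    h = sigmoid(x @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    return h, out

def pattern_errors():
    # Squared error of each training pattern under the current weights.
    _, out = forward(X)
    return ((out - y) ** 2).sum(axis=1)

for epoch in range(20000):
    # Sample the next pattern with probability proportional to its error.
    err = pattern_errors()
    p = err / err.sum() if err.sum() > 0 else np.full(len(X), 1 / len(X))
    i = rng.choice(len(X), p=p)

    # Standard back-propagation step on the chosen pattern.
    x, t = X[i], y[i]
    h, out = forward(x)
    d_out = (out - t) * out * (1 - out)   # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)    # hidden-layer delta
    W2 -= lr * np.outer(h, d_out)
    b2 -= lr * d_out
    W1 -= lr * np.outer(x, d_h)
    b1 -= lr * d_h

print(np.round(forward(X)[1], 3))  # outputs should approach [0, 1, 1, 0]
```

Compared with uniform pattern presentation, this kind of error-proportional scheme spends more updates on the patterns the network currently handles worst, which is one way the relative importance of training patterns can speed up learning.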

Original language: English (US)
Pages: 673-678
Number of pages: 6
State: Published - 1990
Event: 1990 International Joint Conference on Neural Networks - IJCNN 90, San Diego, CA, USA
Duration: Jun 17, 1990 - Jun 21, 1990

Other

Other: 1990 International Joint Conference on Neural Networks - IJCNN 90
City: San Diego, CA, USA
Period: 6/17/90 - 6/21/90

All Science Journal Classification (ASJC) codes

  • General Engineering

