Equivalence of backpropagation and contrastive Hebbian learning in a layered network

Xiaohui Xie, H. Sebastian Seung

Research output: Contribution to journal › Article › peer-review

90 Scopus citations


Backpropagation and contrastive Hebbian learning are two methods of training networks with hidden neurons. Backpropagation computes an error signal for the output neurons and spreads it over the hidden neurons. Contrastive Hebbian learning involves clamping the output neurons at desired values and letting the effect spread through feedback connections over the entire network. To investigate the relationship between these two forms of learning, we consider a special case in which they are identical: a multilayer perceptron with linear output units, to which weak feedback connections have been added. In this case, the change in network state caused by clamping the output neurons turns out to be the same as the error signal spread by backpropagation, except for a scalar prefactor. This suggests that the functionality of backpropagation can be realized alternatively by a Hebbian-type learning algorithm, which is suitable for implementation in biological networks.
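The correspondence described in the abstract can be illustrated with a minimal sketch of contrastive Hebbian learning in a one-hidden-layer network with linear output units and weak feedback of strength gamma. All variable names, network sizes, and constants below are illustrative assumptions, not the paper's exact formulation; the update rule follows the generic CHL form (clamped minus free Hebbian correlations), with a 1/gamma prefactor on the deeper layer echoing the scalar-prefactor correspondence the paper derives.

```python
import numpy as np

# Hypothetical CHL sketch: sizes, constants, and settling dynamics are
# illustrative assumptions, not the paper's exact equations.
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 5, 2
W1 = rng.normal(scale=0.1, size=(n_hid, n_in))   # input -> hidden
W2 = rng.normal(scale=0.1, size=(n_out, n_hid))  # hidden -> output
gamma = 0.05   # weak feedback strength
eta = 0.1      # learning rate

def settle(x, y_clamp=None, n_steps=50):
    """Relax the network to a fixed point; clamp outputs if y_clamp is given."""
    h = np.zeros(n_hid)
    y = np.zeros(n_out) if y_clamp is None else y_clamp
    for _ in range(n_steps):
        h = np.tanh(W1 @ x + gamma * (W2.T @ y))  # feedback scaled by gamma
        if y_clamp is None:
            y = W2 @ h                            # linear output units
    return h, y

x = rng.normal(size=n_in)
target = np.array([1.0, -1.0])

# Free phase: output neurons evolve freely.
h_free, y_free = settle(x)
# Clamped phase: output neurons held at the desired values.
h_clamp, y_hat = settle(x, y_clamp=target)

# CHL update: clamped-phase minus free-phase Hebbian correlations.
# The 1/gamma factor compensates for the weak feedback, so the hidden-layer
# state change plays the role of the backpropagated error signal.
W2 += eta * (np.outer(y_hat, h_clamp) - np.outer(y_free, h_free))
W1 += (eta / gamma) * (np.outer(h_clamp, x) - np.outer(h_free, x))
```

In this regime the difference `h_clamp - h_free` is small (it scales with gamma), which is what makes it proportional to the error signal that backpropagation would compute for the hidden layer.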

Original language: English (US)
Pages (from-to): 441-454
Number of pages: 14
Journal: Neural Computation
Issue number: 2
State: Published - Feb 2003
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience

