How inhibitory oscillations can train neural networks and punish competitors

Kenneth A. Norman, Ehren Newman, Greg Detre, Sean Polyn

Research output: Contribution to journal › Article › peer-review

79 Scopus citations

Abstract

We present a new learning algorithm that leverages oscillations in the strength of neural inhibition to train neural networks. Raising inhibition can be used to identify weak parts of target memories, which are then strengthened. Conversely, lowering inhibition can be used to identify competitors, which are then weakened. To update weights, we apply the Contrastive Hebbian Learning equation to successive time steps of the network. The sign of the weight change equation varies as a function of the phase of the inhibitory oscillation. We show that the learning algorithm can memorize large numbers of correlated input patterns without collapsing and that it shows good generalization to test patterns that do not exactly match studied patterns.
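The core idea in the abstract can be illustrated with a small sketch. The network size, the logistic activation function, the oscillation shape, and the rule for mapping oscillation phase to the sign of the update are all assumptions made for illustration; this is not the paper's exact implementation, only a minimal Contrastive-Hebbian-style update applied across successive time steps while a global inhibition term oscillates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20                                # number of units (illustrative size)
W = 0.01 * rng.standard_normal((n, n))
W = (W + W.T) / 2.0                   # symmetric recurrent weights
np.fill_diagonal(W, 0.0)

target = (rng.random(n) < 0.5).astype(float)  # binary pattern to memorize

def activations(W, pattern, inhibition):
    """Activation under a given global inhibition level: recurrent net
    input minus inhibition, passed through a logistic squashing function
    (an assumed activation rule, not the paper's exact dynamics)."""
    net = W @ pattern - inhibition
    return 1.0 / (1.0 + np.exp(-5.0 * net))

lr = 0.05        # learning rate (assumed)
T = 40           # time steps per oscillation cycle (assumed)
for t in range(T):
    inhib_now  = 1.0 + 0.5 * np.sin(2 * np.pi * t / T)
    inhib_next = 1.0 + 0.5 * np.sin(2 * np.pi * (t + 1) / T)
    a_now  = activations(W, target, inhib_now)
    a_next = activations(W, target, inhib_next)
    # CHL applied to successive time steps: difference of outer products.
    # The sign of the update depends on the phase of the oscillation —
    # here, strengthening on the high-inhibition half of the cycle (weak
    # parts of the target are exposed) and weakening on the low-inhibition
    # half (competitors are exposed). The exact phase-to-sign mapping is
    # an assumption of this sketch.
    sign = 1.0 if inhib_now >= 1.0 else -1.0
    dW = sign * (np.outer(a_next, a_next) - np.outer(a_now, a_now))
    W += lr * dW
    np.fill_diagonal(W, 0.0)
```

Because each update is a difference of symmetric outer products, the weight matrix stays symmetric throughout training, which keeps the recurrent dynamics well behaved in this toy setting.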

Original language: English (US)
Pages (from-to): 1577-1610
Number of pages: 34
Journal: Neural Computation
Volume: 18
Issue number: 7
DOIs
State: Published - 2006

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience
