Fast computation with spikes in a recurrent neural network

Dezhe Z. Jin, H. Sebastian Seung

Research output: Contribution to journal › Article › peer-review

26 Scopus citations

Abstract

Neural networks with recurrent connections are sometimes regarded as too slow at computation to serve as models of the brain. Here we analytically study a counterexample: a network of N integrate-and-fire neurons with self-excitation, all-to-all inhibition, instantaneous synaptic coupling, and constant external driving inputs. When the inhibition and/or excitation are strong enough, the network performs a winner-take-all computation for all possible external inputs and initial states. The computation is very fast: as soon as the winner spikes once, the computation is complete, since no other neuron will spike thereafter. For some initial states, the winner is the first neuron to spike, so the computation finishes at the network's first spike. In general there are M potential winners, corresponding to the top M external inputs; when the external inputs are close in magnitude, M tends to be larger. If M > 1, the selection of the actual winner depends strongly on the initial state. If a special relation between the excitation and inhibition is satisfied, the network always selects the neuron with the maximum external input as the winner.
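The dynamics described in the abstract are easy to sketch in simulation. The Python snippet below is a minimal illustration, not the paper's exact formulation: the parameter values (alpha, beta, the threshold theta, the time step, and the inputs) are illustrative assumptions. Each neuron leakily integrates its constant input; a spike resets the spiking neuron and instantaneously adds a self-excitatory kick alpha to it while subtracting an inhibitory kick beta from every other neuron. With alpha and beta large enough, a single neuron ends up spiking alone, as the abstract describes.

    import numpy as np

    # Sketch of a winner-take-all network of N integrate-and-fire neurons
    # with self-excitation and all-to-all inhibition delivered as
    # instantaneous kicks at spike times. Parameter values are illustrative
    # assumptions, not taken from the paper.

    rng = np.random.default_rng(0)

    N = 10         # number of neurons
    theta = 1.0    # spike threshold
    alpha = 0.6    # self-excitation: kick to the spiking neuron after reset
    beta = 0.8     # inhibition: kick subtracted from all other neurons
    dt = 1e-3      # Euler time step
    T = 5.0        # total simulated time

    I = rng.uniform(1.1, 1.5, size=N)   # constant suprathreshold inputs
    v = rng.uniform(0.0, 0.9, size=N)   # random initial membrane potentials

    spikes = []
    for step in range(int(T / dt)):
        v += dt * (I - v)               # leaky integration toward I
        for i in np.flatnonzero(v >= theta):
            if v[i] < theta:
                continue                # inhibited by an earlier spike this step
            spikes.append((step * dt, i))
            v[i] = alpha                # reset to 0, then self-kick of alpha
            v[np.arange(N) != i] -= beta  # instantaneous all-to-all inhibition

    late = {i for t, i in spikes if t > 0.8 * T}
    print("largest input:", int(np.argmax(I)))
    print("neurons still spiking late in the run:", sorted(late))

In this sketch the neurons still firing at the end of the run constitute the winner; in the regime analyzed in the paper the winner is already fixed after its very first spike, and the discrete Euler stepping here only approximates the instantaneous coupling.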

Original language: English (US)
Article number: 051922
Pages (from-to): 051922/1-051922/4
Journal: Physical Review E - Statistical, Nonlinear, and Soft Matter Physics
Volume: 65
Issue number: 5
DOIs
State: Published - May 2002
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Condensed Matter Physics
  • Statistical and Nonlinear Physics
  • Statistics and Probability
