Regularized variational Bayesian learning of echo state networks with delay&sum readout

Dmitriy Shutin, Christoph Zechner, Sanjeev R. Kulkarni, H. Vincent Poor

Research output: Contribution to journal › Letter › peer-review

13 Scopus citations


In this work, a variational Bayesian framework for efficient training of echo state networks (ESNs) with automatic regularization and delay&sum (D&S) readout adaptation is proposed. The algorithm builds on classical batch learning of ESNs. By treating the network echo states as fixed basis functions parameterized with delay parameters, we propose a variational Bayesian ESN training scheme. The variational approach allows for a seamless combination of sparse Bayesian learning ideas and a variational Bayesian space-alternating generalized expectation-maximization (VB-SAGE) algorithm for estimating parameters of superimposed signals. The former method realizes automatic regularization of ESNs, determining which echo states and input signals are relevant for "explaining" the desired signal; the latter provides a basis for joint estimation of the D&S readout parameters. The proposed training algorithm extends naturally to ESNs with fixed filter neurons and generalizes the recently proposed expectation-maximization-based D&S readout adaptation method. The proposed algorithm was tested on synthetic data prediction tasks as well as on dynamic handwritten character recognition.
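For context, the classical batch ESN training that the paper takes as its starting point can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it uses a fixed ridge penalty for the readout, whereas the paper replaces that fixed penalty with variational Bayesian automatic regularization and additionally adapts D&S readout delays. All sizes, seeds, and the toy prediction task are illustrative assumptions.

```python
import numpy as np

# Minimal classical batch ESN training with a ridge-regularized linear
# readout (the baseline the paper builds on). The echo states collected
# in X play the role of the fixed "basis functions" mentioned in the
# abstract; the paper parameterizes them further with per-neuron delays.

rng = np.random.default_rng(0)
n_res, n_in, T = 50, 1, 300          # reservoir size, input dim, samples

# Random reservoir weights, rescaled to spectral radius < 1 so the
# network satisfies the echo state property.
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(0.2 * np.arange(T + 1))[:, None]
d = u[1:]                            # desired output (next sample)

# Run the reservoir once over the input and collect the echo states.
x = np.zeros(n_res)
X = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])
    X[t] = x

# Discard the initial transient, then solve the batch ridge regression
# for the readout weights (fixed regularizer lam; the paper infers the
# per-weight regularization instead).
washout, lam = 50, 1e-6
X_tr, d_tr = X[washout:], d[washout:]
w_out = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(n_res), X_tr.T @ d_tr)

mse = np.mean((X_tr @ w_out - d_tr) ** 2)
```

The key structural point is that training reduces to a single linear least-squares problem in the readout weights once the echo states are collected, which is what makes Bayesian treatments of the readout (sparse priors over `w_out`, delay parameters per echo state) tractable.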

Original language: English (US)
Pages (from-to): 967-995
Number of pages: 29
Journal: Neural Computation
Issue number: 4
State: Published - 2012

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience

