Representing part-whole relationships in recurrent neural networks

Viren Jain, Valentin Zhigulin, H. Sebastian Seung

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

There is little consensus about the computational function of top-down synaptic connections in the visual system. Here we explore the hypothesis that top-down connections, like bottom-up connections, reflect part-whole relationships. We analyze a recurrent network with bidirectional synaptic interactions between a layer of neurons representing parts and a layer of neurons representing wholes. Within each layer, there is lateral inhibition. When the network detects a whole, it can rigorously enforce part-whole relationships by ignoring parts that do not belong. The network can complete the whole by filling in missing parts. The network can refuse to recognize a whole if the activated parts do not conform to a stored part-whole relationship. Parameter regimes in which these behaviors happen are identified using the theory of permitted and forbidden sets [3, 4]. The network behaviors are illustrated by recreating Rumelhart and McClelland's "interactive activation" model [7].
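The following is a minimal sketch, not the authors' implementation, of the kind of network the abstract describes: threshold-linear rate units in a "parts" layer and a "wholes" layer, coupled by bidirectional excitation that stores part-whole relationships, with uniform lateral inhibition within each layer. The network size, weight values, and the stored memory are illustrative assumptions, not values from the paper.

```python
import numpy as np

relu = lambda x: np.maximum(x, 0.0)

n_parts, n_wholes = 6, 2

# Assumed part-whole memory: whole 0 owns parts {0, 1, 2}, whole 1 owns parts {3, 4, 5}.
M = np.zeros((n_wholes, n_parts))
M[0, [0, 1, 2]] = 1.0
M[1, [3, 4, 5]] = 1.0

w_bu, w_td = 0.5, 1.0       # bottom-up / top-down excitation (illustrative values)
beta_p, beta_w = 0.4, 1.5   # lateral inhibition within the part / whole layer

def simulate(part_input, steps=600, dt=0.1):
    """Euler-integrate dp/dt = -p + [input + top-down - inhibition]_+ (likewise for w)."""
    p = np.zeros(n_parts)   # part-layer firing rates
    w = np.zeros(n_wholes)  # whole-layer firing rates
    for _ in range(steps):
        p_drive = part_input + w_td * (M.T @ w) - beta_p * (p.sum() - p)
        w_drive = w_bu * (M @ p) - beta_w * (w.sum() - w)
        p = p + dt * (-p + relu(p_drive))
        w = w + dt * (-w + relu(w_drive))
    return p, w

# Present parts 0 and 1 of whole 0 plus a spurious part 4. With the parameters above,
# a rough fixed-point check suggests whole 0 becomes active, the missing part 2 is
# filled in by top-down feedback, and the non-member part 4 is suppressed by lateral
# inhibition, qualitatively matching the completion and enforcement behaviors that
# the abstract describes.
p, w = simulate(np.array([1.0, 1.0, 0.0, 0.0, 1.0, 0.0]))
print("parts :", np.round(p, 2))
print("wholes:", np.round(w, 2))
```

The parameter choice reflects the trade-off the abstract alludes to: top-down excitation must be strong enough to complete missing parts, yet weak enough (relative to lateral inhibition) that the joint part-whole activity pattern remains a stable, permitted set rather than a runaway forbidden one.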

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 18 - Proceedings of the 2005 Conference
Pages: 563-570
Number of pages: 8
State: Published - 2005
Externally published: Yes
Event: 2005 Annual Conference on Neural Information Processing Systems, NIPS 2005 - Vancouver, BC, Canada
Duration: Dec 5, 2005 – Dec 8, 2005

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Other

Other: 2005 Annual Conference on Neural Information Processing Systems, NIPS 2005
Country/Territory: Canada
City: Vancouver, BC
Period: 12/5/05 – 12/8/05

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
