TY - JOUR
T1 - Recurrent Network Models of Sequence Generation and Memory
AU - Rajan, Kanaka
AU - Harvey, Christopher D.
AU - Tank, David W.
N1 - Funding Information:
The authors thank Larry Abbott for providing guidance and critiques throughout this project; Eftychios Pnevmatikakis and Liam Paninski for their deconvolution algorithm (Pnevmatikakis et al., 2016; Vogelstein et al., 2010); and Dmitriy Aronov, Bill Bialek, Selmaan Chettih, Cristina Domnisoru, Tim Hanks, and Matthias Minderer for comments. This work was supported by the NIH (D.W.T., R01-MH083686, RC1-NS068148, and 1U01NS090541-01; C.D.H., R01-MH107620 and R01-NS089521), a grant from the Simons Collaboration on the Global Brain (D.W.T.), a National Alliance for Research on Schizophrenia and Depression (NARSAD) Young Investigator Award from the Brain & Behavior Research Foundation (K.R.), a fellowship from the Helen Hay Whitney Foundation (C.D.H.), the New York Stem Cell Foundation (C.D.H.), and a Burroughs Wellcome Fund Career Award at the Scientific Interface (C.D.H.). C.D.H. is a New York Stem Cell Foundation-Robertson Investigator and a Searle Scholar.
Publisher Copyright:
© 2016 Elsevier Inc.
PY - 2016/4/6
Y1 - 2016/4/6
N2 - Sequential activation of neurons is a common feature of network activity during a variety of behaviors, including working memory and decision making. Previous network models for sequences and memory emphasized specialized architectures in which a principled mechanism is pre-wired into their connectivity. Here we demonstrate that, starting from random connectivity and modifying a small fraction of connections, a largely disordered recurrent network can produce sequences and implement working memory efficiently. We use this process, called Partial In-Network Training (PINning), to model and match cellular resolution imaging data from the posterior parietal cortex during a virtual memory-guided two-alternative forced-choice task. Analysis of the connectivity reveals that sequences propagate by the cooperation between recurrent synaptic interactions and external inputs, rather than through feedforward or asymmetric connections. Together our results suggest that neural sequences may emerge through learning from largely unstructured network architectures.
UR - http://www.scopus.com/inward/record.url?scp=84959861453&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84959861453&partnerID=8YFLogxK
U2 - 10.1016/j.neuron.2016.02.009
DO - 10.1016/j.neuron.2016.02.009
M3 - Article
C2 - 26971945
AN - SCOPUS:84959861453
SN - 0896-6273
VL - 90
SP - 128
EP - 142
JO - Neuron
JF - Neuron
IS - 1
ER -