TY - JOUR
T1 - Mapping model units to visual neurons reveals population code for social behaviour
AU - Cowley, Benjamin R.
AU - Calhoun, Adam J.
AU - Rangarajan, Nivedita
AU - Ireland, Elise
AU - Turner, Maxwell H.
AU - Pillow, Jonathan W.
AU - Murthy, Mala
N1 - Publisher Copyright:
© The Author(s) 2024.
PY - 2024/5/30
Y1 - 2024/5/30
N2 - The rich variety of behaviours observed in animals arises through the interplay between sensory processing and motor control. To understand these sensorimotor transformations, it is useful to build models that predict not only neural responses to sensory input1–5 but also how each neuron causally contributes to behaviour6,7. Here we demonstrate a novel modelling approach to identify a one-to-one mapping between internal units in a deep neural network and real neurons by predicting the behavioural changes that arise from systematic perturbations of more than a dozen neuronal cell types. A key ingredient that we introduce is ‘knockout training’, which involves perturbing the network during training to match the perturbations of the real neurons during behavioural experiments. We apply this approach to model the sensorimotor transformations of Drosophila melanogaster males during a complex, visually guided social behaviour8–11. The visual projection neurons at the interface between the optic lobe and central brain form a set of discrete channels12, and prior work indicates that each channel encodes a specific visual feature to drive a particular behaviour13,14. Our model reaches a different conclusion: combinations of visual projection neurons, including those involved in non-social behaviours, drive male interactions with the female, forming a rich population code for behaviour. Overall, our framework consolidates behavioural effects elicited from various neural perturbations into a single, unified model, providing a map from stimulus to neuronal cell type to behaviour, and enabling future incorporation of wiring diagrams of the brain15 into the model.
UR - http://www.scopus.com/inward/record.url?scp=85193821405&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85193821405&partnerID=8YFLogxK
U2 - 10.1038/s41586-024-07451-8
DO - 10.1038/s41586-024-07451-8
M3 - Article
C2 - 38778103
AN - SCOPUS:85193821405
SN - 0028-0836
VL - 629
SP - 1100
EP - 1108
JO - Nature
JF - Nature
IS - 8014
ER -