Abstract
A key issue in understanding the neural code for an ensemble of neurons is the nature and strength of correlations between neurons and how these correlations are related to the stimulus. The issue is complicated by the fact that there is not a single notion of independence or lack of correlation. We distinguish three kinds: (1) activity independence; (2) conditional independence; and (3) information independence. Each notion is related to an information measure: the mutual information between cells, the mutual information between cells given the stimulus, and the synergy of cells about the stimulus, respectively. We show that these measures form an interrelated framework for evaluating the contributions of signal and noise correlations to the joint information conveyed about the stimulus and that at least two of the three measures must be calculated to characterize a population code. This framework is compared with others recently proposed in the literature. In addition, we distinguish questions about how information is encoded by a population of neurons from questions about how that information can be decoded. Although information theory is natural and powerful for questions of encoding, it is not sufficient for characterizing the process of decoding. Decoding fundamentally requires an error measure that quantifies the importance of the deviations of estimated stimuli from actual stimuli. Because there is no a priori choice of error measure, questions about decoding cannot be put on the same level of generality as questions about encoding.
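To make the three measures concrete, the sketch below works through a toy example (ours, not from the paper), assuming a discrete stimulus S and two binary responses R1 and R2 described by a hypothetical joint distribution p(s, r1, r2). It computes I(R1;R2), I(R1;R2|S), and the synergy Syn = I(R1,R2;S) - I(R1;S) - I(R2;S), then checks the standard identity Syn = I(R1;R2|S) - I(R1;R2) that links the three. The helper names `entropy` and `mutual_info` are illustrative, not from the paper.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability array of any shape."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_info(pxy):
    """I(X;Y) in bits from a 2-D joint distribution p(x, y)."""
    return entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0)) - entropy(pxy)

# Hypothetical joint distribution p(s, r1, r2):
# axis 0 = stimulus S (2 values), axes 1, 2 = binary responses R1, R2.
p = np.array([[[0.20, 0.05],
               [0.05, 0.20]],
              [[0.05, 0.20],
               [0.20, 0.05]]])

# (1) Activity independence -- I(R1;R2) from the marginal p(r1, r2).
I_r1_r2 = mutual_info(p.sum(axis=0))

# (2) Conditional independence -- I(R1;R2|S), the stimulus-averaged
#     mutual information between cells given each stimulus.
p_s = p.sum(axis=(1, 2))
I_r1_r2_given_s = sum(p_s[s] * mutual_info(p[s] / p_s[s])
                      for s in range(len(p_s)))

# (3) Information independence -- synergy of the pair about the stimulus.
I_joint = mutual_info(p.reshape(len(p_s), -1))   # I(R1,R2; S)
I_r1_s = mutual_info(p.sum(axis=2))              # I(R1; S)
I_r2_s = mutual_info(p.sum(axis=1))              # I(R2; S)
synergy = I_joint - I_r1_s - I_r2_s

# The three measures are linked: Syn = I(R1;R2|S) - I(R1;R2).
assert np.isclose(synergy, I_r1_r2_given_s - I_r1_r2)
print(f"I(R1;R2)   = {I_r1_r2:.3f} bits")
print(f"I(R1;R2|S) = {I_r1_r2_given_s:.3f} bits")
print(f"synergy    = {synergy:.3f} bits")
```

In this toy distribution each cell alone conveys no information about the stimulus (I(R1;S) = I(R2;S) = 0), yet the pair conveys about 0.28 bits, so the code is purely synergistic; the same machinery applied to a different p can report redundancy (negative synergy) instead, which is why at least two of the three measures are needed to characterize a population code.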
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 11539-11553 |
| Number of pages | 15 |
| Journal | Journal of Neuroscience |
| Volume | 23 |
| Issue number | 37 |
| State | Published - Dec 17 2003 |
All Science Journal Classification (ASJC) codes
- General Neuroscience
Keywords
- Decoding
- Encoding
- Information theory
- Neural code
- Noise correlation
- Signal correlation