Abstract
The number of possible activity patterns in a population of neurons grows exponentially with the size of the population. For populations of more than 10 or 20 neurons, typical experiments therefore explore only a tiny fraction of this space, so in this undersampled regime it is impossible to estimate the probabilities with which most of the activity patterns occur. As a result, the corresponding entropy, which is a measure of the computational power of the neural population, cannot be estimated directly. We propose a simple scheme for estimating the entropy in the undersampled regime that bounds its value from both below and above. The lower bound is the usual 'naive' entropy of the experimental frequencies. The upper bound results from a hybrid approximation of the entropy that makes use of the naive estimate, a maximum entropy fit, and a coverage adjustment. We apply our simple scheme to artificial data in order to check its accuracy, and compare its performance to that of several previously defined entropy estimators. We then apply it to actual measurements of neural activity in populations with up to 100 cells. Finally, we discuss the similarities and differences between the proposed simple estimation scheme and various earlier methods.
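As a rough illustration of the lower bound mentioned above, the naive (plug-in) entropy can be computed directly from the empirical pattern frequencies. The Python sketch below does only that, on made-up binarized spike data; the function and variable names are illustrative, and the paper's upper bound (which combines the naive estimate with a maximum entropy fit and a coverage adjustment) is not reproduced here.

```python
import numpy as np
from collections import Counter

def naive_entropy(patterns):
    """Plug-in ('naive') entropy of the empirical pattern frequencies, in bits.

    In the undersampled regime, patterns never observed contribute nothing,
    so this estimate systematically underestimates the true entropy and
    serves as a lower bound.
    """
    counts = np.array(list(Counter(patterns).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Hypothetical usage: binarized activity of 20 neurons over 5000 time bins.
rng = np.random.default_rng(0)
spikes = rng.random((5000, 20)) < 0.05       # sparse firing, ~5% per bin
patterns = [tuple(row) for row in spikes]    # one activity pattern per bin
print(f"naive entropy (lower bound): {naive_entropy(patterns):.2f} bits")
```

With 2^20 possible patterns and only 5000 samples, most patterns are never seen, which is exactly the undersampling problem the paper addresses.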
| Original language | English (US) |
| --- | --- |
| Article number | P03015 |
| Journal | Journal of Statistical Mechanics: Theory and Experiment |
| Volume | 2013 |
| Issue number | 3 |
| DOIs | |
| State | Published - Mar 2013 |
All Science Journal Classification (ASJC) codes
- Statistical and Nonlinear Physics
- Statistics and Probability
- Statistics, Probability and Uncertainty
Keywords
- computational biology
- neural code
- neuronal networks (theory)