Abstract
Entropy and information provide natural measures of correlation among elements in a network. We construct here the information theoretic analog of connected correlation functions: irreducible N-point correlation is measured by a decrease in entropy for the joint distribution of N variables relative to the maximum entropy allowed by all the observed N-1 variable distributions. We calculate the "connected information" terms for several examples and show that it also enables the decomposition of the information that is carried by a population of elements about an outside source.
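The construction described in the abstract, a connected information I_C(N) = S[P̃^(N-1)] − S[P^(N)], can be illustrated numerically. The following is a minimal sketch, not code from the paper: it estimates the maximum-entropy distribution consistent with all pairwise marginals by iterative proportional fitting (an assumed method here) and evaluates the third-order connected information for three binary variables with a parity constraint. Function names such as `maxent_from_pairwise` are illustrative, not from the source.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy (in bits) of a joint probability array."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def maxent_from_pairwise(p3, n_iter=500):
    """Maximum-entropy 2x2x2 distribution matching all pairwise marginals of p3,
    found by iterative proportional fitting (IPF)."""
    q = np.full_like(p3, 1.0 / p3.size)              # start from the uniform distribution
    for _ in range(n_iter):
        for drop_axis in (2, 1, 0):                  # enforce (x,y), (x,z), (y,z) marginals in turn
            target = p3.sum(axis=drop_axis, keepdims=True)
            current = q.sum(axis=drop_axis, keepdims=True)
            ratio = np.divide(target, current, out=np.zeros_like(target), where=current > 0)
            q = q * ratio
    return q

# Example: three binary variables with a parity (XOR) constraint. All pairwise
# marginals are independent, so the correlation structure is purely third order.
p3 = np.zeros((2, 2, 2))
for x in range(2):
    for y in range(2):
        p3[x, y, (x + y) % 2] = 0.25                 # the third variable is the parity of the first two

p2_tilde = maxent_from_pairwise(p3)                  # max-entropy model with pairwise constraints
I_C3 = entropy_bits(p2_tilde) - entropy_bits(p3)     # connected information of order 3
print(f"I_C(3) = {I_C3:.3f} bits")                   # -> 1.000 bits for the parity example
```

In this sketch the pairwise maximum-entropy model is uniform (3 bits), while the parity-constrained joint distribution has 2 bits of entropy, so the entire 1 bit of structure is assigned to the third-order connected information.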
Original language | English (US)
--- | ---
Journal | Physical review letters
Volume | 91
Issue number | 23
DOIs |
State | Published - 2003
All Science Journal Classification (ASJC) codes
- General Physics and Astronomy