Abstract
A popular way to measure the degree of dependence between two random objects is by their mutual information, defined as the divergence between the joint and product-of-marginal distributions. We investigate an alternative measure of dependence: the lautum information, defined as the divergence between the product-of-marginal and joint distributions, i.e., with the arguments in the definition of mutual information swapped. Some operational characterizations and properties are provided for this alternative measure of information.
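For concreteness, the two definitions in the abstract can be written out as follows for discrete random variables (a standard rendering; the symbols $P_{XY}$, $P_X$, $P_Y$ for the joint and marginal distributions, and $L(X;Y)$ for the lautum information, are conventional notation assumed here rather than taken verbatim from the abstract):

```latex
% Mutual information: divergence from the joint to the product of marginals
I(X;Y) \;=\; D\bigl(P_{XY} \,\big\|\, P_X P_Y\bigr)
       \;=\; \sum_{x,y} P_{XY}(x,y)\,\log\frac{P_{XY}(x,y)}{P_X(x)\,P_Y(y)}

% Lautum information: the same divergence with its arguments swapped
L(X;Y) \;=\; D\bigl(P_X P_Y \,\big\|\, P_{XY}\bigr)
       \;=\; \sum_{x,y} P_X(x)\,P_Y(y)\,\log\frac{P_X(x)\,P_Y(y)}{P_{XY}(x,y)}
```

Both quantities are nonnegative and vanish exactly when $X$ and $Y$ are independent, but because relative entropy is not symmetric in its arguments, $L(X;Y)$ generally differs from $I(X;Y)$.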
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 964-975 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 54 |
| Issue number | 3 |
| DOIs | |
| State | Published - Mar 2008 |
All Science Journal Classification (ASJC) codes
- Information Systems
- Computer Science Applications
- Library and Information Sciences
Keywords
- Divergence
- Hypothesis testing
- Information measures
- Kelly gambling
- Mutual information