Parallel independent channels where no encoding is allowed for one of the channels are studied. The Slepian-Wolf theorem on source coding of correlated sources is used to show that any information source whose entropy rate is below the sum of the capacity of the coded channel and the input/output mutual information of the uncoded channel is transmissible with arbitrary reliability. The converse is also shown. Thus, coding of the side-information channel is unnecessary when its mutual information is maximized by the source distribution. Applications to superposed coded/uncoded transmission on Gaussian channels are studied, and an information-theoretic interpretation of parallel-concatenated channel codes and, in particular, Turbo codes is put forth.
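The transmissibility condition stated above can be made concrete with a small numerical sketch. The snippet below (illustrative only; the function names and the choice of Gaussian inputs on AWGN channels are assumptions of this sketch, not the paper's notation) checks whether a source's entropy rate falls below the coded channel's capacity plus the uncoded channel's input/output mutual information:

```python
import math

def awgn_capacity(snr: float) -> float:
    # Shannon capacity of a real AWGN channel, in bits per channel use.
    return 0.5 * math.log2(1.0 + snr)

def transmissible(entropy_rate: float, snr_coded: float, snr_uncoded: float) -> bool:
    # Illustrative check of the condition: the source is reliably
    # transmissible iff its entropy rate is below the coded channel's
    # capacity plus the uncoded channel's mutual information.
    # Assumption for this sketch: both terms are evaluated for Gaussian
    # inputs on AWGN channels; in general the uncoded channel's mutual
    # information is the one induced by the source distribution itself.
    c_coded = awgn_capacity(snr_coded)
    i_uncoded = awgn_capacity(snr_uncoded)  # Gaussian-input assumption
    return entropy_rate < c_coded + i_uncoded

# A 1 bit/use source with both channels at 0 dB SNR (snr = 1.0):
# each term contributes 0.5 bit/use, so the strict inequality fails.
print(transmissible(1.0, 1.0, 1.0))  # False: 1.0 is not < 0.5 + 0.5
# Raising the coded channel's SNR to 3.0 (about 4.8 dB) gives 1.0 bit/use
# there, so the sum exceeds the entropy rate.
print(transmissible(1.0, 3.0, 1.0))  # True: 1.0 < 1.0 + 0.5
```

Note that when the uncoded channel's mutual information is already maximized by the source distribution, the second term equals that channel's capacity, which is the sense in which side-information coding becomes unnecessary.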