TY - JOUR
T1 - Finite-Dimensional Bounds on ℤm and Binary LDPC Codes With Belief Propagation Decoders
AU - Wang, Chih-Chun
AU - Kulkarni, Sanjeev R.
AU - Poor, H. Vincent
N1 - Funding Information:
Manuscript received March 11, 2005; revised September 25, 2006. This work was supported by the Army Research Laboratory under Contract DAAD19-01-2-0011, by the Army Research Office under Contract DAAD19-00-1-0466, and by the National Science Foundation under Grant CCR-0312413. The material in this paper was presented in part at the IEEE International Symposium on Information Theory and Its Applications, Parma, Italy, October 2004.
Copyright:
Copyright 2008 Elsevier B.V., All rights reserved.
PY - 2007/1
Y1 - 2007/1
N2 - This paper focuses on finite-dimensional upper and lower bounds on decodable thresholds of ℤm and binary low-density parity-check (LDPC) codes, assuming belief propagation decoding on memoryless channels. A concrete framework is presented, admitting systematic searches for new bounds. Two noise measures are considered: the Bhattacharyya noise parameter and the soft bit value for a maximum a posteriori probability (MAP) decoder on the uncoded channel. For ℤm LDPC codes, an iterative m-dimensional bound is derived for m-ary-input/symmetric-output channels, which gives a sufficient stability condition for ℤm LDPC codes and is complemented by a matched necessary stability condition introduced herein. Applications to coded modulation and to codes with nonequiprobably distributed codewords are also discussed. For binary codes, two new lower bounds are provided for symmetric channels, including a two-dimensional iterative bound and a one-dimensional noniterative bound, the latter of which is the best known bound that is tight for binary-symmetric channels (BSCs) and is a strict improvement over the existing bound derived by the channel degradation argument. By adopting the reverse channel perspective, upper and lower bounds on the decodable Bhattacharyya noise parameter are derived for nonsymmetric channels, which coincide with the existing bound for symmetric channels.
AB - This paper focuses on finite-dimensional upper and lower bounds on decodable thresholds of ℤm and binary low-density parity-check (LDPC) codes, assuming belief propagation decoding on memoryless channels. A concrete framework is presented, admitting systematic searches for new bounds. Two noise measures are considered: the Bhattacharyya noise parameter and the soft bit value for a maximum a posteriori probability (MAP) decoder on the uncoded channel. For ℤm LDPC codes, an iterative m-dimensional bound is derived for m-ary-input/symmetric-output channels, which gives a sufficient stability condition for ℤm LDPC codes and is complemented by a matched necessary stability condition introduced herein. Applications to coded modulation and to codes with nonequiprobably distributed codewords are also discussed. For binary codes, two new lower bounds are provided for symmetric channels, including a two-dimensional iterative bound and a one-dimensional noniterative bound, the latter of which is the best known bound that is tight for binary-symmetric channels (BSCs) and is a strict improvement over the existing bound derived by the channel degradation argument. By adopting the reverse channel perspective, upper and lower bounds on the decodable Bhattacharyya noise parameter are derived for nonsymmetric channels, which coincide with the existing bound for symmetric channels.
KW - Belief propagation (BP) algorithm
KW - Bhattacharyya noise parameter
KW - Information combining
KW - Iterative decoding
KW - Low-density parity-check (LDPC) codes
KW - Memoryless channels
KW - Nonsymmetric channels
KW - ℤm alphabet
UR - http://www.scopus.com/inward/record.url?scp=33846075628&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=33846075628&partnerID=8YFLogxK
U2 - 10.1109/TIT.2006.887085
DO - 10.1109/TIT.2006.887085
M3 - Article
AN - SCOPUS:33846075628
SN - 0018-9448
VL - 53
SP - 56
EP - 81
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 1
ER -