TY - JOUR
T1 - Neural layered min-sum decoding for protograph LDPC codes
AU - Zhang, Dexin
AU - Dai, Jincheng
AU - Tan, Kailin
AU - Niu, Kai
AU - Chen, Mingzhe
AU - Poor, H. Vincent
AU - Cui, Shuguang
N1 - Funding Information:
This work was supported in part by the National Natural Science Foundation of China under Grants 92067202 and 62001049, in part by the China Post-Doctoral Science Foundation under Grant 2019M660032, in part by the U.S. National Science Foundation under Grant CCF-1908308, in part by the National Key R&D Program of China under Grant 2018YFB1800800, in part by the Key Area R&D Program of Guangdong Province under Grant 2018B030338001, in part by the Shenzhen Outstanding Talents Training Fund, and in part by Guangdong Research Project under Grant 2017ZT07X152. (Jincheng Dai and Dexin Zhang are co-first authors.)
Publisher Copyright:
© 2021 IEEE
PY - 2021
Y1 - 2021
N2 - In this paper, layered min-sum (MS) iterative decoding is formulated as a customized neural network that follows the sequential scheduling of check node (CN) updates. By virtue of the lifting structure of protograph low-density parity-check (LDPC) codes, identical network parameters are shared among all derived edges originating from the same edge in the protograph, which keeps the number of learnable parameters manageable. Consequently, the proposed neural layered MS decoder can support arbitrary code lengths. Moreover, an iteration-wise greedy training method is proposed to tune the parameters, avoiding the vanishing gradient problem and accelerating decoding convergence.
AB - In this paper, layered min-sum (MS) iterative decoding is formulated as a customized neural network that follows the sequential scheduling of check node (CN) updates. By virtue of the lifting structure of protograph low-density parity-check (LDPC) codes, identical network parameters are shared among all derived edges originating from the same edge in the protograph, which keeps the number of learnable parameters manageable. Consequently, the proposed neural layered MS decoder can support arbitrary code lengths. Moreover, an iteration-wise greedy training method is proposed to tune the parameters, avoiding the vanishing gradient problem and accelerating decoding convergence.
KW - Layered decoding
KW - Min-sum (MS)
KW - Neural network
KW - Protograph LDPC codes
UR - http://www.scopus.com/inward/record.url?scp=85115094002&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85115094002&partnerID=8YFLogxK
U2 - 10.1109/ICASSP39728.2021.9414543
DO - 10.1109/ICASSP39728.2021.9414543
M3 - Conference article
AN - SCOPUS:85115094002
SN - 1520-6149
VL - 2021-June
SP - 4845
EP - 4849
JO - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
JF - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
T2 - 2021 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2021
Y2 - 6 June 2021 through 11 June 2021
ER -