A class of doubly-generalized low-density parity-check (D-GLDPC) codes, where single parity-check (SPC) codes are used as variable nodes (VNs), is investigated. An expression for the growth rate of the weight distribution of any D-GLDPC ensemble with a uniform check node (CN) set is first presented, together with an analytical technique for its efficient evaluation. These tools are then used for a detailed analysis of a case study, namely, a rate-1/2 D-GLDPC ensemble in which all the CNs are (7,4) Hamming codes and all the VNs are length-7 SPC codes. It is illustrated how the VN representations can heavily affect the code properties, and how different VN representations can be combined within the same graph to enhance some of the code parameters. The analysis is conducted over the binary erasure channel. Interesting features of the new codes include the capability of achieving a good compromise between waterfall and error-floor performance while preserving graphical regularity, and val...
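As a quick sanity check on the case-study parameters, the sketch below works out the design rate of a node-regular D-GLDPC ensemble under the usual bookkeeping (an assumption here, not spelled out in the abstract): each VN is a (q, k) code whose k information bits are D-GLDPC codeword bits and whose q encoded bits are Tanner-graph edges, and each CN is an (s, h) code contributing s - h parity constraints.

```python
# Minimal sketch: design rate of a node-regular D-GLDPC ensemble, assuming
# VNs are (q_v, k_v) codes (k_v codeword bits, q_v graph edges) and
# CNs are (s_c, h_c) codes (s_c graph edges, s_c - h_c parity constraints).

def dgldpc_design_rate(q_v, k_v, s_c, h_c):
    # Edge balance: n_v * q_v = n_c * s_c, so n_c / n_v = q_v / s_c.
    cn_per_vn = q_v / s_c
    # Codeword length per VN is k_v; constraints per CN are (s_c - h_c).
    return 1.0 - cn_per_vn * (s_c - h_c) / k_v

# Case study from the abstract: (7,4) Hamming CNs and length-7 SPC VNs,
# i.e. (7,6) SPC codes used as variable nodes.
print(dgldpc_design_rate(q_v=7, k_v=6, s_c=7, h_c=4))  # -> 0.5 (rate-1/2)
```

With (7,6) SPC VNs and (7,4) Hamming CNs, the edge balance forces one CN per VN, which gives the rate-1/2 ensemble described above.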
This paper calculates new bounds on the size of the performance gap between random codes and the best possible codes. The first result shows that, for large block sizes, the ratio of the error probability of a random code to the sphere-packing lower bound on the error probability of every code on the binary symmetric channel (BSC) is small for a wide range of useful crossover probabilities. Thus, even far from capacity, random codes have nearly the same error performance as the best possible long codes. The paper also demonstrates that a small reduction k - k̃ in the number of information bits conveyed by a codeword will make the error performance of an (n, k̃) random code better than the sphere-packing lower bound for an (n, k) code, as long as the channel crossover probability is somewhat greater than a critical probability. For example, the sphere-packing lower bound for a long (n, k), rate-1/2 code will exceed the error probability of an (n, k̃) random code if k - k̃ ≥ 10 and th...
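As a rough illustration of how such gaps are usually quantified (this is not the paper's bound), the sketch below compares Gallager's random-coding error exponent with the sphere-packing exponent for a BSC; the crossover probability p = 0.05 and rate R = 1/2 are arbitrary choices. At rates above the critical rate the two exponents coincide, which is consistent with the claim that long random codes are nearly as good as the best possible codes.

```python
# Illustrative sketch (not the paper's bounds): random-coding exponent E_r(R)
# versus sphere-packing exponent E_sp(R) for a BSC with crossover probability p.
# Exponents and rates are in bits per channel use.
import numpy as np

def e0(rho, p):
    # Gallager's E_0 function for the BSC with uniform inputs.
    s = p ** (1.0 / (1.0 + rho)) + (1.0 - p) ** (1.0 / (1.0 + rho))
    return rho - (1.0 + rho) * np.log2(s)

def random_coding_exponent(R, p, rhos=np.linspace(0.0, 1.0, 1001)):
    # Optimization restricted to 0 <= rho <= 1.
    return max(e0(rho, p) - rho * R for rho in rhos)

def sphere_packing_exponent(R, p, rhos=np.linspace(0.0, 50.0, 5001)):
    # E_sp optimizes over rho >= 0; the grid upper limit is a numerical cutoff.
    return max(e0(rho, p) - rho * R for rho in rhos)

p, R = 0.05, 0.5
# At this rate (above the critical rate for p = 0.05) the two values coincide.
print(random_coding_exponent(R, p), sphere_packing_exponent(R, p))
```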
When several independent channels are coupled by a parity-check constraint on their inputs, the mutual information between the input of one channel and the outputs of all other channels can be expressed as a combination of the mutual information between the input and the output of each individual channel. This concept is referred to as information combining. For binary-input symmetric discrete memoryless channels, we present bounds on the combined information that are based only on the mutual information of the individual channels. Furthermore, we show that these bounds cannot be improved further. Exact expressions are provided for the case in which all channels are binary symmetric channels and for the case in which all channels are binary erasure channels.
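The two exact cases mentioned at the end can be written out directly. A minimal sketch, assuming uniform inputs constrained by X_1 = X_2 ⊕ … ⊕ X_N, where the combined quantity is the mutual information between X_1 and the outputs Y_2, …, Y_N of the other channels:

```python
# Minimal sketch of the two exact cases: mutual information between X_1 and
# (Y_2, ..., Y_N) when X_1 = X_2 xor ... xor X_N and the other inputs are
# observed through independent channels. Inputs are assumed uniform.
import math

def h2(p):
    # Binary entropy function (in bits).
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def combined_info_bec(eps):
    # All-BEC case (eps = erasure probs of the other channels): X_1 is known
    # iff no other observation is erased, so the combined information is the
    # product of the individual mutual informations.
    return math.prod(1.0 - e for e in eps)

def combined_info_bsc(ps):
    # All-BSC case (ps = crossover probs of the other channels): the xor of the
    # hard decisions is wrong with probability (1 - prod(1 - 2 p_j)) / 2, and
    # the combined channel is again a BSC.
    q = (1.0 - math.prod(1.0 - 2.0 * p for p in ps)) / 2.0
    return 1.0 - h2(q)

print(combined_info_bec([0.1, 0.2]))   # (1 - 0.1) * (1 - 0.2) = 0.72
print(combined_info_bsc([0.05, 0.1]))  # 1 - h2(0.14)
```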
We present two sequences of ensembles of non-systematic irregular repeat-accumulate (IRA) codes which asymptotically (as their block length tends to infinity) achieve capacity on the binary erasure channel (BEC) with bounded complexity. This is in contrast to all previous constructions of capacity-achieving sequences of ensembles, whose complexity grows at least like the logarithm of the inverse of the gap to capacity. The new bounded-complexity result is achieved by allowing a sufficient number of state nodes in the Tanner graph representing the codes.
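To put the complexity comparison in perspective, a tiny numerical illustration (the leading constants are arbitrary assumptions): a per-bit complexity that grows like log(1/δ) in the multiplicative gap δ to capacity versus one that stays bounded as δ → 0.

```python
# Illustrative only: growth of a log(1/gap) complexity term versus a bounded one
# as the gap to capacity shrinks. The unit constants are arbitrary assumptions.
import math

for gap in (1e-1, 1e-2, 1e-3, 1e-4):
    log_style = math.log(1.0 / gap)  # grows without bound as gap -> 0
    bounded = 1.0                    # stays constant for bounded-complexity ensembles
    print(f"gap={gap:.0e}  log(1/gap)={log_style:5.2f}  bounded={bounded}")
```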
A method for the asymptotic analysis of doubly-generalized low-density parity-check (D-GLDPC) codes on the binary erasure channel (BEC) is described. The proposed method is based on extrinsic information transfer (EXIT) charts.
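For orientation, a minimal EXIT-chart sketch on the BEC using the ordinary repetition-VN / SPC-CN functions of a (d_v, d_c)-regular LDPC code; the generalized VN and CN EXIT functions developed in the paper would take the place of the two one-line functions below.

```python
# Minimal EXIT-chart sketch on the BEC for a (d_v, d_c)-regular LDPC code
# (repetition VNs, SPC CNs); the paper's generalized VN/CN EXIT functions
# would replace these two one-liners.
def vnd_exit(i_a, eps, d_v):
    # Extrinsic info of a degree-d_v repetition VN on a BEC with erasure prob eps.
    return 1.0 - eps * (1.0 - i_a) ** (d_v - 1)

def cnd_exit(i_a, d_c):
    # Extrinsic info of a degree-d_c single parity-check CN.
    return i_a ** (d_c - 1)

def converges(eps, d_v=3, d_c=6, iters=2000):
    # Decoding succeeds if the iterated extrinsic information is driven to 1,
    # i.e. the VN curve stays above the inverse CN curve on the EXIT chart.
    i = 0.0
    for _ in range(iters):
        i = vnd_exit(cnd_exit(i, d_c), eps, d_v)
    return i > 1.0 - 1e-6

print(converges(0.40))  # below the (3,6) BP threshold (~0.4294): True
print(converges(0.45))  # above it: False
```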
Consider communication over the binary erasure channel (BEC) using random low-density parity-check codes with finite blocklength n from 'standard' ensembles. We show that large error events are conveniently described within a scaling theory, and explain how to heuristically estimate their effect. Among other quantities, we consider the finite-length threshold ε(n), defined by requiring a block error probability P_B = 1/2. For ensembles with minimum variable degree larger than two, the following expression is argued to hold for ε(n) ...
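A hypothetical Monte Carlo sketch of the finite-length threshold definition: bisect on the channel erasure probability until the simulated block error rate under peeling decoding is roughly 1/2. The simple random ensemble used here (each variable node connects to d_v distinct random checks) is an illustrative stand-in for the 'standard' ensembles of the abstract, and the blocklength, trial count, and seed are arbitrary choices.

```python
# Hypothetical sketch: estimate a finite-length threshold eps(n) by bisecting on
# the erasure probability until the simulated block error rate is about 1/2.
import random

def block_errs(n, m, d_v, eps, trials, rng):
    errs = 0
    for _ in range(trials):
        # Random sparse graph: each variable -> d_v distinct checks.
        var_checks = [rng.sample(range(m), d_v) for _ in range(n)]
        erased = {v for v in range(n) if rng.random() < eps}
        check_vars = [set() for _ in range(m)]
        for v in erased:
            for c in var_checks[v]:
                check_vars[c].add(v)
        # Peeling: a check with exactly one erased neighbour resolves it.
        ready = [c for c in range(m) if len(check_vars[c]) == 1]
        while ready:
            c = ready.pop()
            if len(check_vars[c]) != 1:
                continue
            v = next(iter(check_vars[c]))
            erased.discard(v)
            for c2 in var_checks[v]:
                check_vars[c2].discard(v)
                if len(check_vars[c2]) == 1:
                    ready.append(c2)
        errs += bool(erased)  # block error if any erasure remains
    return errs / trials

def finite_length_threshold(n=512, d_v=3, d_c=6, trials=200, seed=0):
    rng = random.Random(seed)
    m = n * d_v // d_c
    lo, hi = 0.0, 1.0
    for _ in range(12):  # bisection on P_B(eps) = 1/2
        mid = (lo + hi) / 2
        if block_errs(n, m, d_v, mid, trials, rng) < 0.5:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Expected to land below the (3,6)-regular asymptotic threshold (~0.4294).
print(finite_length_threshold())
```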
This paper investigates decoding of binary linear block codes over the binary erasure channel (BEC). Of the current iterative decoding algorithms on this channel, we review the Recovery Algorithm and the Guess Algorithm. We then present a Multi-Guess Algorithm, extended from the Guess Algorithm, and a new algorithm, the In-place Algorithm. The Multi-Guess Algorithm can push the limit to ...
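A minimal sketch of the basic iterative erasure-recovery step that this family of algorithms builds on (not necessarily the paper's exact formulation of the Recovery Algorithm): repeatedly pick a parity check containing a single erasure and solve it by XOR. The Guess and Multi-Guess refinements come into play when this loop stalls on a stopping set; the sketch simply stops there.

```python
# Minimal sketch of iterative erasure recovery on the BEC: repeatedly find a
# parity check with exactly one erased position and solve it by XOR.
def recover(H, y):
    """H: list of parity-check rows (0/1 entries); y: received word, None = erasure."""
    x = list(y)
    progress = True
    while progress:
        progress = False
        for row in H:
            erased = [j for j, h in enumerate(row) if h and x[j] is None]
            if len(erased) == 1:
                j = erased[0]
                # The erased bit is the XOR of the known bits in this check.
                x[j] = sum(x[k] for k, h in enumerate(row) if h and k != j) % 2
                progress = True
    return x

# (7,4) Hamming parity-check matrix; all-zero codeword with two erasures.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
print(recover(H, [0, 0, None, 0, None, 0, 0]))  # -> [0, 0, 0, 0, 0, 0, 0]
```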
The paper introduces ensembles of accumulate-repeat-accumulate (ARA) codes which asymptotically achieve capacity on the binary erasure channel (BEC) with bounded complexity, per information bit, of encoding and decoding. It also introduces symmetry properties which play a central role in the construction of capacity-achieving ensembles for the BEC with bounded complexity. The results here improve on the tradeoff between performance and complexity provided by previous constructions of capacity-achieving ensembles of codes defined on graphs. The superiority of ARA codes with moderate to large block lengths is exemplified by computer simulations which compare their performance with that of previously reported capacity-achieving ensembles of LDPC and IRA codes. The ARA codes also have the advantage of being systematic.
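One way to see why bounded per-information-bit complexity is plausible for accumulate-based constructions: the accumulator is a rate-1 recursive filter costing one XOR per bit to apply and one XOR per bit to invert. A minimal sketch of that building block only (the full ARA construction combines two accumulators with repetition and puncturing stages not shown here):

```python
# Minimal sketch of the accumulator used in accumulate-repeat-accumulate codes:
# y_i = x_i xor y_{i-1}. Encoding and inverting each cost one XOR per bit, the
# kind of per-bit cost that stays bounded as the block length grows.
def accumulate(x):
    y, acc = [], 0
    for b in x:
        acc ^= b
        y.append(acc)
    return y

def de_accumulate(y):
    # Inverse: x_i = y_i xor y_{i-1}, with y_{-1} = 0.
    return [b ^ prev for b, prev in zip(y, [0] + y[:-1])]

x = [1, 0, 1, 1, 0, 0, 1]
assert de_accumulate(accumulate(x)) == x
print(accumulate(x))  # -> [1, 1, 0, 1, 1, 1, 0]
```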