LDPC
Ali Alsadi
Practical codes implemented in practice include Hamming, BCH, Reed-Solomon, convolutional, and Turbo codes; LDPC beats Turbo and convolutional codes.
G can be found by
Gaussian elimination.
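The elimination step can be sketched as follows (a minimal sketch, assuming the last n - k columns of H can be reduced to the identity by row operations alone; `generator_from_parity` is an illustrative name, not the course's code):

```python
import numpy as np

def generator_from_parity(H):
    """Derive a generator matrix G from a parity-check matrix H by
    Gaussian elimination over GF(2).  Assumes the last n - k columns
    of H can be brought to the identity by row operations alone."""
    H = H.copy() % 2
    m, n = H.shape
    k = n - m
    for i in range(m):
        col = k + i
        # Find a row with a 1 in the pivot column and swap it up.
        pivot = next(r for r in range(i, m) if H[r, col] == 1)
        H[[i, pivot]] = H[[pivot, i]]
        # Clear the rest of the column with mod-2 row additions.
        for r in range(m):
            if r != i and H[r, col] == 1:
                H[r] ^= H[i]
    P = H[:, :k]                                   # H is now [P | I_m]
    return np.hstack([np.eye(k, dtype=int), P.T])  # G = [I_k | P^T]
```

Row operations preserve the row space of H, so every row of the resulting G satisfies G H^T = 0 (mod 2) for the original H as well.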
Tanner Graph
This graph is called a Tanner graph. The parity-check matrix

H =
1 1 1 1 0 1 1 0 0 0
0 0 1 1 1 1 1 1 0 0
0 1 0 1 0 1 0 1 1 1
1 0 1 0 1 0 0 1 1 1
1 1 0 0 1 0 1 0 1 1

connects the check nodes A-E to the following variable nodes 1-10:
A: 1, 2, 3, 4, 6, 7
B: 3, 4, 5, 6, 7, 8
C: 2, 4, 6, 8, 9, 10
D: 1, 3, 5, 8, 9, 10
E: 1, 2, 5, 7, 9, 10
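The neighbor lists can be read off the rows of H directly (a sketch using the slide's labels A-E for check nodes and 1-10 for variable nodes; each row string below is written to match its neighbor list):

```python
# Rows of the parity-check matrix (check nodes A-E, variables 1-10);
# a check node's neighborhood is the set of columns holding a 1.
H_rows = {
    "A": "1111011000",
    "B": "0011111100",
    "C": "0101010111",
    "D": "1010100111",
    "E": "1100101011",
}

neighbors = {
    check: [j + 1 for j, bit in enumerate(row) if bit == "1"]
    for check, row in H_rows.items()
}

# Each variable node's degree: the number of checks it participates in.
var_degree = {v: sum(v in vs for vs in neighbors.values())
              for v in range(1, 11)}
```

Every check node here has degree 6 and every variable node degree 3, so this example is a (3, 6)-regular code.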
Tanner Graph Terminology
The number of operations at each node is proportional to the degree of that node, and the total work is proportional to the number of nodes and to the number of iterations.
If the number of iterations is fixed and the node degrees are fixed, the total number of operations is linear in the length of the code.
If we can decode this code simply by a sequence of messages passed from left to right and back again, then we can decode it with linear complexity, and that is the beauty of LDPC.
The complexity of the decoding algorithm is proportional to the number of edges in the Tanner graph and hence is linear in the block length of the code.
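To make the linear-complexity claim concrete, here is a minimal sketch of Gallager's hard-decision bit-flipping decoder (an assumed illustration, not the course's own code; each iteration traverses every edge of the Tanner graph a constant number of times):

```python
import numpy as np

def bit_flip_decode(H, y, max_iters=20):
    """Bit-flipping decoder sketch: each iteration touches every edge
    of the Tanner graph once per pass, so the cost per iteration is
    proportional to the number of edges -- linear in the block length
    when the node degrees are fixed."""
    x = y.copy()
    for _ in range(max_iters):
        syndrome = H @ x % 2       # one pass over all edges
        if not syndrome.any():
            return x               # all checks satisfied
        # Count, for each bit, how many unsatisfied checks touch it.
        unsat = syndrome @ H       # second pass over all edges
        # Flip the bits involved in the most unsatisfied checks.
        x = (x + (unsat == unsat.max())) % 2
    return x
```

With the (7, 4) Hamming parity-check matrix and a single flipped bit, this sketch converges back to the transmitted codeword in a few iterations.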
Information Theory and Coding
University of Arkansas at Little Rock 29
Tanner Graph Terminology
It is a ubiquitous error-correcting code that is popular and widely used.
It is called a Tanner graph after Michael Tanner, who rediscovered these codes, went further by adding some structure, and introduced this graphical representation of the decoding algorithm.
The binary symmetric channel (BSC):
Xt = (-1)^Ut, where Ut ∈ {0, 1}
Yt = Xt · Zt, where Zt ∈ {±1}, P(Zt = +1) = 1 - ε and P(Zt = -1) = ε
(Channel diagram: Xt = +1 → Yt = +1 and Xt = -1 → Yt = -1 with probability 1 - ε; each input is flipped with crossover probability ε.)
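The channel model above can be simulated directly (a sketch; ε = 0.1 is an assumed value, not one from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1                       # crossover probability (assumed value)

u = rng.integers(0, 2, size=100_000)             # U_t in {0, 1}
x = (-1) ** u                                    # X_t = (-1)^U_t
z = np.where(rng.random(u.size) < eps, -1, +1)   # P(Z_t = -1) = eps
y = x * z                                        # Y_t = X_t * Z_t

flips = np.mean(y != x)   # empirical crossover rate, close to eps
```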
Message Passing Terminology
ν^(0): O → M, the initial message map.
c^(l): M^(wc-1) → M, the message passed from a check node to a variable node at iteration l.
ν^(l): O × M^(wr-1) → M, the message passed from a variable node to a check node at iteration l.
Here O is the channel output alphabet and M the message alphabet.
(Figure: the channel input enters a variable node v, which exchanges messages m with its neighboring check nodes c; each outgoing message is computed from the messages on the node's other edges.)
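One concrete instance of these message maps is Gallager's Algorithm A for the BSC, with messages in {+1, -1} (a sketch; the function names are illustrative, not from the slides):

```python
from math import prod

def check_to_variable(incoming):
    """Check-to-variable map: the product (i.e. the mod-2 sum in the
    +/-1 domain) of the other incoming variable-to-check messages."""
    return prod(incoming)

def variable_to_check(channel, incoming):
    """Variable-to-check map: repeat the received channel value
    unless every other incoming check message votes against it."""
    if incoming and all(m == -channel for m in incoming):
        return -channel
    return channel
```

Each map consumes only the messages on a node's other edges, which is what makes the total work per iteration proportional to the number of edges.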
Conclusion
A regular parity-check matrix achieves a better code rate than the ordinary one and is easier to decode.
A regular code is a special case of an irregular code; however, irregular codes achieve better code rates than regular ones.
Gallager proved that the probability of decoding error approaches 0 with an increasing number of iterations for sufficiently small crossover probabilities.