Image Compression: Sankalp Kallakuri
Lecture 7
Sankalp Kallakuri
elsanky@gmail.com
Fundamentals
An M x N image coded with a variable-length code needs M*N*Lavg bits, which
may be fewer than the M*N*m bits required when every grey level is assigned
a fixed-length m-bit binary code.
CEACDABABCEABADACADABABADEACBADABCADAEE
Symbol   Frequency of appearance
A        15
B        7
C        6
D        6
E        5
Example Variable Length Code
Bit requirement for a fixed-length code
Symbol   Frequency   Code   Bits
A        15          000    15*3 = 45
B        7           001    7*3  = 21
C        6           010    6*3  = 18
D        6           011    6*3  = 18
E        5           100    5*3  = 15
Total: 117 bits
Example Variable Length Code
Huffman code
Symbol   Frequency   Code   Code length   Bits
A        15          0      1             15*1 = 15
B        7           100    3             7*3  = 21
C        6           101    3             6*3  = 18
D        6           110    3             6*3  = 18
E        5           111    3             5*3  = 15
Total: 87 bits
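The two tables above can be checked with a few lines of Python. This is a minimal sketch using the symbol frequencies and code lengths from the slides:

```python
# Bit requirements for the example message (A:15, B:7, C:6, D:6, E:5).
# Frequencies and code lengths are taken from the tables above.
freq = {"A": 15, "B": 7, "C": 6, "D": 6, "E": 5}
fixed_len = {s: 3 for s in freq}                     # 3-bit fixed-length code
huff_len = {"A": 1, "B": 3, "C": 3, "D": 3, "E": 3}  # Huffman code lengths

fixed_bits = sum(freq[s] * fixed_len[s] for s in freq)
huffman_bits = sum(freq[s] * huff_len[s] for s in freq)
print(fixed_bits, huffman_bits)  # 117 87
```

The variable-length code saves 30 bits on this 39-symbol message because the most frequent symbol, A, gets the shortest codeword.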
Two images can have the same histogram but different autocorrelation
functions.
Interpixel Dependency
• Objective criteria
  1) Root-mean-square error
  2) Mean-square signal-to-noise ratio
• Subjective criteria
  Rating scale {-3, -2, -1, 0, 1, 2, 3}, meaning
  {much worse, worse, slightly worse, the same, slightly better, better,
  much better}, or a quality scale
  {excellent, fine, passable, marginal, inferior, unusable}
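The two objective criteria can be sketched in a few lines. This is a minimal example on a toy 2x2 image pair; the array values are made up for illustration, with f the original and fhat the decompressed image:

```python
import numpy as np

# Toy original image f and decompressed image fhat (illustrative values).
f = np.array([[100.0, 102.0], [98.0, 101.0]])
fhat = np.array([[101.0, 101.0], [99.0, 100.0]])

err = fhat - f
rmse = np.sqrt(np.mean(err ** 2))              # root-mean-square error
snr_ms = np.sum(fhat ** 2) / np.sum(err ** 2)  # mean-square SNR
print(rmse, snr_ms)
```

Here every pixel is off by exactly one grey level, so the RMSE is 1.0; the mean-square SNR compares the energy of the reconstructed image to the energy of the error.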
Image Compression Models
f(x,y) -> Source Encoder -> Channel Encoder -> Channel ->
Channel Decoder -> Source Decoder -> F(x,y)
Source encoder
f(x,y) -> Mapper -> Quantizer -> Symbol Encoder -> channel
Source decoder
I – information, P – probability

I(a_j) = -log P(a_j),   with   sum_{j=1}^{J} P(a_j) = 1
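The self-information of each symbol, and the average information (entropy) of the source, can be computed directly. This is a minimal sketch using the six-symbol source probabilities from the Huffman example later in the lecture:

```python
import math

# Source probabilities (a2, a6, a1, a4, a3, a5 from the Huffman example).
P = [0.4, 0.3, 0.1, 0.1, 0.06, 0.04]
assert abs(sum(P) - 1.0) < 1e-9  # probabilities must sum to 1

info = [-math.log2(p) for p in P]              # I(a_j) in bits
entropy = sum(p * i for p, i in zip(P, info))  # average bits per symbol
print(round(entropy, 3))
```

Rare symbols carry more information: the least probable symbol (P = 0.04) contributes over 4.6 bits, while the most probable (P = 0.4) contributes about 1.3.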
Channel matrix
An image with 8 bits per pixel: a 4x8 image is one among 2^(8x4x8) possible output images.
Huffman Code
• First, order all the symbols by decreasing probability of occurrence.
• Merge the two symbols with the lowest probabilities into one composite
  symbol.
• Repeat this step until only two symbols are left.
Huffman Coding
Original Source          Source Reduction
Symbol   Probability     1      2      3      4
a2       0.4             0.4    0.4    0.4    0.6
a6       0.3             0.3    0.3    0.3    0.4
a1       0.1             0.1    0.2    0.3
a4       0.1             0.1    0.1
a3       0.06            0.1
a5       0.04
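The source reduction above can be sketched in code. This is a minimal Huffman coder built on a heap rather than the tabular reduction shown in the slides, using the same six probabilities; the merging order (and hence the exact codewords) may differ from the table, but any valid Huffman code for this source has the same average length:

```python
import heapq

def huffman(probs):
    # Heap entries: (probability, tiebreak index, {symbol: code-so-far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Pop the two least probable (composite) symbols and merge them,
        # prepending a bit to every codeword inside each one.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"a2": 0.4, "a6": 0.3, "a1": 0.1, "a4": 0.1, "a3": 0.06, "a5": 0.04}
codes = huffman(probs)
L_avg = sum(probs[s] * len(c) for s, c in codes.items())
print(round(L_avg, 2))  # 2.2 bits/symbol
```

The average length of 2.2 bits/symbol beats the 3 bits/symbol a fixed-length code would need for six symbols, and is close to the source entropy.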
• Truncated Huffman
• B2-Code
• Shift Code
• Binary Shift
• Huffman shift