Entropy
H = ∑ p_i · log2(1 / p_i)

i = 2: p1 = 1, p2 = 0 ⇒ H = 0
i = 2: p1 = 1/2, p2 = 1/2 ⇒ H = 1 bit/symbol
i = 4: p1 = p2 = p3 = p4 = 1/4 ⇒ H = 2 bits/symbol
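The three cases above can be checked directly (a minimal sketch; zero-probability symbols contribute nothing and are skipped):

```python
import math

def entropy(probs):
    """Shannon entropy in bits/symbol: H = sum(p * log2(1/p)), skipping p = 0."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([1.0, 0.0]))   # degenerate source
print(entropy([0.5, 0.5]))   # fair binary source
print(entropy([0.25] * 4))   # four equiprobable symbols
```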
Huffman Coding
Five steps
1. Find the gray-level probabilities for the image by computing its histogram
2. Order the input probabilities (histogram magnitudes) from smallest to largest
3. Combine the smallest two by addition
4. Go to step 2 until only two probabilities are left
5. Working backward along the tree, generate the code by alternating assignment of 0 and 1
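The five steps above can be sketched with a priority queue (a minimal sketch: the heap replaces the explicit sort-and-recombine loop, and counting merge depth replaces the backward 0/1 assignment, so it yields the code lengths rather than the codewords themselves):

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Repeatedly merge the two smallest probabilities (steps 2-4), then
    each symbol's code length is how many merges it took part in (step 5)."""
    tiebreak = count()
    # heap items: (probability, tiebreaker, symbol indices in this subtree)
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # every symbol under the merge gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, next(tiebreak), s1 + s2))
    return lengths

print(huffman_lengths([0.2, 0.3, 0.1, 0.4]))   # → [3, 2, 3, 1]
```

On the histogram of the worked example below, this reproduces the code lengths 3, 2, 3, 1 for g0..g3.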
Example

Step 1: Histogram (100 pixels)
g0 = 20/100 = 0.2
g1 = 30/100 = 0.3
g2 = 10/100 = 0.1
g3 = 40/100 = 0.4
Original Gray Level   Probability   Huffman Code
(Natural Code)
g0 = 00               0.2           010
g1 = 01               0.3           00
g2 = 10               0.1           011
g3 = 11               0.4           1
Entropy = − ∑_{i=0}^{3} p_i · log2(p_i) ≈ 1.85 bits/pixel

L_ave = ∑_{i=0}^{L−1} l_i · p_i = 3(0.2) + 2(0.3) + 3(0.1) + 1(0.4) = 1.9 bits/pixel
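Both figures can be verified numerically (a small check using the probabilities and code lengths from the table above):

```python
import math

probs   = [0.2, 0.3, 0.1, 0.4]   # g0..g3 from the histogram
lengths = [3, 2, 3, 1]           # Huffman code lengths from the table

l_ave = sum(l * p for l, p in zip(lengths, probs))
H     = -sum(p * math.log2(p) for p in probs)

print(round(l_ave, 2))   # average code length, bits/pixel
print(round(H, 3))       # entropy; L_ave >= H as expected
```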
Huffman Code
• High-probability data items are assigned short codes
• No code contains any other code as a prefix
• Reading from the left-hand bit, each code is uniquely decodable
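The prefix property makes greedy left-to-right decoding unambiguous; a minimal sketch using the code table from the example above:

```python
code = {'010': 'g0', '00': 'g1', '011': 'g2', '1': 'g3'}

def decode(bits):
    """Greedy left-to-right decoding: because no codeword is a prefix
    of another, the first match is always the correct one."""
    out, word = [], ''
    for b in bits:
        word += b
        if word in code:
            out.append(code[word])
            word = ''
    return out

print(decode('1' + '00' + '010'))   # → ['g3', 'g1', 'g0']
```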
Example: Huffman Coding
Claire motion vectors
Huffman Codes
To achieve optimum compression, a separate
code table is required for each of the two
sequences Carphone and Claire
Constrained Length
• Shortened Huffman code
[Code table: the low-probability symbols share codewords of equal length (7 bits in the table), each formed as Escape-code + Fixed-length code]
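A minimal sketch of the idea (the two-symbol Huffman table, the escape code, and the 2-bit index width are illustrative assumptions, not from the slides): frequent symbols keep true Huffman codes, while all rare symbols share one escape code followed by a fixed-length index, which caps the maximum codeword length.

```python
huffman = {'a': '0', 'b': '10'}   # frequent symbols: ordinary Huffman codes
escape  = '11'                    # shared escape prefix for everything else
rare    = ['c', 'd', 'e', 'f']    # rare symbols: escape + 2-bit fixed index

def encode(sym):
    """Shortened-Huffman encoding: every rare symbol costs exactly
    len(escape) + 2 bits, bounding the longest codeword."""
    if sym in huffman:
        return huffman[sym]
    return escape + format(rare.index(sym), '02b')

print(encode('a'), encode('b'), encode('e'))   # → 0 10 1110
```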
H.26L Universal VLC
• Uniform
• Regular structure
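The uniform, regular structure can be illustrated with an Exp-Golomb-style universal VLC (a sketch; H.26L's UVLC, which evolved into the Exp-Golomb codes of H.264, follows this pattern): the codeword for index n is M zeros followed by the (M+1)-bit binary form of n+1, so every codeword is self-delimiting from its leading-zero count alone.

```python
def exp_golomb(n):
    """Unsigned Exp-Golomb codeword for n >= 0: M zeros, then the binary
    form of n+1 (which always starts with a 1). The same rule generates
    every codeword -- the 'regular structure' of a universal VLC."""
    b = bin(n + 1)[2:]              # binary of n+1, e.g. n=3 -> '100'
    return '0' * (len(b) - 1) + b   # M leading zeros + (M+1)-bit value

for n in range(5):
    print(n, exp_golomb(n))   # 1, 010, 011, 00100, 00101
```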