
Student Name: Nguyen Dinh Hoang

Student ID: 20021361

Information Theory (INT2044E)


Assignment 3

Question 1 Problem 4
Mutual information of heads and tails.

(a) Consider a fair coin flip. What is the mutual information between the top side and the
bottom side of the coin?

(b) A fair 6-sided die is rolled. What is the mutual information between the top side and the
bottom side?

(c) What is the mutual information between the top side and the front face (the face most
facing you)?

(a) By the definition of mutual information, I(T; B) = H(B) − H(B|T). The bottom side is completely determined by the top side, so H(B|T) = 0 and I(T; B) = H(B) = log2(2) = 1 bit.

(b) As with the coin, the bottom face of the die is completely determined by the top face, so H(B|T) = 0 and I(T; B) = H(B) = log2(6) ≈ 2.58 bits.
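
As a quick numerical check of (a) and (b), the mutual information can be computed directly from the joint distribution of the top and bottom sides. This is only a minimal Python sketch, not part of the required solution; the helper name mutual_information and the convention that opposite die faces sum to 7 are my own choices.

    import math

    def mutual_information(joint):
        """I(X; Y) in bits, given a dict {(x, y): probability}."""
        px, py = {}, {}
        for (x, y), p in joint.items():
            px[x] = px.get(x, 0.0) + p
            py[y] = py.get(y, 0.0) + p
        return sum(p * math.log2(p / (px[x] * py[y]))
                   for (x, y), p in joint.items() if p > 0)

    # (a) Fair coin: the bottom is always the opposite of the top.
    coin = {("H", "T"): 0.5, ("T", "H"): 0.5}
    print(mutual_information(coin))        # 1.0 bit = log2(2)

    # (b) Fair die: opposite faces sum to 7, so bottom = 7 - top.
    die = {(t, 7 - t): 1 / 6 for t in range(1, 7)}
    print(mutual_information(die))         # log2(6) ≈ 2.585 bits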

(c) Let X denote the top face and Y the front face. All six faces are equally likely to land on top, so

H(X) = log2(6)


Given the front face Y = y, the top face can be any of the four faces other than y and its opposite, each equally likely, so for every y

H(X|Y = y) = log2(4)

and H(X|Y) is the average of H(X|Y = y) over y; since these values are all equal,

H(X|Y) = log2(4) = 2 bits.

Lastly,
I(X; Y) = H(X) − H(X|Y) = log2(6) − 2 = log2(3) − 1 ≈ 0.58 bits.
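
The same kind of check works for part (c). Assuming a standard die (opposite faces sum to 7), all 24 admissible (top, front) pairs are equally likely, and summing the mutual-information terms over them reproduces log2(3) − 1 ≈ 0.585 bits. Again, this is only a sketch of my own, not part of the solution.

    import math
    from collections import Counter

    # (c) Enumerate the 24 equally likely (top, front) pairs of a standard
    # die: the front face cannot be the top face or its opposite (7 - top).
    pairs = [(t, f) for t in range(1, 7)
                    for f in range(1, 7) if f not in (t, 7 - t)]
    p = 1 / len(pairs)                          # each pair has probability 1/24
    top_count = Counter(t for t, _ in pairs)    # 4 pairs per top value
    front_count = Counter(f for _, f in pairs)  # 4 pairs per front value
    mi = sum(p * math.log2(p / ((top_count[t] / 24) * (front_count[f] / 24)))
             for t, f in pairs)
    print(mi)                                   # log2(3) - 1 ≈ 0.585 bits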

Question 2 Problem 7
Coin flip. A fair coin is flipped until the first head occurs. Let X denote the number of flips
required.
(a) Find the entropy H(X) in bits. The following expressions may be useful:
∑_{n=0}^{∞} r^n = 1/(1 − r),        ∑_{n=0}^{∞} n·r^n = r/(1 − r)^2

If X = n, the first n − 1 flips come up tails and the n-th flip is the first head.
P(X = n) = (1/2)^(n−1) × (1/2) = (1/2)^n
And then,
H(X) = − ∑_{n=1}^{∞} (1/2)^n log2((1/2)^n)
     = ∑_{n=1}^{∞} 2^(−n) log2(2^n)
     = ∑_{n=1}^{∞} n·2^(−n)
     = ∑_{n=1}^{∞} n·(1/2)^n
     = (1/2) / (1 − 1/2)^2
     = 2 bits.
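
As a quick sanity check of H(X) = 2 bits (a small numerical sketch of my own, not part of the assignment), the series can be summed directly; truncating at n = 60 is an arbitrary cutoff that is harmless because the terms decay geometrically.

    import math

    # Entropy of the number of flips until the first head, P(X = n) = 2**(-n).
    # The infinite sum is truncated at n = 60; the neglected tail is negligible.
    H = -sum(2.0 ** -n * math.log2(2.0 ** -n) for n in range(1, 61))
    print(H)   # ≈ 2.0 bits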
