Unit I: Probability and Random Variables
DEPARTMENT OF ECE
PROBABILITY AND RANDOM PROCESSES
________________________________________________________
PART A
1. From a bag containing 3 red and 2 black balls, two balls are drawn at
random. Find the probability that they are of the same color.
P(same colour) = (3C2 + 2C2) / 5C2 = (3 + 1) / 10 = 4/10 = 2/5
2. When A and B are two mutually exclusive events such that P(A) = 1/2 and
P(B) = 1/3, find P(A ∪ B) and P(A ∩ B).
When A and B are mutually exclusive, P(A ∪ B) = P(A) + P(B) = 1/2 + 1/3 = 5/6
P(A ∩ B) = 0
3. Conditional probability:
The conditional probability of an event B, given that event A has occurred, is
denoted by P(B/A) and defined as P(B/A) = n(A∩B) / n(A) = P(A∩B) / P(A),
provided P(A) ≠ 0.
4. There are 3 unbiased coins and 1 biased coin with head on both sides.
A coin is chosen at random and tossed 4 times. If head occurs all the 4
times, what is the probability that the biased coin has been chosen?
Let A1 be the event of choosing the biased coin and A2 be the event of
choosing an unbiased coin. P(A1) = 1/4, P(A2) = 3/4. Let B be the event of getting 4
heads when the randomly chosen coin is tossed 4 times. P(B/A1) = 1,
P(B/A2) = 4C4 (1/2)^4 (1/2)^0 = 1/16
P(B) = P(A1) P(B/A1) + P(A2) P(B/A2) = (1/4)(1) + (3/4)(1/16) = 0.3125
By Bayes' theorem, P(A1/B) = P(A1) P(B/A1) / P(B) = 0.25 / 0.3125 = 0.8
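The Bayes calculation above can be double-checked numerically; a minimal Python sketch (the priors and likelihoods are taken directly from the solution):

```python
# Numerical check of the Bayes' theorem calculation above (illustrative only).
p_biased, p_fair = 1/4, 3/4      # priors for choosing the biased / an unbiased coin
lik_biased = 1.0                 # P(4 heads | biased coin)
lik_fair = (1/2) ** 4            # P(4 heads | unbiased coin) = 1/16

p_heads4 = p_biased * lik_biased + p_fair * lik_fair   # total probability P(B)
posterior = p_biased * lik_biased / p_heads4           # P(biased | 4 heads)
print(p_heads4)    # 0.3125
print(posterior)   # 0.8
```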
P(X < 4) = ∫₂⁴ f(x) dx = ∫₂⁴ (2/27)(x + 1) dx = (2/27) [x²/2 + x]₂⁴ = 16/27
8. If the moment generating function MX(t) = (1/3) e^t + (4/15) e^(3t) + (2/15) e^(4t) + (4/15) e^(5t),
find the probability mass function of X.
X          1     2    3     4     5
P(X = x)   1/3   0    4/15  2/15  4/15
Soln:
Given mean np = 6 and variance npq = 2: 6q = 2, so q = 1/3 and p = 1 - q = 1 - 1/3 = 2/3,
n = 6 / p = 9.
12. Find p for a binomial variate, if n = 6 and 9 P(X = 4) = P(X = 2).
Soln:
9 × 6C4 p^4 q^2 = 6C2 p^2 q^4
9 p² = q²
9 p² = (1 - p)², since q = 1 - p
9 p² = 1 + p² - 2p
8 p² + 2p - 1 = 0 ⇒ (4p - 1)(2p + 1) = 0 ⇒ p = 1/4 (taking the admissible root 0 < p < 1).
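The quadratic and the original condition can be verified numerically; a short sketch using only the Python standard library:

```python
import math

# Solve 8p^2 + 2p - 1 = 0 and keep the admissible root 0 < p < 1.
a, b, c = 8, 2, -1
disc = math.sqrt(b * b - 4 * a * c)
roots = ((-b + disc) / (2 * a), (-b - disc) / (2 * a))
p = next(r for r in roots if 0 < r < 1)
q = 1 - p

# Check the original condition 9 P(X = 4) = P(X = 2) for n = 6.
lhs = 9 * math.comb(6, 4) * p**4 * q**2
rhs = math.comb(6, 2) * p**2 * q**4
print(p, abs(lhs - rhs) < 1e-12)   # 0.25 True
```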
Find P(X = 1), given MX(t) = e^(3(e^t - 1)).
This is the MGF of a Poisson distribution with λ = 3.
P(X = x) = e^(-λ) λ^x / x!, x = 0, 1, 2, ......
P(X = 1) = e^(-3) · 3 = 0.1494
MX(t) = E(e^(tX)) = Σ (x = 0 to ∞) e^(tx) P(X = x)
      = Σ (x = 0 to ∞) e^(tx) p q^x
      = p Σ (x = 0 to ∞) (q e^t)^x
      = p [1 - q e^t]^(-1)
      = p / (1 - q e^t)
Here p = 1/5, q = 4/5.
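The closed form p / (1 - q e^t) can be compared against a truncated version of the series above; a small sketch with the values p = 1/5, q = 4/5 (any t with q e^t < 1 works):

```python
import math

p, q, t = 1/5, 4/5, 0.1                      # t chosen so that q * exp(t) < 1

# Truncated series  sum_{x>=0} e^(tx) p q^x  versus the closed form.
series = sum(math.exp(t * x) * p * q**x for x in range(2000))
closed_form = p / (1 - q * math.exp(t))
print(series, closed_form)                   # the two values agree
```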
16. Determine the distribution whose MGF is MX(t) = (1/3) e^t [e^(-t) - (2/3)]^(-1)
Soln:
18. Find the MGF of a R.V. which is uniformly distributed over (-1, 2)
Soln:
MX(t) = (1/3) ∫₋₁² e^(tx) dx = (e^(2t) - e^(-t)) / (3t), t ≠ 0
MX(0) = (1/3) ∫₋₁² dx = 1, for t = 0
19. If X has uniform distribution in (-3, 3), find P[ |X - 2| < 2 ].
P[ |X - 2| < 2 ] = P(0 < X < 4) = (1/6) ∫₀³ dx = 1/2 (the density is 1/6 only up to x = 3).
PART B QUESTIONS
X       0    1    2    3    4    5     6     7
P(X)    0    k    2k   2k   3k   k²    2k²   7k² + k
Find (i) the value of k, (ii) P[ 1.5 < X < 4.5 / X > 2 ] and (iii) the smallest value of λ
for which P(X ≤ λ) > (1/2).
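A short numerical sketch (not part of the question) that finds k from the normalisation 10k² + 9k = 1 and then reads off the smallest λ with P(X ≤ λ) > 1/2; numpy is assumed to be available:

```python
import numpy as np

# P(X = x) for x = 0..7 in terms of k, taken from the table above.
def pmf(k):
    return np.array([0, k, 2*k, 2*k, 3*k, k**2, 2*k**2, 7*k**2 + k])

# Total probability = 10k^2 + 9k = 1  ->  the positive root gives k.
k = [r for r in np.roots([10, 9, -1]) if r > 0][0]
p = pmf(k)
print(k, p.sum())              # 0.1  1.0

# Smallest lambda with P(X <= lambda) > 1/2.
cdf = p.cumsum()
print(np.argmax(cdf > 0.5))    # 4
```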
2. A bag contains 5 balls and it is not known how many of them are white. Two
balls are drawn at random from the bag and they are found to be white. What
is the chance that all the balls in the bag are white?
3. Let the random variable X have the PDF f(x) = (1/2) e^(-x/2), x > 0. Find the
moment generating function, mean and variance.
4. A die is tossed until 6 appears. What is the probability that it must be tossed
more than 4 times?
5. A man draws 3 balls from an urn containing 5 white and 7 black balls. He
gets Rs. 10 for each white ball and Rs 5 for each black ball. Find his
expectation.
6. In a certain binary communication channel, owing to noise, the probability
that a transmitted zero is received as zero is 0.95 and the probability that
a transmitted one is received as one is 0.9. If the probability that a zero is
transmitted is 0.4, find the probability that (i) a one is received (ii) a one
was transmitted given that one was received
7. Find the MGF and rth moment for the distribution whose PDF is f(x) = k e –x
, x >0. Find also standard deviation.
8. The first bag contains 3 white balls, 2 red balls and 4 black balls. Second
bag contains 2 white, 3 red and 5 black balls and third bag contains 3
white, 4 red and 2 black balls. One bag is chosen at random and from it 3
balls are drawn. Out of 3 balls, 2 balls are white and 1 is red. What are the
probabilities that they were taken from first bag, second bag and third bag.
9. A random variable X has the PDF f(x) = 2x, 0 < x < 1 find (i) P (X < ½) (ii) P
( ¼ < X < ½) (iii) P ( X > ¾ / X > ½ )
10. If the density function of a continuous random variable X is given by
         ax,        0 ≤ x ≤ 1
f(x) =   a,         1 ≤ x ≤ 2
         3a - ax,   2 ≤ x ≤ 3
         0,         otherwise
14. Find the moment generating function of the geometric random variable
with the pdf f(x) = p q x-1, x = 1,2,3.. and hence find its mean and
variance.
15. A box contains 5 red and 4 white balls. A ball from the box is taken out
at random and kept outside. If once again a ball is drawn from the box,
what is the probability that the drawn ball is red?
16. A discrete random variable X has moment generating function
MX(t) = ((1/4) + (3/4) e^t)^5. Find E(X), Var(X) and P(X = 2).
17. The pdf of the samples of the amplitude of a speech waveform is found
to decay exponentially at rate α, so the following pdf is proposed:
f(x) = C e^(-α|x|), -∞ < x < ∞. Find C and E(X).
18. Find the MGF of a binomial distribution and hence find the mean and
variance.
Find the recurrence relation of the central moments for a binomial distribution.
22. If X and Y are two independent Poisson random variables, then show that the
conditional distribution of X given X + Y follows a binomial distribution.
23. Find MGF and hence find mean and variance of a geometric distribution.
PART-A
fY(y) = fX(x) |dx/dy| = e^(-e^y) · e^y, -∞ < y < ∞
∑ ∑ P(x,y) = 1 ⇒ 72 k = 1 ⇒ k = 1/ 72
Marginal distribution of X :
X 0 1 2
P(X = x) 18/72 24/72 30/72
6. If f(x,y) = k (1 – x – y), 0 < x,y < ½ . Find K.
K ∫₀^(1/2) ∫₀^(1/2) (1 - x - y) dx dy = 1 ⇒ K ∫₀^(1/2) ((1 - y)/2 - 1/8) dy = 1
⇒ K [3y/8 - y²/4]₀^(1/2) = 1 ⇒ K/8 = 1 ⇒ K = 8
7. If X and Y are independent random variables with variances 2 and 3. Find Var(
3X + 4Y)
Var(3X + 4Y) = 3² Var(X) + 4² Var(Y) = 9 × 2 + 16 × 3 = 66
fX(x) · fY(y) = (x + ½)(y + ½) ≠ f(x,y), so X and Y are not independent.
f(y/x) = f(x,y) / fX(x) = (1/8) x (x - y) / (x³/4) = (x - y) / (2x²), -x < y < x
11. The joint pdf of (X,Y) is f(x,y) = 6 e^(-2x - 3y), x ≥ 0, y ≥ 0. Find the conditional
density of Y given X.
Soln:
Marginal of X: f(x) = ∫₀^∞ 6 e^(-2x - 3y) dy = 2 e^(-2x), x > 0.
Conditional of Y given X: f(y/x) = 6 e^(-2x) e^(-3y) / (2 e^(-2x)) = 3 e^(-3y), y ≥ 0.
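As a quick cross-check of the marginal and conditional densities above, the y-integration can be done numerically; a sketch assuming scipy is available:

```python
import numpy as np
from scipy.integrate import quad

joint = lambda x, y: 6 * np.exp(-2 * x - 3 * y)

x = 0.7                                       # any fixed x > 0
marginal, _ = quad(lambda y: joint(x, y), 0, np.inf)
print(marginal, 2 * np.exp(-2 * x))           # both ~ 0.493

# Conditional density of Y given X = x should be 3 e^(-3y).
y = 1.3
print(joint(x, y) / marginal, 3 * np.exp(-3 * y))
```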
12. State the basic properties of joint distribution of (X,Y) when X and Y are
random variable.
Soln.
F(x,y) = P(X ≤ x, Y ≤ y)
F(∞, ∞) = 1.
F(∞, y) = Marginal distribution of Y.
F(x, ∞) = Marginal distribution of X.
F(-∞, y) = 0, F(x, -∞) = 0.
13. If two random variables have the joint density f(x₁, x₂) = x₁x₂, 0 < x₁ < 1,
0 < x₂ < 2, find the probability that both random variables will take on values less
than 1.
Soln:
P[X₁ ≤ 1, X₂ ≤ 1] = ∫₀¹ ∫₀¹ x₁x₂ dx₂ dx₁ = ∫₀¹ x₁ (x₂²/2)|₀¹ dx₁ = ∫₀¹ (x₁/2) dx₁ = 1/4
14. If X and Y are independent random variables find covariance between X+Y
and X – Y.
Soln.
Cov(X+Y, X-Y) = E[{(X+Y) - E(X+Y)}{(X-Y) - E(X-Y)}]
= E[{X - E(X) + Y - E(Y)}{(X - E(X)) - (Y - E(Y))}]
= E[{X - E(X)}² - {Y - E(Y)}²]
= Var X - Var Y.
15. If X and Y are independent random variable with variances 2 and 3, find Var
(3X +4Y).
Soln.
Var(3X + 4Y) = 3² Var(X) + 4² Var(Y) = 9 × 2 + 16 × 3 = 66.
16. If the joint pdf of (X,Y) is given by f(x,y) = e^(-(x + y)), x ≥ 0, y ≥ 0, find E(XY).
Since f(x,y) factors as e^(-x) · e^(-y), X and Y are independent exponential variables
with mean 1, so E(XY) = E(X) E(Y) = 1.
Note that independence implies Cov(X,Y) = 0, but the converse is not true. Consider
X ~ N(0,1) and Y = X². Since X ~ N(0,1), E(X) = 0 and E(X³) = E(XY) = 0, since all odd
moments vanish. Cov(X,Y) = 0 but X and Y are not independent.
22. Find the value of k, if f(x,y) = k ( 1- x) ( 1- y) for 0 < x,y < 1 is to be a joint
density function.
We know ∫₀¹ ∫₀¹ f(x,y) dx dy = 1 ⇒ k ∫₀¹ ∫₀¹ (1 - x)(1 - y) dx dy = 1
⇒ k ∫₀¹ (1 - y) [x - x²/2]₀¹ dy = 1
⇒ (k/2) ∫₀¹ (1 - y) dy = 1
⇒ (k/2) [y - y²/2]₀¹ = 1
⇒ k/4 = 1 ⇒ k = 4
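The normalisation with k = 4 can be confirmed numerically; a minimal sketch assuming scipy is available:

```python
from scipy.integrate import dblquad

# f(x, y) = k (1 - x)(1 - y) over the unit square; with k = 4 it should integrate to 1.
k = 4
total, _ = dblquad(lambda y, x: k * (1 - x) * (1 - y), 0, 1, 0, 1)
print(total)   # 1.0 (up to numerical error)
```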
Part- B
1. If f(x, y) = x + y, 0 < x < 1, 0 < y < 1
             = 0,     otherwise,
compute the correlation coefficient between X and Y.
2. The joint p.d.f of a two-dimensional random variable (X, Y) is given by f(x, y) =
(8/9) xy, 1 ≤ x ≤ y ≤ 2. Find the marginal density functions of X and Y. Find also
the conditional density functions of Y / X = x and X / Y = y.
5. If two random variables have the joint p.d.f. f(x₁, x₂) = (2/3)(x₁ + 2x₂), 0 < x₁ < 1,
0 < x₂ < 1.
6. Find the value of k, if f(x,y) = k xy for 0 < x,y < 1 is to be a joint density
function. Find P(X + Y < 1 ) . Are X and Y independent.
7. If two random variables have the joint p.d.f. f(x, y) = (6/5)(x + y²), 0 < x < 1,
0 < y < 1, find P(0.2 < X < 0.5) and P(0.4 < Y < 0.6).
9. X and Y are two random variables with joint p.d.f. f(x, y) = 4xy e^(-(x² + y²)), x, y ≥ 0.
Find the p.d.f. of X² + Y².
10. Two random variables X and Y have the joint p.d.f. f(x, y) = 2 - x - y, 0 < x < 1,
0 < y < 1. Find the marginal probability density functions of X and Y. Also find the
conditional density functions and the covariance between X and Y.
11. Let X and Y be two random variables each taking three values -1, 0
and 1 and having the joint p.d.f.

            X
  Y      -1    0     1
 -1       0   0.1   0.1
  0      0.2  0.2   0.2
  1       0   0.1   0.1

Prove that X and Y have different expectations. Also prove that X and Y are
uncorrelated, and find Var X and Var Y.
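This is not a proof, but the expectations, covariance and variances implied by the table can be checked numerically (rows indexed by Y = -1, 0, 1 and columns by X = -1, 0, 1, as reconstructed above):

```python
import numpy as np

vals = np.array([-1, 0, 1])
# Rows: Y = -1, 0, 1; columns: X = -1, 0, 1.
P = np.array([[0.0, 0.1, 0.1],
              [0.2, 0.2, 0.2],
              [0.0, 0.1, 0.1]])

px = P.sum(axis=0)                 # marginal of X
py = P.sum(axis=1)                 # marginal of Y
EX, EY = vals @ px, vals @ py
EXY = sum(P[i, j] * vals[i] * vals[j] for i in range(3) for j in range(3))

print(EX, EY)                      # 0.2  0.0   -> different expectations
print(EXY - EX * EY)               # 0.0        -> uncorrelated
print(vals**2 @ px - EX**2, vals**2 @ py - EY**2)   # Var X = 0.56, Var Y = 0.4
```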
12. 20 dice are thrown. Find the approximate probability that the sum obtained is
between 65 and 75, using the central limit theorem.
13. Examine whether the variables X and Y are independent, whose joint density
is f(x, y) = x e^(-xy - x), 0 < x, y < ∞.
14. Let X and Y be independent standard normal random variables. Find the pdf
of z =X / Y.
15. Let X and Y be independent uniform random variables over (0,1) . Find the
PDF of Z = X + Y
UNIT-III CLASSIFICATION OF RANDOM PROCESSES
Example:
X(t) = A cos(ωt + θ), where A and ω are constants and θ is a random variable.
2. State the four types of stochastic processes.
Discrete time, discrete state
Discrete time, continuous state
Continuous time, discrete state
Continuous time, continuous state
(2) |R(Z)| ≤ R(0)
μX² = lim (Z→∞) R(Z)
19. Given that the autocorrelation function for a stationary ergodic process
with no periodic components is R(Z) = 25 + 4 / (1 + 6Z²), find the mean and
variance of the process X(t).
Soln:
μX² = lim (Z→∞) R(Z) = 25 ⇒ μX = 5
E[X²(t)] = RXX(0) = 25 + 4 = 29
Var[X(t)] = E[X²(t)] - {E[X(t)]}² = 29 - 25 = 4
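The same mean and variance can be recovered symbolically from R(Z); a short sketch assuming sympy is available:

```python
import sympy as sp

Z = sp.symbols('Z', real=True)
R = 25 + 4 / (1 + 6 * Z**2)          # the ACF given above

mean_sq = sp.limit(R, Z, sp.oo)      # (mean)^2 = lim R(Z)
EX2 = R.subs(Z, 0)                   # E[X^2] = R(0)
print(mean_sq, sp.sqrt(mean_sq))     # 25 5
print(EX2, EX2 - mean_sq)            # 29 4   (second moment, variance)
```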
20. Find the mean and variance of the stationary process X(t) whose
A.C.F. is R(Z) = (25Z² + 36) / (6.25Z² + 4).
Soln:
μX² = lim (Z→∞) R(Z) = lim (Z→∞) (25 + 36/Z²) / (6.25 + 4/Z²) = 25/6.25 = 4 ⇒ μX = 2
E[X²(t)] = RXX(0) = 36/4 = 9
Var[X(t)] = 9 - 4 = 5
21. A stationary random process {X(t)} with mean 4 has ACF R(Z) = 16 + 9 e^(-|Z|).
Find the standard deviation of the process.
E[X²(t)] = RXX(0) = 16 + 9 = 25
Var[X(t)] = 25 - 16 = 9
Standard deviation = 3
22. Define the cross-correlation function and state any two of its properties.
Soln:
RXY(t₁, t₂) = E[X(t₁) Y(t₂)]
(1) RXY(-Z) = RYX(Z)
(2) |RXY(Z)| ≤ √(RXX(0) RYY(0)) ≤ (1/2)[RXX(0) + RYY(0)]
23. When is a random process said to be ergodic? Give an example.
A random process X(t) is ergodic if its ensemble averages are equal to the
appropriate time averages.
26. Given the R.P. {X(t)} = A cos ω₀t + B sin ω₀t, where ω₀ is a constant and
A and B are uncorrelated zero-mean random variables having different
densities but the same variance σ². Is {X(t)} wide-sense stationary?
E[X(t)] = 0
E[X²(t)] = σ²
and RXX(t₁, t₂) = σ² cos ω₀(t₁ - t₂)
Hence {X(t)} is wide-sense stationary.
reached from every other state. When this condition is satisfied, the Markov Chain
is said to be irreducible
Theorem:
If P is the t.p.m. of a homogeneous Markov chain, then the n-step t.p.m. P^(n) is
equal to P^n.
32. What is a Markov chain? When can you say that a Markov chain is
homogeneous?
A discrete-state stochastic process {X(t), t ∈ T} is a Markov chain if
P[X(t) = x / X(tₙ) = xₙ, X(tₙ₋₁) = xₙ₋₁, ......, X(t₀) = x₀] = P[X(t) = x / X(tₙ) = xₙ].
If it has the property of invariance with respect to the time origin,
(i.e.) P[X(t) = x / X(tₙ) = xₙ] = P[X(t - tₙ) = x / X(0) = xₙ],
then the Markov chain is homogeneous.
E[X(t)] = (1/2π) [sin(ω₀t + 2π) - sin ω₀t] = (1/2π) [sin ω₀t - sin ω₀t] = 0
E[X²(t)] = 1/2
The process is stationary, as the first and second moments are independent of
time.
34. The one-step t.p.m. of a Markov chain with states 0 and 1 is given as
P = | 0  1 |
    | 1  0 |
Draw the transition diagram. Is it an irreducible Markov chain?
Yes, it is irreducible, because each state can be reached from every other
state.
35. Three boys A, B, C are throwing a ball to each other. A always throw the
ball to B and B always throws the ball to C but C is just as likely to throw
the ball B as to A. Find the transition Matrix and show that the process is
Markovian.
The t.p.m. of {Xₙ} is given by
         A    B    C
    A    0    1    0
P = B    0    0    1
    C   1/2  1/2   0
(rows: present thrower, columns: next thrower).
37. Prove that the matrix
    | 0    1    0 |
P = | 0    0    1 |
    | 1/2  1/2  0 |
is the t.p.m. of an irreducible Markov chain.

     | 0    0    1   |
P² = | 1/2  1/2  0   |
     | 0    1/2  1/2 |

     | 1/2  1/2  0   |
P³ = | 0    1/2  1/2 |
     | 1/4  1/4  1/2 |

P₁₁^(3) > 0, P₁₃^(2) > 0, P₂₁^(2) > 0, P₂₂^(2) > 0, P₃₃^(2) > 0 and all the other Pᵢⱼ^(1) > 0,
so every state can be reached from every other state and the chain is irreducible.
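The matrix powers used above are easy to reproduce, and irreducibility can be checked by verifying that P + P² + P³ has no zero entry; a sketch assuming numpy is available:

```python
import numpy as np

P = np.array([[0, 1, 0],
              [0, 0, 1],
              [0.5, 0.5, 0]])

P2 = P @ P
P3 = P2 @ P
print(P2)                          # matches P^2 above
print(P3)                          # matches P^3 above
print(np.all(P + P2 + P3 > 0))     # True -> every state reaches every other state
```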
38. State the postulates of a Poisson process.
Let X(t) = the number of times an event A, say, has occurred up to time t, so that
the sequence {X(t), t ≥ 0} forms a Poisson process with parameter λ.
(i) Events occurring in non-overlapping intervals are independent of each other.
(ii) P[X(t) = 1 for t in (x, x + h)] = λh + o(h)
(iii) P[X(t) = 0 for t in (x, x + h)] = 1 - λh + o(h)
(iv) P[X(t) = 2 or more for t in (x, x + h)] = o(h)
39. State any two properties of Poisson Process.
(i) The Poisson process is a Markov Process.
(ii) Sum of two independent Poisson processes is a Poisson process.
(iii) The difference of two independent Poisson processes is not a
Poisson process.
40. What will be the superposition of n independent Poisson processes with
respective average rates λ₁, λ₂, ....., λₙ?
The superposition of n independent Poisson processes with average
rates λ₁, λ₂, ....., λₙ is another Poisson process with average rate
λ₁ + λ₂ + ..... + λₙ.
41. If customers arrive at a counter in accordance with a Poisson
process with a mean rate of 2 per minute, find the probability that the
interval between two consecutive arrivals is more than one minute.
The interval T between two consecutive arrivals follows an exponential
distribution with parameter λ = 2, so P(T > 1) = e^(-2) = 0.1353.
42. Let X(t) be a Poisson process with rate λ. Find the correlation function of X(t).
For Z > 0,
E[X(t) · X(t + Z)] = E{X(t) [X(t + Z) - X(t) + X(t)]}
= E[X(t)] E[X(t + Z) - X(t)] + E[X²(t)]
= λt (λZ) + λ²t² + λt
= λ²tZ + λ²t² + λt
44. Show that the Poisson process is not covariance stationary.
P[X(t) = r] = e^(-λt) (λt)^r / r!, r = 0, 1, 2, ....
E[X(t)] = λt, which is not a constant. Hence the process is not covariance
stationary.
45. A bank receives on the average λ = 6 bad checks per day. What are the
probabilities that it will receive (i) 4 bad checks on any given day, (ii) 10
bad checks over any two consecutive days?
P[X(t) = n] = e^(-λt) (λt)^n / n! = e^(-6t) (6t)^n / n!, n = 0, 1, 2, ....
(i) P[X(1) = 4] = e^(-6) (6)^4 / 4! = 0.1338
(ii) P[X(2) = 10] = e^(-12) (12)^10 / 10! = 0.1048
46. Customers arrive at a large store randomly at an average rate of 240 per
hour. What is the probability that during a two-minute interval no one will
arrive?
Since λ = 240/60 = 4 per minute,
P[X(t) = n] = e^(-4t) (4t)^n / n!, n = 0, 1, 2, ...
P[X(2) = 0] = e^(-8) = 0.0003
47. The number of arrivals at the regional computer centre express service
counter between 12 noon and 3 pm has a Poisson distribution with a
mean of 1.2 per minute. Find the probability of no arrivals during a given
1-minute interval.
P[X(t) = n] = e^(-1.2t) (1.2t)^n / n!, n = 0, 1, 2, ...
P[X(1) = 0] = e^(-1.2) = 0.3012
48. Define a Gaussian (or normal) process. (Or) When is a random process
said to be normal?
A real-valued R.P. X(t) is called a Gaussian process or normal
process if the random variables X(t₁), X(t₂), ......, X(tₙ) are jointly normal for
every n and for every set of instants t₁, t₂, ..., tₙ.
52. Given a normal process X(t) with zero mean and RXX(z) = 4 e^(-2|z|),
find Var[X(t)].
Var[X(t)] = RXX(0) - {E[X(t)]}² = 4 e^0 - 0 = 4
53. If X(t) is a normal process with C(t₁, t₂) = 4 e^(-0.5|z|), where z = t₁ - t₂,
what is the variance of X(5)?
Var[X(5)] = C(5, 5) = 4 e^(-0.5(0)) = 4
54. If X(t) is a normal process with μ(t) = 3 and C(t₁, t₂) = 4 e^(-0.2|t₁ - t₂|),
P[X(t) = n] = (at)^(n-1) / (1 + at)^(n+1), n = 1, 2, ...
            = at / (1 + at),               n = 0
Show that it is not stationary.
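A quick numerical illustration of the non-stationarity (with an arbitrary illustrative value a = 0.5): the first moment stays at 1 for every t, while the second moment equals 1 + 2at and grows with t, so the distribution cannot be independent of the time origin.

```python
def moments(a, t, N=5000):
    """First and second moments of the pmf above, by direct (truncated) summation."""
    r = a * t / (1 + a * t)          # common ratio, < 1
    c = 1 / (1 + a * t) ** 2         # (at)^(n-1) / (1+at)^(n+1) = c * r^(n-1)
    EX = sum(n * c * r ** (n - 1) for n in range(1, N))
    EX2 = sum(n * n * c * r ** (n - 1) for n in range(1, N))
    return EX, EX2

for t in (1.0, 2.0, 5.0):
    EX, EX2 = moments(a=0.5, t=t)
    print(t, round(EX, 4), round(EX2, 4))   # E[X(t)] = 1 always, E[X^2(t)] = 1 + 2at
```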