
SRINIVASAN ENGINEERING COLLEGE-PERAMBALUR

DEPARTMENT OF ECE
PROBABILITY AND RANDOM PROCESSES
________________________________________________________

UNIT I: PROBABILITY AND RANDOM VARIABLES

PART A

1. From a bag containing 3 red and 2 black balls, two balls are drawn at
random. Find the probability that they are of the same colour.

P(same colour) = (3C2 + 2C2) / 5C2 = (3 + 1) / 10 = 4/10 = 2/5

2. When A and B are two mutually exclusive events such that P(A) = 1/2 and
P(B) = 1/3, find P(A ∪ B) and P(A ∩ B).

When A and B are mutually exclusive, P(A ∪ B) = P(A) + P(B) = 1/2 + 1/3 = 5/6
P(A ∩ B) = 0

3. Conditional probability:
The conditional probability of an event B, assuming that event A has
happened, is denoted by P(B/A) and defined as P(B/A) = n(A∩B) / n(A) = P(A∩B) /
P(A), provided P(A) ≠ 0.
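
As a quick cross-check of this definition, the sketch below (hypothetical events on two fair dice, chosen only for illustration) computes P(B/A) both as n(A∩B)/n(A) and as P(A∩B)/P(A):

```python
# Sketch: P(B/A) = n(A ∩ B) / n(A) over equally likely outcomes (two fair dice).
# The event choice (A: first die shows 4, B: sum is 9) is illustrative only.
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
A = [w for w in outcomes if w[0] == 4]
A_and_B = [w for w in A if w[0] + w[1] == 9]

print(len(A_and_B) / len(A))                # n(A∩B)/n(A) = 1/6
print((len(A_and_B) / 36) / (len(A) / 36))  # P(A∩B)/P(A) = 1/6, same value
```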

4. There are 3 unbiased coins and 1 biased coin with head on both sides.
A coin is chosen at random and tossed 4 times. If head occurs all the 4
times, what is the probability that the biased coin has been chosen?

Let A1 be the event of choosing the biased coin and A2 be the event of
choosing an unbiased coin. P(A1) = 1/4, P(A2) = 3/4. Let B be the event of
getting 4 heads when the randomly chosen coin is tossed 4 times.
P(B/A1) = 1
P(B/A2) = 4C4 (1/2)^4 (1/2)^0 = 1/16
P(B) = P(A1) P(B/A1) + P(A2) P(B/A2) = (1/4)(1) + (3/4)(1/16) = 0.3125
By Bayes' theorem, P(A1/B) = P(A1) P(B/A1) / P(B) = 0.25 / 0.3125 = 0.8
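
A Monte Carlo sanity check of this answer is sketched below (a verification aid, not part of the original solution); it estimates both P(B) and the posterior P(A1/B) by simulation:

```python
# Sketch: simulate choosing 1 of 4 coins (one two-headed) and tossing 4 times.
# Exact values: P(B) = 0.3125 and P(A1/B) = 0.25/0.3125 = 0.8.
import random

trials = 200_000
four_heads = biased_and_four_heads = 0
for _ in range(trials):
    biased = random.randrange(4) == 0              # P(biased) = 1/4
    p_head = 1.0 if biased else 0.5
    if all(random.random() < p_head for _ in range(4)):
        four_heads += 1
        biased_and_four_heads += biased
print(four_heads / trials)                  # ≈ 0.3125
print(biased_and_four_heads / four_heads)   # ≈ 0.8
```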

5. Given the probability density function of a continuous random variable X
as f(x) = 6x(1 − x), 0 < x < 1, find the cumulative distribution function.

CDF F(x) = ∫[0,x] f(t) dt = ∫[0,x] 6t(1 − t) dt = 6(x^2/2 − x^3/3) = 3x^2 − 2x^3, 0 < x < 1

6. A continuous random variable X can assume any value between x = 2 and
x = 5 and has the density function f(x) = k(x + 1). Find P(X < 4).

∫[2,5] f(x) dx = 1 ⇒ ∫[2,5] k(x + 1) dx = k(x^2/2 + x)|[2,5] = k(27/2) = 1 ⇒ k = 2/27

P(X < 4) = ∫[2,4] f(x) dx = (2/27) ∫[2,4] (x + 1) dx = (2/27)(x^2/2 + x)|[2,4] = 16/27
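
Both the constant k and the probability can be verified symbolically; a small sympy sketch (verification only, not part of the original working):

```python
# Sketch: verify k = 2/27 and P(X < 4) = 16/27 for f(x) = k(x + 1) on (2, 5).
import sympy as sp

x, k = sp.symbols('x k', positive=True)
k_val = sp.solve(sp.integrate(k * (x + 1), (x, 2, 5)) - 1, k)[0]
print(k_val)                                     # 2/27
print(sp.integrate(k_val * (x + 1), (x, 2, 4)))  # 16/27
```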

7. Four persons are chosen at random from a group containing 3 men, 2
women and 4 children. Show that the chance that exactly two of them will be
children is 10/21.

Let A be the event of selecting 2 children, 1 woman and 1 man: n(A) = 4C2 · 2C1 · 3C1 = 36
Let B be the event of selecting 2 children and 2 men: n(B) = 4C2 · 3C2 = 18
Let C be the event of selecting 2 children and 2 women: n(C) = 4C2 · 2C2 = 6
P(exactly 2 children) = P(A∪B∪C) = P(A) + P(B) + P(C) = (36 + 18 + 6) / 9C4 = 60/126 = 10/21

8. If the moment generating function is MX(t) = (1/3)e^t + (4/15)e^3t + (2/15)e^4t +
(4/15)e^5t, find the probability mass function of X.

X          1     2    3      4      5
P(X = x)   1/3   0    4/15   2/15   4/15

9. Suppose MX(t) = (0.4e^t + 0.6)^8. Find the MGF of Y = 3X + 2.

MY(t) = e^2t MX(3t) = e^2t (0.4e^3t + 0.6)^8
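
The rule used here, MY(t) = e^bt MX(at) for Y = aX + b, can be checked by differentiating MY at t = 0 and comparing with E(Y) = 3E(X) + 2; a sympy sketch (illustration only, not part of the original answer):

```python
# Sketch: E(Y) from M_Y(t) = e^{2t} M_X(3t), with the binomial MGF above.
import sympy as sp

t = sp.symbols('t')
M_Y = sp.exp(2 * t) * (sp.Rational(2, 5) * sp.exp(3 * t) + sp.Rational(3, 5)) ** 8
# E(X) = np = 8 * 0.4 = 3.2, so E(Y) = 3 * 3.2 + 2 = 11.6
print(sp.diff(M_Y, t).subs(t, 0))   # 58/5 = 11.6
```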

10. Define MGF. Why is it called so?

MGF is defined as MX(t) = E(e^tX). It generates the moments of the
random variable X, which is why it is called the moment generating function.

11. For a binomial distribution the mean is 6 and the standard deviation is √2.
Find the first two terms of the distribution.

Soln:

For a binomial distribution,

Mean = np = 6
S.D. = √(npq) = √2, so npq = 2

6q = 2 ⇒ q = 1/3, p = 1 − q = 1 − 1/3 = 2/3
n = 6/p = 9

P(X = x) = nCx p^x q^(n−x), x = 0, 1, 2, ..., n

P(X = 0) = 9C0 (2/3)^0 (1/3)^9 ≈ 0.00005

P(X = 1) = 9C1 (2/3) (1/3)^8 ≈ 0.0009
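
These two probabilities are easy to confirm with the standard library (a verification sketch; it also shows the corrected value of P(X = 0)):

```python
# Sketch: P(X = 0) and P(X = 1) for Bin(n = 9, p = 2/3).
from math import comb

n, p = 9, 2 / 3
pmf = lambda x: comb(n, x) * p**x * (1 - p)**(n - x)
print(pmf(0))   # ≈ 0.0000508
print(pmf(1))   # ≈ 0.000914
```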

12. Find p for a binomial variate, if n = 6 and 9P(X = 4) = P(X = 2).

Soln:

P(X = x) = 6Cx p^x q^(6−x). Given 9P(X = 4) = P(X = 2),

9 · 6C4 p^4 q^2 = 6C2 p^2 q^4

Since 6C4 = 6C2, 9p^2 = q^2

9p^2 = (1 − p)^2, since q = 1 − p

9p^2 = 1 + p^2 − 2p

8p^2 + 2p − 1 = 0

[p + (1/2)][p − (1/4)] = 0 ⇒ p = 1/4, rejecting the negative root.

13. Determine the distribution whose M.G.F is MX(t) = e^(3(e^t − 1)). Also
find P(X = 1).

W.K.T. the M.G.F of a Poisson distribution is

MX(t) = e^(λ(e^t − 1))

Given MX(t) = e^(3(e^t − 1)), so λ = 3.

P(X = x) = e^(−λ) λ^x / x!, x = 0, 1, 2, ...

P(X = x) = e^(−3) 3^x / x!, x = 0, 1, 2, ...

P(X = 1) = e^(−3) 3 / 1! = 3e^(−3)
14. Find the M.G.F of the geometric distribution.

MX(t) = E(e^tX)
      = Σ(x=0 to ∞) e^tx P(X = x)
      = Σ(x=0 to ∞) e^tx p q^x
      = p Σ(x=0 to ∞) (q e^t)^x
      = p [1 − q e^t]^(−1)
      = p / (1 − q e^t), for q e^t < 1

15. Identify the distribution with MGF MX(t) = (5 − 4e^t)^(−1).

MX(t) = (1/5) / [1 − (4/5)e^t]

W.K.T. the MGF of the geometric distribution is MX(t) = p / (1 − q e^t).

∴ The given MGF is the MGF of a geometric distribution with parameters
p = 1/5, q = 4/5.

P[X = x] = (1/5)(4/5)^x, x = 0, 1, 2, ...
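
The identification can be sanity-checked by summing Σ e^tx (1/5)(4/5)^x directly and comparing with (5 − 4e^t)^(−1); a numeric sketch at one value of t (illustrative only):

```python
# Sketch: truncated series vs closed form at t = 0.1, where q*e^t < 1
# so the geometric series converges (2000 terms is far more than enough).
from math import exp

p, q, t = 0.2, 0.8, 0.1
series = sum(exp(t * x) * p * q**x for x in range(2000))
print(series)                # ≈ 1.7262
print(1 / (5 - 4 * exp(t)))  # ≈ 1.7262, matching p / (1 - q*e^t)
```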

16. Determine the distribution whose MGF is MX(t) = (1/3)e^(−t) [e^(−t) − (2/3)]^(−1).

Soln:

The MGF of a negative binomial distribution with parameters p and r is

MX(t) = p^r [1 − q e^t]^(−r)

Given MX(t) = (1/3) [e^t e^(−t) − (2/3) e^t]^(−1) = (1/3) [1 − (2/3) e^t]^(−1)

This is the M.G.F of a negative binomial distribution with parameters r = 1 and
p = 1/3.
17. Find the MGF of a uniform distribution in (a, b).

MX(t) = [1/(b − a)] ∫[a,b] e^tx dx = [1/(b − a)] [e^tx / t]|[a,b] = [e^bt − e^at] / (b − a)t, t ≠ 0

18. Find the MGF of a R.V. which is uniformly distributed over (−1, 2).

Soln:

MX(t) = (1/3) ∫[−1,2] e^tx dx = (e^2t − e^(−t)) / 3t, t ≠ 0

MX(0) = (1/3) ∫[−1,2] dx = 1, for t = 0
19. If X has uniform distribution in (−3, 3), find P[|X − 2| < 2].

The p.d.f is f(x) = 1/6, −3 < x < 3

P[|X − 2| < 2] = P[0 < X < 4] = ∫[0,3] (1/6) dx = 1/2, since f(x) = 0 for x > 3

PART B QUESTIONS

1. A random variable X has the following probability distribution

X      0   1   2    3    4    5    6     7
P(X)   0   k   2k   2k   3k   k²   2k²   7k² + k

Find (i) the value of k, (ii) P[1.5 < X < 4.5 / X > 2] and (iii) the smallest value of λ
for which P(X ≤ λ) > (1/2).
2. A bag contains 5 balls and it is not known how many of them are white. Two
balls are drawn at random from the bag and they are noted to be white. What
is the chance that all the balls in the bag are white?
3. Let the random variable X have the PDF f(x) = (1/2) e^(−x/2), x > 0. Find the
moment generating function, mean and variance.
4. A die is tossed until 6 appears. What is the probability that it must be tossed
more than 4 times?
5. A man draws 3 balls from an urn containing 5 white and 7 black balls. He
gets Rs. 10 for each white ball and Rs 5 for each black ball. Find his
expectation.
6. In a certain binary communication channel, owing to noise, the probability
that a transmitted zero is received as zero is 0.95 and the probability that
a transmitted one is received as one is 0.9. If the probability that a zero is
transmitted is 0.4, find the probability that (i) a one is received (ii) a one
was transmitted given that one was received
7. Find the MGF and the rth moment for the distribution whose PDF is f(x) = k e^(−x),
x > 0. Also find the standard deviation.
8. The first bag contains 3 white balls, 2 red balls and 4 black balls. The second
bag contains 2 white, 3 red and 5 black balls, and the third bag contains 3
white, 4 red and 2 black balls. One bag is chosen at random and from it 3
balls are drawn. Out of the 3 balls, 2 balls are white and 1 is red. What are the
probabilities that they were taken from the first bag, second bag and third bag?
9. A random variable X has the PDF f(x) = 2x, 0 < x < 1 find (i) P (X < ½) (ii) P
( ¼ < X < ½) (iii) P ( X > ¾ / X > ½ )
10. If the density function of a continuous random variable X is given by

f(x) = ax,        0 ≤ x ≤ 1
     = a,         1 ≤ x ≤ 2
     = 3a − ax,   2 ≤ x ≤ 3
     = 0,         otherwise

(1) Find a (2) Find the cdf of X

11. If the moments of a random variable X are defined by E(X^r) = 0.6,
r = 1, 2, ..., show that P(X = 0) = 0.4, P(X = 1) = 0.6, P(X ≥ 2) = 0.

12. In a continuous distribution, the probability density is given by
f(x) = kx(2 − x), 0 < x < 2. Find k, mean, variance and the distribution function.

13. The cumulative distribution function of a random variable X is given by

F(x) = 0,                     x < 0
     = x²,                    0 ≤ x ≤ 1/2
     = 1 − (3/25)(3 − x)²,    1/2 ≤ x ≤ 3
     = 1,                     x ≥ 3

Find the pdf of X and evaluate P(|X| ≤ 1) using both the pdf and the cdf.

14. Find the moment generating function of the geometric random variable
with the pdf f(x) = p q^(x−1), x = 1, 2, 3, ..., and hence find its mean and
variance.

15. A box contains 5 red and 4 white balls. A ball from the box is taken out
at random and kept outside. If once again a ball is drawn from the box,
what is the probability that the drawn ball is red?
16. A discrete random variable X has moment generating function
MX(t) = (1/4 + (3/4)e^t)^5. Find E(X), Var(X) and P(X = 2).
17. The pdf of the samples of the amplitude of a speech waveform is found
to decay exponentially at rate α, so the following pdf is proposed:
f(x) = C e^(−α|x|), −∞ < x < ∞. Find C and E(X).

18. Find the MGF of a binomial distribution and hence find the mean and
variance. Also find the recurrence relation of central moments for a binomial
distribution.

19. The number of monthly breakdowns of a computer is a RV having a
Poisson distribution with mean equal to 1.8. Find the probability that this
computer will function for a month (a) without a breakdown, (b) with only one
breakdown, (c) with at least one breakdown.
20. Find the MGF and hence find the mean and variance of a binomial distribution.

21. State and prove the additive property of Poisson random variables.

22. If X and Y are two independent Poisson random variables, then show that the
conditional distribution of X given X + Y follows a binomial distribution.
23. Find MGF and hence find mean and variance of a geometric distribution.

24. State and prove the memoryless property of the geometric distribution.

25. Find the mean and variance of a uniform distribution.


UNIT- II TWO DIMENSIONAL RANDOM VARIABLES

PART-A

1. State the basic properties of the joint distribution of (X,Y), where X and Y
are random variables.

F(x,y) = P[X ≤ x, Y ≤ y]; F(∞, ∞) = 1; F(∞, y) = marginal distribution of
Y and F(x, ∞) = marginal distribution of X.

2. The joint p.d.f of (X,Y) is f(x,y) = 6e^(−2x−3y), x ≥ 0, y ≥ 0. Find the
conditional density of Y given X.

Soln.

Marginal density of X is f(x) = ∫[0,∞] 6e^(−2x) e^(−3y) dy = 2e^(−2x), x > 0

Conditional density of Y given X:
f(y/x) = f(x,y) / f(x) = (6e^(−2x) e^(−3y)) / 2e^(−2x) = 3e^(−3y), y ≥ 0
3. State the basic properties of the joint distribution of (X,Y) when X and Y are
random variables.

Soln.
F(x,y) = P(X ≤ x, Y ≤ y)
F(∞, ∞) = 1
F(∞, y) = marginal distribution of Y
F(x, ∞) = marginal distribution of X
F(−∞, y) = 0, F(x, −∞) = 0
4. If X has an exponential distribution with parameter λ, find the pdf of Y = log X.

fY(y) = fX(x) |dx/dy| = λ e^(−λe^y) e^y, −∞ < y < ∞

5.The joint probability mass function of ( X,Y) is given by P(x,y) = k ( 2x + 3y)


x = 0,1,2 y = 1,2,3. Find the marginal probability distribution of X
X\Y 1 2 3
0 3k 6k 9k
1 5k 8k 11k
2 7k 10k 13k

∑ ∑ P(x,y) = 1 ⇒ 72 k = 1 ⇒ k = 1/ 72
Marginal distribution of X :

X 0 1 2
P(X = x) 18/72 24/72 30/72
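
The table and the marginals can be regenerated in a couple of lines (a numpy sketch mirroring the computation above):

```python
# Sketch: k and the marginal distribution of X for P(x, y) = k(2x + 3y),
# x = 0, 1, 2 and y = 1, 2, 3.
import numpy as np

joint = np.array([[2 * x + 3 * y for y in (1, 2, 3)] for x in (0, 1, 2)])
k = 1 / joint.sum()              # 1/72
print(k * joint.sum(axis=1))     # [0.25, 0.3333..., 0.4167...] = 18/72, 24/72, 30/72
```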
6. If f(x,y) = k(1 − x − y), 0 < x, y < 1/2, find k.

K ∫[0,1/2] ∫[0,1/2] (1 − x − y) dx dy = 1 ⇒ K ∫[0,1/2] [(1 − y)/2 − 1/8] dy = 1
⇒ K (3y/8 − y²/4)|[0,1/2] = 1 ⇒ K/8 = 1 ⇒ K = 8
7. If X and Y are independent random variables with variances 2 and 3, find
Var(3X + 4Y).

Var(3X + 4Y) = 3² Var(X) + 4² Var(Y) = 9 × 2 + 16 × 3 = 66

8. If X and Y are independent random variables, find the covariance between
X + Y and X − Y.

Cov(X + Y, X − Y) = E[((X + Y) − E(X + Y))((X − Y) − E(X − Y))]
                 = E[((X − E(X)) + (Y − E(Y)))((X − E(X)) − (Y − E(Y)))]
                 = E[(X − E(X))² − (Y − E(Y))²]
                 = Var X − Var Y
9. If X and Y have joint probability density function f(x,y) = x + y, 0 < x, y < 1,
check whether X and Y are independent.

Marginal density of X: fX(x) = ∫[0,1] f(x,y) dy = ∫[0,1] (x + y) dy = x + 1/2, 0 < x < 1
Marginal density of Y: fY(y) = ∫[0,1] f(x,y) dx = ∫[0,1] (x + y) dx = y + 1/2, 0 < y < 1

fX(x) · fY(y) = (x + 1/2)(y + 1/2) ≠ f(x,y), so X and Y are not independent.
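
The same independence check can be done symbolically (a sympy sketch, verification only):

```python
# Sketch: marginals of f(x, y) = x + y on the unit square, and the product test.
import sympy as sp

x, y = sp.symbols('x y')
f = x + y
fx = sp.integrate(f, (y, 0, 1))   # x + 1/2
fy = sp.integrate(f, (x, 0, 1))   # y + 1/2
print(sp.expand(fx * fy - f))     # nonzero => X and Y are not independent
```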

10. The joint pdf of two random variables X and Y is given by
f(x,y) = (1/8) x (x − y), 0 < x < 2, −x < y < x. Find f(y/x).

fX(x) = ∫[−x,x] (1/8) x (x − y) dy = x³/4, 0 < x < 2

f(y/x) = f(x,y) / fX(x) = (1/8) x (x − y) / (x³/4) = (x − y) / 2x², −x < y < x
11. The joint pdf of (X,Y) is f(x,y) = 6e^(−2x−3y), x ≥ 0, y ≥ 0. Find the
conditional density of Y given X.

Soln.
Marginal of X is f(x) = ∫[0,∞] 6e^(−2x−3y) dy = 2e^(−2x), x > 0

Conditional of Y given X:
f(y/x) = f(x,y) / f(x) = 6e^(−2x) e^(−3y) / 2e^(−2x) = 3e^(−3y), y ≥ 0

12. State the basic properties of the joint distribution of (X,Y) when X and Y are
random variables.

Soln.
F(x,y) = P(X ≤ x, Y ≤ y)
F(∞, ∞) = 1
F(∞, y) = marginal distribution of Y
F(x, ∞) = marginal distribution of X
F(−∞, y) = 0, F(x, −∞) = 0

13. If two random variables have the joint density f(x1, x2) = x1 x2, 0 < x1 < 1,
0 < x2 < 2, find the probability that both random variables will take on values
less than 1.

Soln.
P[X1 ≤ 1, X2 ≤ 1] = ∫[0,1] ∫[0,1] x1 x2 dx2 dx1 = ∫[0,1] x1 (1/2) dx1 = 1/4
14. If X and Y are independent random variables, find the covariance between
X + Y and X − Y.

Soln.
Cov(X + Y, X − Y) = E[{(X + Y) − E(X + Y)}{(X − Y) − E(X − Y)}]
                 = E[{(X − E(X)) + (Y − E(Y))}{(X − E(X)) − (Y − E(Y))}]
                 = E[{X − E(X)}² − {Y − E(Y)}²]
                 = Var X − Var Y
15. If X and Y are independent random variables with variances 2 and 3, find
Var(3X + 4Y).

Soln.
Var(3X + 4Y) = 3² Var(X) + 4² Var(Y) = 9 × 2 + 16 × 3 = 66
16. If the joint pdf of (X,Y) is given by f(x,y) = e^(−(x+y)), x ≥ 0, y ≥ 0, find E(XY).

E(XY) = ∫[0,∞] ∫[0,∞] xy e^(−(x+y)) dx dy = ∫[0,∞] x e^(−x) dx · ∫[0,∞] y e^(−y) dy = 1

17. The regression lines between two random variables X and Y are given by
3x + 4y = 10 and 3x + y = 12. Find the correlation coefficient between X and Y.

3x + 4y = 10 ⇒ byx = −3/4 ;  3x + y = 12 ⇒ bxy = −1/3

r² = byx · bxy = (−3/4)(−1/3) = 1/4 ⇒ r = −1/2, taking the sign common to both
regression coefficients.
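
In code, r follows directly from the two slopes; the sign convention (take the sign shared by both regression coefficients) is applied explicitly in this sketch:

```python
# Sketch: r = ±sqrt(b_yx * b_xy), with the sign common to both coefficients.
from math import sqrt, copysign

b_yx, b_xy = -3 / 4, -1 / 3
r = copysign(sqrt(b_yx * b_xy), b_yx)
print(r)   # -0.5
```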

18. Distinguish between correlation and regression.

By correlation we mean the mutual relationship between two or more variables.
By regression we mean the average relationship between two or more variables,
used to estimate one variable from the other.

19. Why are there two regression lines?

Regression lines express the linear relationship between two variables X and Y.
Since either of them can be taken as the independent variable, we have two
regression lines.

20. State the central limit theorem.

If X1, X2, ..., Xn is a sequence of independent identically distributed R.V.s with
E(Xi) = μ and Var(Xi) = σ² for i = 1, 2, ..., and if Sn = X1 + X2 + ... + Xn, then under
certain general conditions Sn follows a normal distribution with mean nμ and
variance nσ² as n tends to ∞.
21. If Y1 and Y2 are two independent random variables, then Cov(Y1, Y2) = 0.
Is the converse of the above statement true? Justify your answer.

The converse is not true. Consider X ~ N(0,1) and Y = X². Since X ~ N(0,1), E(X) = 0
and E(X³) = E(XY) = 0, since all odd moments vanish. Cov(X, Y) = E(XY) − E(X)E(Y) = 0,
but X and Y are not independent.
22. Find the value of k, if f(x,y) = k(1 − x)(1 − y) for 0 < x, y < 1 is to be a joint
density function.

We know ∫∫ f(x,y) dx dy = 1 ⇒ ∫[0,1] ∫[0,1] k(1 − x)(1 − y) dx dy = 1

k ∫[0,1] [x − x²/2 − xy + x²y/2]|[0,1] dy = 1 ⇒ k ∫[0,1] (1/2 − y/2) dy = 1

k [y/2 − y²/4]|[0,1] = 1 ⇒ k(1/4) = 1 ⇒ k = 4

PART-B
1. If f(x, y) = x + y, 0 < x < 1, 0 < y < 1
            = 0,      otherwise
compute the correlation coefficient between X and Y.

2. The joint p.d.f of a two dimensional random variable (X, Y) is given by f(x, y) =
(8/9) xy, 1 ≤ x ≤ y ≤ 2. Find the marginal density functions of X and Y. Find also
the conditional density functions of Y / X = x and X / Y = y.

3. The joint probability density function of X and Y is given by f(x, y) = (x + y)/3,
0 ≤ x ≤ 1, 0 < y < 2. Obtain the regression of Y on X and of X on Y.
4. If the joint p.d.f. of two random variables is given by f(x1, x2) = 6e^(−2x1 − 3x2),
x1 > 0, x2 > 0, find the probability that the first random variable will take on a
value between 1 and 2 and the second random variable a value between 2 and 3.
Also find the probability that the first random variable will take on a value less
than 2 and the second random variable a value greater than 2.

5. Two random variables have the joint p.d.f. f(x1, x2) = (2/3)(x1 + 2x2), 0 < x1 < 1,
0 < x2 < 1.
6. Find the value of k, if f(x,y) = kxy for 0 < x, y < 1 is to be a joint density
function. Find P(X + Y < 1). Are X and Y independent?
7. If two random variables have the joint p.d.f. f(x, y) = (6/5)(x + y²), 0 < x < 1,
0 < y < 1, find P(0.2 < X < 0.5) and P(0.4 < Y < 0.6).

8. Two random variables X and Y have the joint p.d.f f(x, y) = x² + (xy/3), 0 ≤ x ≤ 1,
0 ≤ y ≤ 2. Prove that X and Y are not independent. Find the conditional density
function.

9. X and Y are two random variables with joint p.d.f. f(x, y) = 4xy e^(−(x² + y²)),
x, y ≥ 0. Find the p.d.f. of X² + Y².
10. Two random variables X and Y have joint p.d.f. f(x, y) = 2 − x − y, 0 < x < 1,
0 < y < 1. Find the marginal probability density functions of X and Y. Also find the
conditional density functions and the covariance between X and Y.

11. Let X and Y be two random variables each taking three values −1, 0
and 1 and having the joint p.d.f.

 Y \ X    −1     0      1
 −1        0     0.1    0.1
  0       0.2    0.2    0.2
  1        0     0.1    0.1

Prove that X and Y have different expectations. Also prove that X and Y are
uncorrelated, and find Var X and Var Y.

12. 20 dice are thrown. Find the approximate probability that the sum obtained is
between 65 and 75, using the central limit theorem.

13. Examine whether the variables X and Y are independent, whose joint density
is f(x, y) = x e^(−xy − x), 0 < x, y < ∞.
14. Let X and Y be independent standard normal random variables. Find the pdf
of Z = X / Y.

15. Let X and Y be independent uniform random variables over (0,1) . Find the
PDF of Z = X + Y
UNIT-III CLASSIFICATION OF RANDOM PROCESSES

1. Define a random process and give an example of a random process.

A random process is a collection of random variables {X(s, t)} that are functions
of time, where s ∈ S and t ∈ T.

Example:
X(t) = A cos(ωt + θ)

where θ is uniformly distributed in (0, 2π) and A, ω are constants.
2. State the four types of stochastic processes.

Discrete time, discrete state
Discrete time, continuous state
Continuous time, discrete state
Continuous time, continuous state

3. If {X(s, t)} is a random process, what is the nature of X(s, t) when (i) s is
fixed (ii) t is fixed?

(i) When s is fixed, X(s, t) is a time function.
(ii) When t is fixed, X(s, t) is a random variable.
4. What is a discrete random sequence? Give an example.

If both the state space S and the parameter set T are discrete, then the random
process is called a discrete random sequence. If Xn represents the outcome of
the nth toss of a fair die, then {Xn, n ≥ 1} is a discrete random sequence.
5. Define a discrete random process. Give an example.

If T is continuous and S is discrete, the random process is called a
discrete random process.

Example:
If X(t) represents the number of telephone calls received in the interval
(0, t), then {X(t)} is a discrete random process.
6. What is a continuous random process? Give an example.

If both S and T are continuous, the random process is called a
continuous random process.

Example: If X(t) represents the maximum temperature at a place in the
interval (0, t), then {X(t)} is a continuous random process.
7. What do you mean by the mean and variance of a random process?

If X(t) is a member function of the random process {X(t)}, then E[X(t)] and
Var[X(t)] are called the mean and variance of the process.

8. When are two random processes said to be orthogonal?

Two random processes {X(t)} and {Y(t)} are said to be orthogonal if
E[X(t1) Y(t2)] = 0.
9. Define the autocorrelation and autocovariance of a random process {X(t)}.

Autocorrelation function:

RXX(t1, t2) = E[X(t1) X(t2)]

Autocovariance:

CXX(t1, t2) = RXX(t1, t2) − E[X(t1)] E[X(t2)]

10. Define the cross-correlation of two random processes.

RXY(t1, t2) = E[X(t1) Y(t2)]

11. Define the cross-covariance of two random processes.

CXY(t1, t2) = RXY(t1, t2) − E[X(t1)] E[Y(t2)]

12. Define a stationary process.

If certain probability distributions or averages do not depend on time t,
then the random process {X(t)} is called a stationary process.
13. Define a strict sense stationary process.

A random process {X(t)} is an SSS process if the joint distribution of
X(t1), X(t2), ..., X(tn) is the same as that of
X(t1 + h), X(t2 + h), ..., X(tn + h) for all t1, t2, ..., tn and h > 0 and for all n ≥ 1.

Example: Bernoulli process.

14. Define a second order stationary process.

A random process {X(t)} is said to be second order SSS if
f(x1, x2; t1, t2) = f(x1, x2; t1 + h, t2 + h), where f(x1, x2; t1, t2) is the
joint p.d.f of {X(t1), X(t2)}.

15. Define a wide sense stationary process.

A random process {X(t)} is called wide sense stationary if E[X(t)]
is a constant and E[X(t) X(t + Z)] = RXX(Z),
(i.e.) the A.C.F. is a function of the lag Z only.
16. Define an evolutionary process and give an example.

A random process {X(t)} that is not stationary in any sense is called
an evolutionary process.

Example: Poisson process.

17. When are the processes {X(t)} and {Y(t)} said to be jointly stationary
in the wide sense?

Two random processes {X(t)} and {Y(t)} are said to be jointly stationary in
the wide sense if each process is individually a WSS process and RXY(t1, t2) is a
function of the time difference t1 − t2 only.

18. State any four properties of the autocorrelation function.

(1) RXX(Z) = RXX(−Z)

(2) |R(Z)| ≤ R(0)

(3) R(Z) is continuous for all Z

(4) If R(Z) is the A.C.F. of a stationary R.P. {X(t)} with no periodic components,
then

μX² = lim(Z→∞) R(Z)

19. Given that the autocorrelation function for a stationary ergodic process
with no periodic components is R(Z) = 25 + 4/(1 + 6Z²), find the mean and
variance of the process {X(t)}.

Soln:
μX² = lim(Z→∞) R(Z) = 25 ⇒ μX = 5

E[X²(t)] = RXX(0) = 25 + 4 = 29

Var[X(t)] = E[X²(t)] − E²[X(t)] = 29 − 25 = 4
20. Find the mean and variance of the stationary process {X(t)} whose
A.C.F. is R(Z) = (25Z² + 36) / (6.25Z² + 4).

Soln:

μX² = lim(Z→∞) R(Z) = lim(Z→∞) (25 + 36/Z²) / (6.25 + 4/Z²) = 4

μX = 2

E[X²(t)] = RXX(0) = 36/4 = 9

Var[X(t)] = 9 − 4 = 5
21. A stationary random process {X(t)} with mean 4 has ACF R(Z) = 16 + 9e^(−|Z|).
Find the standard deviation of the process.

E[X²(t)] = RXX(0) = 16 + 9 = 25
Var[X(t)] = 25 − 16 = 9
Standard deviation = 3

22. Define the cross-correlation function and state any two of its properties.

Soln:
RXY(t1, t2) = E[X(t1) Y(t2)]
(1) RXY(Z) = RYX(−Z)
(2) |RXY(Z)| ≤ √(RXX(0) RYY(0)) ≤ (1/2)[RXX(0) + RYY(0)]
23. When is a random process said to be ergodic? Give an example.

A random process {X(t)} is ergodic if its ensemble averages are equal to
appropriate time averages.

Example: X(t) = A cos(ωt + θ), with θ uniformly distributed in (0, 2π), is mean
ergodic.
24. Distinguish between stationarity and ergodicity.

Stationarity of a random process is the property of the process by which
certain probability distributions or averages do not depend on time t.
Ergodicity of a random process is the property by which almost every member
of the process exhibits the same statistical behaviour as the whole process.

25. Check for stationarity of the random process X(t) = A cos(ω0 t + θ), if A
and ω0 are constants and θ is a uniformly distributed RV in (0, 2π).

E[X(t)] = 0 and RXX(t, t + Z) = (A²/2) cos ω0 Z

∴ {X(t)} is wide sense stationary.

26. Given the R.P. {X(t)} = A cos ω0 t + B sin ω0 t, where ω0 is constant and
A and B are uncorrelated zero mean random variables having different
densities but the same variance. Is {X(t)} wide sense stationary?

E[X(t)] = 0
E[X²(t)] = σ²
and RXX(t1, t2) = σ² cos ω0 (t1 − t2)

∴ X(t) is wide sense stationary.

27. Define a Markov process.

If the future behaviour of a process depends only on the present state but
not on the past, the process is a Markov process.
28. Define a Markov chain.

A discrete parameter Markov process is called a Markov chain.
(or) If, for all n,
P[Xn = an / Xn−1 = an−1, Xn−2 = an−2, ..., X0 = a0] = P[Xn = an / Xn−1 = an−1],
then the process {Xn}, n = 0, 1, 2, ... is called a Markov chain, and
(a1, a2, ..., an, ...) are called the states of the Markov chain.

29. Define a Markov process (formal definition).

If for t1 < t2 < ... < tn < t,

P[X(t) ≤ x / X(t1) = x1, X(t2) = x2, ..., X(tn) = xn] = P[X(t) ≤ x / X(tn) = xn],

then the process {X(t)} is called a Markov process.

30. What is a stochastic matrix? When is it said to be regular?

If Pij ≥ 0 and Σj Pij = 1 for all i, then the t.p.m. P = (Pij) of a
Markov chain is a stochastic matrix. A stochastic matrix P is said to be a regular
matrix if all the entries of P^m (for some positive integer m) are positive.

31. Define an irreducible Markov chain. Also state the Chapman–Kolmogorov
theorem.

If Pij(n) > 0 for some n and for all i and j, then every state can be
reached from every other state. When this condition is satisfied, the Markov chain
is said to be irreducible.

Theorem:
If P is the t.p.m. of a homogeneous Markov chain, then the n-step t.p.m. P(n) is
equal to P^n.

32. What is a Markov chain? When can you say that a Markov chain is
homogeneous?

A discrete state stochastic process {X(t) / t ∈ T} is a Markov chain if

P[X(t) ≤ x / X(tn) = xn, X(tn−1) = xn−1, ..., X(t0) = x0] = P[X(t) ≤ x / X(tn) = xn]

It is homogeneous if it has the property of invariance with respect to the time origin,
(i.e.) P[X(t) ≤ x / X(tn) = xn] = P[X(t − tn) ≤ x / X(0) = xn].

33. Consider the random process {X(t)} = cos(ω0 t + θ), where θ is
uniformly distributed in the interval (−π, π). Check whether X(t) is
stationary or not.

E[X(t)] = (1/2π) ∫[−π,π] cos(ω0 t + θ) dθ
        = (1/2π) [sin(ω0 t + π) − sin(ω0 t − π)]
        = (1/2π) [−sin ω0 t + sin ω0 t]
        = 0

E[X²(t)] = (1/2π) ∫[−π,π] cos²(ω0 t + θ) dθ = 1/2

The process is stationary, as the first and second moments are independent of
time.
34. The one-step t.p.m. of a Markov chain with states 0 and 1 is given as

P = | 0  1 |
    | 1  0 |

Draw the transition diagram. Is it an irreducible Markov chain?

Yes, it is irreducible, because each state can be reached from any other
state.
35. Three boys A, B, C are throwing a ball to each other. A always throws the
ball to B and B always throws the ball to C, but C is just as likely to throw
the ball to B as to A. Find the transition matrix and show that the process is
Markovian.

The TPM of {Xn}, with the states in the order A, B, C, is

P = |  0    1    0  |
    |  0    0    1  |
    | 1/2  1/2   0  |

Since Xn depends only on Xn−1 but not on the states of Xn−2, Xn−3, ..., {Xn} is
a Markov process.
36. What do you mean by an absorbing Markov chain? Give an example.

A state i of a Markov chain is said to be an absorbing state if Pii = 1, (i.e.)
if it is impossible to leave it. A Markov chain is said to be absorbing if it has at
least one absorbing state.

For example, the TPM of an absorbing Markov chain is

         1    2    3    4
  1  |   1    0    0    0  |
  2  |  1/2   0   1/2   0  |
  3  |   0   1/2   0   1/2 |
  4  |   0    0    0    1  |

 0 1 0
 
37. Prove that the matrix P   0 0 1  is the t.p.m. of an irreducible
 
1/ 2 1/ 2 0 
Markov chain.
 0 0 1
 
P 2  1/ 2 1/ 2 0 
 
 0 1/ 2 1/ 2 
 1/ 2 1/ 2 0 
 
P  0
3
1/ 2 1/ 2 
 
1/ 4 1/ 4 1/ 2 

P11(3)  0, P13(2)  0, P21(2)  0, P22(2)  0, P33(2)  0 and for all other Pij (1)   ,
the chain is irreducible.
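
The matrix powers above, and the irreducibility claim, can be verified numerically (a numpy sketch, verification only):

```python
# Sketch: compute P^2, P^3 and check that P + P^2 + P^3 has all entries
# positive, so every state is reachable from every other state.
import numpy as np

P = np.array([[0, 1, 0],
              [0, 0, 1],
              [0.5, 0.5, 0]])
P2, P3 = P @ P, P @ P @ P
print(P2)
print(P3)
print(np.all(P + P2 + P3 > 0))   # True
```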
38. State the postulates of a Poisson process.

Let X(t) = the number of times an event A, say, has occurred up to time t, so that
the sequence {X(t)}, t ≥ 0 forms a Poisson process with parameter λ.

(i) Events occurring in non-overlapping intervals are independent of each other.
(ii) P[X(t) = 1 for t in (x, x + h)] = λh + o(h)
(iii) P[X(t) = 0 for t in (x, x + h)] = 1 − λh + o(h)
(iv) P[X(t) = 2 or more for t in (x, x + h)] = o(h)
39. State any two properties of Poisson Process.
(i) The Poisson process is a Markov Process.
(ii) Sum of two independent Poisson processes is a Poisson process.
(iii) The difference of two independent Poisson processes is not a
Poisson process.

40. What will be the superposition of n independent Poisson processes with
respective average rates λ1, λ2, ..., λn?

The superposition of n independent Poisson processes with average
rates λ1, λ2, ..., λn is another Poisson process with average rate
λ1 + λ2 + ... + λn.
41. If customers arrive at a counter in accordance with a Poisson
process with a mean rate of 2 per minute, find the probability that the
interval between two consecutive arrivals is more than one minute.

The interval T between 2 consecutive arrivals follows an exponential
distribution with parameter λ = 2.

P[T > 1] = ∫[1,∞] 2e^(−2t) dt = e^(−2) ≈ 0.135

42. Let X(t) be a Poisson process with rate λ. Find the correlation function of X(t).

For Z > 0,
E[X(t) X(t + Z)] = E{X(t)[X(t + Z) − X(t) + X(t)]}
                = E[X(t)] E[X(t + Z) − X(t)] + E[X²(t)]
                = λt (λZ) + λ²t² + λt
                = λ²tZ + λ²t² + λt
44. Show that the Poisson process is not covariance stationary.

P[X(t) = r] = e^(−λt) (λt)^r / r!, r = 0, 1, 2, ...

E[X(t)] = λt, which is not a constant. Hence the process is not covariance
stationary.

45. A bank receives on the average λ = 6 bad checks per day. What are the
probabilities that it will receive (i) 4 bad checks on any given day, (ii) 10
bad checks over any two consecutive days?

P[X(t) = n] = e^(−λt) (λt)^n / n! = e^(−6t) (6t)^n / n!, n = 0, 1, 2, ...

(i) P[X(1) = 4] = e^(−6) 6⁴ / 4! = 0.1338

(ii) P[X(2) = 10] = e^(−12) (12)¹⁰ / 10! = 0.1048
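
Both values follow from the Poisson pmf; a standard-library sketch for verification:

```python
# Sketch: Poisson pmf e^{-lam} lam^n / n! for the two cases above.
from math import exp, factorial

poisson = lambda n, lam: exp(-lam) * lam**n / factorial(n)
print(poisson(4, 6))     # ≈ 0.1339 (one day, λt = 6)
print(poisson(10, 12))   # ≈ 0.1048 (two days, λt = 12)
```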
46. Customers arrive at a large store randomly at an average rate of 240 per
hour. What is the probability that during a two-minute interval no one will
arrive?

P[X(t) = n] = e^(−4t) (4t)^n / n!, n = 0, 1, 2, ..., since λ = 240/60 = 4 per minute

P[X(2) = 0] = e^(−8) ≈ 0.0003
47. The number of arrivals at the regional computer centre's express service
counter between 12 noon and 3 pm has a Poisson distribution with a
mean of 1.2 per minute. Find the probability of no arrivals during a given
1-minute interval.

P[X(t) = n] = e^(−1.2t) (1.2t)^n / n!, n = 0, 1, 2, ...

P[X(1) = 0] = e^(−1.2) ≈ 0.3012
48. Define a Gaussian or normal process. (or) When is a random process
said to be normal?

A real valued R.P. {X(t)} is called a Gaussian process or normal
process if the random variables X(t1), X(t2), ..., X(tn) are jointly normal for
any n and for any set of ti's.

49. State the properties of a Gaussian process.

(i) If a Gaussian process is wide sense stationary, it is also strict sense
stationary.
(ii) If the member random variables of a Gaussian process are uncorrelated,
then they are independent.
(iii) If the input {X(t)} of a linear system is a Gaussian process, the
output will also be a Gaussian process.
50. Name a few random processes that are defined in terms of stationary
Gaussian random processes.

(i) Square law detector process
(ii) Full-wave linear detector process
(iii) Half-wave linear detector process
(iv) Hard limiter process

51. If {X(t)} is a Gaussian process with μ(t) = 10 and
C(t1, t2) = 16e^(−|t1 − t2|), find P{X(10) ≤ 8}.

E[X(10)] = 10 and Var[X(10)] = C(10, 10) = 16

P[X(10) ≤ 8] = P[(X(10) − 10)/4 ≤ −0.5]
             = P[z ≤ −0.5]
             = 0.5 − P[0 < z < 0.5]
             = 0.5 − 0.1915
             = 0.3085

52. Given a normal process {X(t)} with zero mean and RXX(Z) = 4e^(−2|Z|),
find Var[X(t)].

Var[X(t)] = RXX(0) − E²[X(t)] = 4e⁰ − 0 = 4
53. If X(t) is a normal process with C(t1, t2) = 4e^(−0.5|t1 − t2|), what is the
variance of X(5)?

Var[X(5)] = C(5, 5) = 4e^(−0.5(0)) = 4, since Z = t1 − t2 = 0

54. If {X(t)} is a normal process with μ(t) = 3 and C(t1, t2) = 4e^(−0.2|t1 − t2|),
find the variance of X(8) − X(5).

Var[X(8) − X(5)] = Var[X(8)] + Var[X(5)] − 2 Cov[X(8), X(5)]
                = 4 + 4 − 2 × 4 × e^(−0.6)
                = 8[1 − e^(−0.6)]
                = 3.6095
PART B

1. The process {X(t)} has the probability distribution

P[X(t) = n] = (at)^(n−1) / (1 + at)^(n+1), n = 1, 2, ...
            = at / (1 + at),               n = 0

Show that it is not stationary.

2. A raining process is considered as a two state Markov chain. If it rains, it is
considered to be in state 0, and if it does not rain, the chain is in state 1. The
transition probability matrix of the Markov chain is defined as

P = | 0.6  0.4 |
    | 0.2  0.8 |

Find the probability that it will rain for three days from today, assuming that it is
raining today. Find also the unconditional probability that it will rain after three
days, with the initial probabilities of state 0 and state 1 as 0.4 and 0.6
respectively.
3. Let X(t) be a Poisson process with arrival rate λ. Find E{(X(t) − X(s))²} for
t > s.
4. Let {Xn ; n = 1, 2, ...} be a Markov chain on the state space S = {1, 2, 3} with
one-step transition probability matrix

P = |  0    1    0  |
    | 1/2   0   1/2 |
    |  1    0    0  |

(1) Sketch the transition diagram. (2) Is the chain irreducible? Explain. (3) Is the
chain ergodic? Explain.
5. Consider a random process X(t) defined by X(t) = U cos t + (V + 1) sin t, where
U and V are independent random variables for which E(U) = E(V) = 0 and
E(U²) = E(V²) = 1. (1) Find the auto covariance function of X(t). (2) Is X(t) wide
sense stationary? Explain your answer.
6. Discuss the pure birth process and hence obtain its probabilities, mean and
variance.
7. At the receiver of an AM radio, the received signal contains a cosine carrier
signal at the carrier frequency ω with a random phase θ that is uniformly
distributed over (0, 2π). The received carrier signal is X(t) = A cos(ωt + θ).
Show that the process is second order stationary.
8. Assume that a computer system is in any one of the three states: busy, idle
and under repair, respectively denoted by 0, 1, 2. Observing its state at 2 pm
each day, we get the transition probability matrix

P = | 0.6  0.2  0.2 |
    | 0.1  0.8  0.1 |
    | 0.6   0   0.4 |

Find out the 3rd step transition probability matrix. Determine the limiting
probabilities.
9. Given a random variable ω with density f(ω) and another random variable θ
uniformly distributed in (−π, π) and independent of ω, and X(t) = a cos(ωt + θ),
prove that {X(t)} is a WSS process.
10. A man either drives a car or catches a train to go to office each day. He never
goes 2 days in a row by train but if he drives one day, then the next day he is
just as likely to drive again as he is to travel by train. Now suppose that on the
first day of the week, the man tossed a fair die and drove to work iff a 6
appeared. Find (1) the probability that he takes a train on the 3rd day. (2) the
probability that he drives to work in the long run.
11. Show that the process X(t) = A cos t + B sin t (where A and B are random
variables) is wide sense stationary, if (1) E(A) = E(B) = 0 (2) E(A²) = E(B²)
and E(AB) = 0.
12. Find the probability distribution of the Poisson process.
13. Prove that the sum of two Poisson processes is again a Poisson process.
14. Write the classifications of random processes with examples.
