RVSP Notes
RANDOM VARIABLES
Introduction
Consider an experiment of throwing a coin twice. The outcomes {HH, HT, TH, TT}
constitute the sample space. Each of these outcomes can be associated with a number by
specifying a rule of association (e.g. the number of heads). Such a rule of association is
called a random variable. We denote a random variable by a capital letter (X, Y, etc.) and
any particular value of the random variable by x or y.
Thus a random variable X can be considered as a function that maps all elements of
the sample space S into points on the real line. The notation X(s) = x means that x is the
value associated with the outcome s by the random variable X.
1.1 SAMPLE SPACE
Consider an experiment of throwing a coin twice. The outcomes S = {HH, HT, TH, TT}
constitute the sample space.
1.2 RANDOM VARIABLE
In this sample space each outcome can be associated with a number by
specifying a rule of association. Such a rule of association is called a random variable.
Eg : Number of heads
We denote a random variable by a capital letter (X, Y, etc.) and any particular value of the
random variable by x or y.
S = {HH, HT, TH, TT}
X(S) = {2, 1, 1, 0}
Thus a random variable X can be considered as a function that maps all elements of the sample
space S into points on the real line. The notation X(s) = x means that x is the value associated
with the outcome s by the R.V. X.
Example
In the experiment of throwing a coin twice the sample space is S = {HH, HT, TH, TT}.
Let X be the random variable defined by X(s) = x, the number of heads in the outcome s.
Note
Any random variable whose only possible values are 0 and 1 is called a Bernoulli random
variable.
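As a small illustration (a Python sketch added here, not part of the original notes), the rule of association X(s) = "number of heads" can be written out explicitly for the two-coin experiment, and the induced probability law of X read off by counting:

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Sample space of two coin tosses and the random variable
# X(s) = number of heads in the outcome s.
S = ["".join(s) for s in product("HT", repeat=2)]   # ['HH', 'HT', 'TH', 'TT']
X = {s: s.count("H") for s in S}                    # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}

# Each outcome is equally likely, so the induced PMF of X is:
counts = Counter(X.values())
pmf = {x: Fraction(c, len(S)) for x, c in counts.items()}
print(pmf)   # {2: 1/4, 1: 1/2, 0: 1/4}
```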
1.2.1 DISCRETE RANDOM VARIABLE
Definition : A discrete random variable is a R.V. X whose possible values constitute a finite set
of values or a countably infinite set of values.
Example
In the coin-tossing experiment above, P(X ≤ 1) = 3/4 means that the probability that the
R.V. X (the number of heads) is less than or equal to 1 is 3/4.
Distribution function of the random variable X (cumulative distribution function of the random
variable X)
Def :
The distribution function of a random variable X defined on (-∞, ∞) is given by
F(x) = P(X ≤ x) = P{s : X(s) ≤ x}
Note
Let the random variable X take the values x1, x2, …, xn with probabilities p1, p2, …, pn,
where x1 < x2 < … < xn. Then we have
F(x) = P(X ≤ x) = 0,                                   for -∞ < x < x1
F(x) = P(X ≤ x) = P(X = x1) = p1,                      for x1 ≤ x < x2
F(x) = P(X ≤ x) = P(X = x1) + P(X = x2) = p1 + p2,     for x2 ≤ x < x3
…
F(x) = P(X ≤ x) = P(X = x1) + … + P(X = xn)
                = p1 + p2 + … + pn = 1,                for x ≥ xn
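The staircase construction above is mechanical; the following minimal Python sketch (an illustration, not from the notes) builds F(x) from a PMF, using the two-coin example:

```python
from fractions import Fraction

# Cumulative distribution function built from a PMF, following
# F(x) = sum of p(xi) over xi <= x.  Values here are the two-coin
# example: X = number of heads.
xs = [0, 1, 2]
ps = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]

def F(x):
    """Distribution function F(x) = P(X <= x)."""
    return sum(p for xi, p in zip(xs, ps) if xi <= x)

print(F(-1), F(0), F(1), F(2))   # 0 1/4 3/4 1
```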
1.2.2 PROPERTIES OF DISTRIBUTION FUNCTIONS
Property : 1
P(a < X ≤ b) = F(b) - F(a), where F(x) = P(X ≤ x)
Property : 2
P(a ≤ X ≤ b) = P(X = a) + F(b) - F(a)
Property : 3
P(a < X < b) = P(a < X ≤ b) - P(X = b)
             = F(b) - F(a) - P(X = b)   [by Property 1]
1.2.3 PROBABILITY MASS FUNCTION (OR) PROBABILITY FUNCTION
Let X be a one-dimensional discrete R.V. which takes the values x1, x2, … To each possible
outcome xi we can associate a number pi, i.e., P(X = xi) = p(xi) = pi, called the probability
of xi. The numbers pi = p(xi) satisfy the following conditions:
(i) p(xi) ≥ 0 for all i
(ii) Σi p(xi) = 1
The function p(x) satisfying the above two conditions is called the probability mass
function (or probability distribution) of the R.V. X. The probability distribution {xi, pi} can be
displayed in the form of a table as shown below.

X = xi           x1   x2   …   xi   …
P(X = xi) = pi   p1   p2   …   pi   …
Notation
Let S be a sample space. The set of all outcomes s in S such that X(s) = x is denoted by
writing X = x.
P(X = x) = P{s : X(s) = x}
Similarly P(X ≤ a) = P{s : X(s) ∈ (-∞, a]}
and P(a < X ≤ b) = P{s : X(s) ∈ (a, b]}
P(X = a or X = b) = P{(X = a) ∪ (X = b)}
P(X = a and X = b) = P{(X = a) ∩ (X = b)}
and so on.
Theorem :1 If X1 and X2 are random variables and K is a constant, then KX1, X1 + X2, X1X2,
K1X1 + K2X2, X1 - X2 are also random variables.
Theorem :2
If X is a random variable and f(·) is a continuous function, then f(X) is a random
variable.
Note
If F(x) is the distribution function of a one-dimensional random variable, then
I. 0 ≤ F(x) ≤ 1
II. If x < y, then F(x) ≤ F(y)
III. F(-∞) = lim_{x→-∞} F(x) = 0 and F(∞) = lim_{x→∞} F(x) = 1
Table 1
A random variable X has the following probability function:

Values of X:  0    1    2    3    4    5     6     7     8
p(x):         a    3a   5a   7a   9a   11a   13a   15a   17a

Since Σ p(xi) = 1, we have a + 3a + 5a + … + 17a = 81a = 1, so a = 1/81.
(iv) To find the distribution function of X using table 2, we get

X = x   F(x) = P(X ≤ x)
0       F(0) = p(0) = 1/81
1       F(1) = p(0) + p(1) = 4/81
2       F(2) = F(1) + p(2) = 9/81
3       F(3) = F(2) + p(3) = 16/81
4       F(4) = F(3) + p(4) = 25/81
5       F(5) = F(4) + p(5) = 36/81
6       F(6) = F(5) + p(6) = 49/81
7       F(7) = F(6) + p(7) = 64/81
8       F(8) = F(7) + p(8) = 81/81 = 1
CONTINUOUS RANDOM VARIABLE
Definition : A R.V. X is said to be continuous if it takes all possible values in an interval.
Its probability law is described by a probability density function f(x) satisfying
(i) P(a ≤ X ≤ b) = ∫_a^b f(x) dx
(ii) ∫_{-∞}^{∞} f(x) dx = 1
Remark
1. In the case of a discrete R.V., the probability at a point, say at x = c, may be non-zero. But in
the case of a continuous R.V. X the probability at a point is always zero:
P(X = c) = ∫_c^c f(x) dx = 0
For a continuous R.V. X with p.d.f. f(x) defined over (a, b), the standard measures are:
(i)    Arithmetic mean = ∫_a^b x f(x) dx
(ii)   Harmonic mean H: 1/H = ∫_a^b (1/x) f(x) dx
(iii)  Geometric mean G: log G = ∫_a^b log x f(x) dx
(iv)   Moments about the origin: ∫_a^b x^r f(x) dx
(v)    Moments about a point A: ∫_a^b (x - A)^r f(x) dx
(vi)   Moments about the mean (central moments): ∫_a^b (x - mean)^r f(x) dx
(vii)  Variance σ² = ∫_a^b (x - mean)² f(x) dx
(viii) Mean deviation about the mean: ∫_a^b |x - mean| f(x) dx
EXPECTATIONS (Continuous R.V.)
The expectation of a continuous R.V. X with p.d.f. f(x) is
E(X) = ∫_{-∞}^{∞} x f(x) dx
The r-th raw moment (about the origin) is denoted by μ'_r:
μ'_r = E(X^r) = ∫_{-∞}^{∞} x^r f(x) dx
Thus μ'_1 = E(X) and μ'_2 = E(X²).
Mean = X̄ = μ'_1 = E(X)
and
Variance = μ'_2 - (μ'_1)² = E(X²) - [E(X)]²
* r-th moment (about the mean)
E[{X - E(X)}^r] = ∫_{-∞}^{∞} {x - E(X)}^r f(x) dx = ∫_{-∞}^{∞} (x - X̄)^r f(x) dx
Thus the r-th central moment is
μ_r = ∫_{-∞}^{∞} (x - X̄)^r f(x) dx
In particular, the first central moment vanishes:
∫ (x - X̄) f(x) dx = ∫ x f(x) dx - X̄ ∫ f(x) dx = X̄ - X̄ = 0
and the second central moment is the variance:
Variance = μ2 = E[X - E(X)]² = ∫_{-∞}^{∞} (x - X̄)² f(x) dx
* Expectation of a constant K:
E(K) = ∫ K f(x) dx = K ∫ f(x) dx = K · 1 = K   [since ∫ f(x) dx = 1]
Thus E(K) = K, i.e., E[a constant] = the constant.
1.3.4 EXPECTATIONS (Discrete R.V.s)
Let X be a discrete random variable with P.M.F. p(x). Then
E(X) = Σ_x x p(x)
E(X^r) = Σ_x x^r p(x)   (by definition)
If we denote E(X^r) = μ'_r, then
μ'_r = E[X^r] = Σ_x x^r p(x)
Put r = 1: Mean = μ'_1 = Σ_x x p(x)
Put r = 2: μ'_2 = E[X²] = Σ_x x² p(x)
Variance = μ2 = μ'_2 - (μ'_1)² = E(X²) - {E(X)}²
The r-th central moment is
μ_r = E[{X - E(X)}^r] = Σ_x (x - X̄)^r p(x),  where X̄ = E(X)
Put r = 2: Variance = μ2 = Σ_x (x - X̄)² p(x)
Note
In general E(1/X) ≠ 1/E(X), E[log X] ≠ log E(X), and E(X²) ≠ [E(X)]².
1.3.7 EXPECTATION OF A LINEAR COMBINATION OF RANDOM VARIABLES
Let X1, X2, …, Xn be any n random variables and let a1, a2, …, an be constants. Then
E[a1X1 + a2X2 + … + anXn] = a1E(X1) + a2E(X2) + … + anE(Xn)
Result
If X is a random variable, then
Var(aX + b) = a² Var(X), where a and b are constants.
Covariance :
If X and Y are random variables, then the covariance between them is defined as
Cov(X, Y) = E{[X - E(X)] [Y - E(Y)]}
          = E{XY - X E(Y) - E(X) Y + E(X) E(Y)}
Cov(X, Y) = E(XY) - E(X) · E(Y)   …(A)
If X and Y are independent, then
E(XY) = E(X) E(Y)   …(B)
Substituting (B) in (A), we get Cov(X, Y) = 0.
Thus, if X and Y are independent, then Cov(X, Y) = 0.
Note
(i)   Cov(aX, bY) = ab Cov(X, Y)
(ii)  Cov(X + a, Y + b) = Cov(X, Y)
(iii) Cov(aX + b, cY + d) = ac Cov(X, Y)
(iv)  Var(X1 + X2) = Var(X1) + Var(X2) + 2 Cov(X1, X2)
If X1, X2 are independent, then Var(X1 + X2) = Var(X1) + Var(X2).
EXPECTATION TABLE
Discrete R.V.s                                 Continuous R.V.s
1. E(X) = Σ x p(x)                             1. E(X) = ∫ x f(x) dx
2. E(X^r) = μ'_r = Σ x^r p(x)                  2. E(X^r) = μ'_r = ∫ x^r f(x) dx
3. Mean = μ'_1 = Σ x p(x)                      3. Mean = μ'_1 = ∫ x f(x) dx
4. μ'_2 = Σ x² p(x)                            4. μ'_2 = ∫ x² f(x) dx
5. Variance = μ'_2 - (μ'_1)² = E(X²) - {E(X)}²  (both cases)
Example : 1
When a die is thrown, X denotes the number that turns up. Find E(X), E(X²) and Var(X).
Solution
X is a R.V. taking the values 1, 2, 3, 4, 5, 6 with p(x) = 1/6 for each value.

X = x   1     2     3     4     5     6
p(x)   1/6   1/6   1/6   1/6   1/6   1/6

E(X) = Σ xi p(xi) = x1p(x1) + x2p(x2) + x3p(x3) + x4p(x4) + x5p(x5) + x6p(x6)
     = (1 + 2 + 3 + 4 + 5 + 6)(1/6) = 21/6 = 7/2
E(X²) = Σ xi² p(xi) = 1(1/6) + 4(1/6) + 9(1/6) + 16(1/6) + 25(1/6) + 36(1/6)
      = (1 + 4 + 9 + 16 + 25 + 36)/6 = 91/6
Variance (X) = E(X²) - [E(X)]² = 91/6 - 49/4 = 35/12
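A quick exact check of this die example (a Python sketch added for illustration, not part of the notes):

```python
from fractions import Fraction

# Exact mean and variance of a fair die, mirroring the computation above.
xs = range(1, 7)
p = Fraction(1, 6)
mean = sum(x * p for x in xs)       # 7/2
ex2 = sum(x * x * p for x in xs)    # 91/6
var = ex2 - mean ** 2               # 35/12
print(mean, ex2, var)
```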
Example :2
Find the value of (i) C and (ii) the mean of the following distribution, given
f(x) = C(x - x²), 0 < x < 1; 0 otherwise.
Solution
Given f(x) = C(x - x²), 0 < x < 1; 0 otherwise   …(1)
Since f is a p.d.f., ∫_{-∞}^{∞} f(x) dx = 1:
∫_0^1 C(x - x²) dx = C[x²/2 - x³/3]_0^1 = C(1/2 - 1/3) = C/6 = 1
⇒ C = 6   …(2)
Substituting (2) in (1): f(x) = 6(x - x²), 0 < x < 1   …(3)
Mean = E(X) = ∫ x f(x) dx = ∫_0^1 x · 6(x - x²) dx   [from (3), 0 < x < 1]
            = ∫_0^1 (6x² - 6x³) dx = [2x³ - (3/2)x⁴]_0^1 = 2 - 3/2
Mean = 1/2
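A symbolic cross-check of this example (a sketch using sympy, added here; not from the notes):

```python
import sympy as sp

# Symbolic check of Example 2: f(x) = C(x - x**2) on (0, 1).
x, C = sp.symbols("x C", positive=True)
f = C * (x - x**2)

C_val = sp.solve(sp.integrate(f, (x, 0, 1)) - 1, C)[0]   # C = 6
mean = sp.integrate(x * f.subs(C, C_val), (x, 0, 1))     # 1/2
print(C_val, mean)
```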
1.4 DISTRIBUTION FUNCTION FOR CONTINUOUS RANDOM VARIABLES
(i)   F(x) = P(X ≤ x) = ∫_{-∞}^{x} f(t) dt
(ii)  0 ≤ F(x) ≤ 1, -∞ < x < ∞
(iii) P(a ≤ X ≤ b) = F(b) - F(a)
(iv)  F'(x) = dF(x)/dx = f(x) ≥ 0
(v)   For a discrete R.V., P(X = xi) = F(xi) - F(xi-1)
Example :1.4.1
Given that the p.d.f. of a continuous random variable X is
f(x) = 6x(1 - x), 0 < x < 1; 0 otherwise,
find the distribution function F(x).
Solution
Given f(x) = 6x(1 - x), 0 < x < 1; 0 otherwise.
When x < 0:
F(x) = ∫_{-∞}^{x} f(t) dt = ∫_{-∞}^{x} 0 dt = 0
When 0 < x < 1:
F(x) = ∫_{-∞}^{x} f(t) dt = ∫_{-∞}^{0} f(t) dt + ∫_{0}^{x} f(t) dt
     = 0 + ∫_0^x 6t(1 - t) dt = 6[t²/2 - t³/3]_0^x = 3x² - 2x³
When x > 1:
F(x) = ∫_{-∞}^{0} 0 dt + ∫_{0}^{1} 6t(1 - t) dt + ∫_{1}^{x} 0 dt = 6(1/2 - 1/3) = 1
Hence
F(x) = 0,            x < 0
     = 3x² - 2x³,    0 < x < 1
     = 1,            x > 1
Example:1.4.2
(i) Show that f(x) = e^{-x}, x ≥ 0 (and 0 for x < 0) is a p.d.f.
(ii) If so, determine the probability that the variate having this density will fall in the
interval (1, 2).
Solution
Given f(x) = e^{-x}, x ≥ 0; 0, x < 0.
(i) f(x) ≥ 0 for all x, and
∫_{-∞}^{∞} f(x) dx = ∫_{-∞}^{0} 0 dx + ∫_{0}^{∞} e^{-x} dx = [-e^{-x}]_0^∞ = 0 + 1 = 1
Hence f(x) is a p.d.f.
(ii) We know that
P(a ≤ X ≤ b) = ∫_a^b f(x) dx
P(1 ≤ X ≤ 2) = ∫_1^2 e^{-x} dx = [-e^{-x}]_1^2
             = -e^{-2} + e^{-1}
             = -0.135 + 0.368
             = 0.233
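This interval probability is easy to confirm numerically; a brief sketch using scipy (illustrative, not part of the notes):

```python
from scipy import stats
import numpy as np

# P(1 <= X <= 2) for f(x) = exp(-x), x >= 0, checked against
# scipy's standard exponential distribution.
X = stats.expon()                 # rate 1, i.e. f(x) = e^{-x}
prob = X.cdf(2) - X.cdf(1)        # F(2) - F(1)
print(round(prob, 3))             # 0.233

# Monte Carlo cross-check
samples = np.random.default_rng(0).exponential(1.0, 1_000_000)
print(round(np.mean((samples >= 1) & (samples <= 2)), 3))
```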
Example:1.4.3
A probability curve y = f(x) has a range from 0 to ∞. If f(x) = e^{-x}, find the mean, the
variance and the third moment about the mean.
Solution
Mean = ∫_0^∞ x f(x) dx = ∫_0^∞ x e^{-x} dx = [x(-e^{-x})]_0^∞ + ∫_0^∞ e^{-x} dx = 1
Mean = 1
Variance: μ2 = ∫_0^∞ (x - 1)² e^{-x} dx = 1
μ2 = 1
Third moment about the mean:
μ3 = ∫_a^b (x - Mean)³ f(x) dx, here a = 0, b = ∞
μ3 = ∫_0^∞ (x - 1)³ e^{-x} dx = ∫_0^∞ (x³ - 3x² + 3x - 1) e^{-x} dx = 6 - 6 + 3 - 1 = 2
μ3 = 2
1.5 MOMENT GENERATING FUNCTION
Def : The moment generating function (MGF) of a random variable X (about the origin) whose
probability function is f(x) is given by
MX(t) = E[e^{tX}]
      = ∫_{-∞}^{∞} e^{tx} f(x) dx   (for a continuous R.V.)
      = Σ_x e^{tx} p(x)             (for a discrete R.V.)
where t is a real parameter and the integration or summation extends over the entire range of x.
Example :1.5.1
Show that the r-th raw moment μ'_r is the coefficient of t^r/r! in MX(t), and hence that the
moments can be obtained by differentiating the MGF.
Proof
e^{tX} = 1 + tX + (tX)²/2! + (tX)³/3! + … + (tX)^r/r! + …
MX(t) = E[e^{tX}] = E[1] + t E(X) + (t²/2!) E(X²) + … + (t^r/r!) E(X^r) + …
Using μ'_r = E(X^r),
MX(t) = 1 + t μ'_1 + (t²/2!) μ'_2 + (t³/3!) μ'_3 + … + (t^r/r!) μ'_r + …
      = Σ_{r=0}^{∞} (t^r/r!) μ'_r   …(A)
Thus μ'_r = coefficient of t^r/r! in MX(t).
Differentiating (A) with respect to t,
M'X(t) = μ'_1 + t μ'_2 + (t²/2!) μ'_3 + …   …(B)
Putting t = 0 in (B):
M'X(0) = μ'_1 = Mean
Mean = M'X(0)  (or)  [d/dt MX(t)]_{t=0}
Differentiating (B) again and putting t = 0:
M''X(0) = μ'_2
In general,
μ'_r = [d^r/dt^r MX(t)]_{t=0}
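The differentiation rule above can be exercised symbolically; here is a short sketch (an added illustration, using the standard exponential distribution, whose MGF 1/(1 - t) is derived later in these notes):

```python
import sympy as sp

# Moments from the MGF by differentiation at t = 0, using the
# standard exponential (lambda = 1) whose MGF is 1/(1 - t).
t = sp.symbols("t")
M = 1 / (1 - t)                       # MGF of Exp(1), valid for t < 1

mu1 = sp.diff(M, t, 1).subs(t, 0)     # first raw moment  -> 1
mu2 = sp.diff(M, t, 2).subs(t, 0)     # second raw moment -> 2
print(mu1, mu2, mu2 - mu1**2)         # mean 1, E[X^2] 2, variance 1
```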
Example :1.5.3
Obtain the MGF of X about the point X = a.
Proof
The moment generating function of X about the point X = a is
MX(t) (about a) = E[e^{t(X-a)}]
Formula: e^{x} = 1 + x/1! + x²/2! + …
= E[1 + t(X - a) + (t²/2!)(X - a)² + … + (t^r/r!)(X - a)^r + …]
= 1 + t E(X - a) + (t²/2!) E(X - a)² + … + (t^r/r!) E(X - a)^r + …
= 1 + t μ'_1 + (t²/2!) μ'_2 + … + (t^r/r!) μ'_r + …
where now μ'_r = E[(X - a)^r] is the r-th moment about the point a.
Result:
M_{cX}(t) = MX(ct)
Example :1.5.4
If X1, X2, …, Xn are independent random variables, then
M_{X1+X2+…+Xn}(t) = E[e^{t(X1+X2+…+Xn)}]
                  = E[e^{tX1} · e^{tX2} … e^{tXn}]
                  = E[e^{tX1}] E[e^{tX2}] … E[e^{tXn}]   (by independence)
                  = M_{X1}(t) · M_{X2}(t) … M_{Xn}(t)
Example:1.5.5
Prove that if U = (X - a)/h, then MU(t) = e^{-at/h} · MX(t/h), where a and h are constants.
Proof
By definition,
MU(t) = E[e^{tU}] = E[e^{t(X-a)/h}]
      = E[e^{tX/h} · e^{-ta/h}]
      = e^{-ta/h} E[e^{(t/h)X}]   [by definition of the MGF]
      = e^{-at/h} · MX(t/h)
Thus MU(t) = e^{-at/h} MX(t/h), where U = (X - a)/h.
Example:1.5.6
Find the MGF of the distribution where
f(x) = 2/3 at x = 1; 1/3 at x = 2; 0 otherwise.
Solution
Given f(1) = 2/3, f(2) = 1/3, f(3) = f(4) = … = 0.
The MGF of a R.V. X is given by
MX(t) = E[e^{tX}] = Σ_{x=0}^{∞} e^{tx} f(x)
      = e^{t} f(1) + e^{2t} f(2)
      = (2/3)e^{t} + (1/3)e^{2t}
MX(t) = (2e^{t} + e^{2t})/3
1.6 BINOMIAL DISTRIBUTION
Example:1.6.1
Let X be a random variable which follows the binomial distribution, with
p(x) = nCx p^x q^{n-x}, x = 0, 1, …, n. Then the MGF about the origin is
MX(t) = E[e^{tX}] = Σ_{x=0}^{n} e^{tx} p(x)
      = Σ_{x=0}^{n} e^{tx} nCx p^x q^{n-x}
      = Σ_{x=0}^{n} nCx (pe^{t})^x q^{n-x}
MX(t) = (q + pe^{t})^n
Example:1.6.2
Find the mean and variance of binomial distribution.
Solution
MX(t) = (q + pe^{t})^n
M'X(t) = n(q + pe^{t})^{n-1} · pe^{t}
Put t = 0: M'X(0) = n(q + p)^{n-1} p = np   [since q + p = 1]
Mean = E(X) = np
M''X(t) = np[(q + pe^{t})^{n-1} e^{t} + e^{t}(n - 1)(q + pe^{t})^{n-2} pe^{t}]
Put t = 0:
M''X(0) = np[(q + p)^{n-1} + (n - 1)(q + p)^{n-2} p]
        = np[1 + (n - 1)p]
        = np + n²p² - np²
        = n²p² + np(1 - p)
        = n²p² + npq   [since 1 - p = q]
E(X²) = M''X(0) = n²p² + npq
Variance = E(X²) - [E(X)]² = n²p² + npq - n²p² = npq
MGF about the mean np:
E[e^{t(X - np)}] = e^{-tnp} E[e^{tX}]
                = e^{-tnp}(q + pe^{t})^n
                = (e^{-tp})^n (q + pe^{t})^n
MGF about mean = (e^{-tp})^n (q + pe^{t})^n
Example :1.6.4
Additive property of binomial distribution.
Solution
Then
MX(t) = (q + pe^{t})^{n1},  MY(t) = (q + pe^{t})^{n2}
MX+Y(t) = MX(t) · MY(t)   [by independence]   …(1)
        = (q + pe^{t})^{n1} · (q + pe^{t})^{n2}
        = (q + pe^{t})^{n1+n2}   …(2)
which is the MGF of a binomial variate with parameters (n1 + n2, p). Hence, by (1) and (2),
the sum of independent binomial variates with the same p is again a binomial variate.
Example
Suppose data suggest np = 3 and npq = 4 for a binomial distribution. Dividing,
q = npq/np = 4/3, which is > 1.
Since q > 1 is not possible (0 < q < 1), the given data do not follow a binomial distribution.
Example :1.6.5
The mean and SD of a binomial distribution are 5 and 2, determine the distribution.
Solution
Given Mean = np = 5   …(1)
SD = √(npq) = 2, i.e., npq = 4   …(2)
(2)/(1): npq/np = q = 4/5
p = 1 - q = 1 - 4/5 = 1/5
From (1): n(1/5) = 5 ⇒ n = 25
The binomial distribution is
P(X = x) = p(x) = nCx p^x q^{n-x} = 25Cx (1/5)^x (4/5)^{25-x},  x = 0, 1, 2, …, 25
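The determined distribution can be sanity-checked in a line or two (a sketch using scipy, added for illustration):

```python
from scipy import stats

# Binomial distribution determined above: n = 25, p = 1/5.
X = stats.binom(n=25, p=0.2)
print(X.mean(), X.std())   # 5.0 2.0, matching the given mean and SD
print(X.pmf(2))            # e.g. P(X = 2) = 25C2 (1/5)^2 (4/5)^23
```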
1.7 POISSON DISTRIBUTION
A discrete R.V. X is said to follow the Poisson distribution with parameter λ > 0 if
P(X = x) = p(x) = e^{-λ} λ^x / x!,  x = 0, 1, 2, …
MGF of the Poisson distribution:
MX(t) = Σ_{x=0}^{∞} e^{tx} p(x) = Σ_{x=0}^{∞} e^{tx} e^{-λ} λ^x / x!
      = e^{-λ} Σ_{x=0}^{∞} (λe^{t})^x / x!
      = e^{-λ} [1 + λe^{t} + (λe^{t})²/2! + …]
      = e^{-λ} e^{λe^{t}}
MX(t) = e^{λ(e^{t} - 1)}
Mean: M'X(t) = e^{λ(e^{t} - 1)} · λe^{t}, so M'X(0) = μ'_1 = E(X) = λ.
Alternatively, directly,
E(X) = Σ_{x=0}^{∞} x p(x) = Σ_{x=0}^{∞} x e^{-λ} λ^x / x!
     = e^{-λ} λ Σ_{x=1}^{∞} λ^{x-1}/(x - 1)!
     = e^{-λ} λ [1 + λ + λ²/2! + …]
     = e^{-λ} λ e^{λ} = λ
Mean = λ
Second moment:
μ'_2 = E[X²] = Σ_{x=0}^{∞} x² p(x) = Σ_{x=0}^{∞} {x(x - 1) + x} p(x)
     = e^{-λ} λ² Σ_{x=2}^{∞} λ^{x-2}/(x - 2)! + λ
     = λ² + λ
Variance:
σ² = E(X²) - [E(X)]² = λ² + λ - λ² = λ
Variance = λ
Hence for the Poisson distribution, Mean = Variance = λ.
Note : * A sum of independent Poisson variates is also a Poisson variate.
PROBLEMS ON POISSON DISTRIBUTION
Example:1.7.1
If X is a Poisson variate such that P(X = 1) = 3/10 and P(X = 2) = 1/5, find P(X = 0) and
P(X = 3).
Solution
P(X = x) = e^{-λ} λ^x / x!
P(X = 1) = e^{-λ} λ = 3/10   (Given)   …(1)
P(X = 2) = e^{-λ} λ²/2! = 1/5   (Given)   …(2)
(2)/(1): λ/2 = (1/5)/(3/10) = 2/3 ⇒ λ = 4/3
P(X = 0) = e^{-λ} λ⁰/0! = e^{-4/3}
P(X = 3) = e^{-λ} λ³/3! = e^{-4/3} (4/3)³/3!
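A brief numerical check of this example (an added sketch; note that only the ratio of the two given probabilities was used to fix λ, so it is that ratio we verify):

```python
from scipy import stats

# Poisson parameter recovered in Example 1.7.1: lambda = 4/3.
# Only the ratio P(X=2)/P(X=1) = lambda/2 fixed lambda, so check
# that ratio and then evaluate the required probabilities.
lam = 4 / 3
X = stats.poisson(lam)
print(X.pmf(2) / X.pmf(1))   # 2/3 = (1/5) / (3/10)
print(X.pmf(0), X.pmf(3))    # P(X=0) = e^{-4/3}, P(X=3) = e^{-4/3}(4/3)^3/3!
```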
Example :1.7.2
If X is a Poisson variate such that
P(X = 2) = 9 P(X = 4) + 90 P(X = 6),
find (i) the mean of X, (ii) the variance of X.
Solution
P(X = x) = e^{-λ} λ^x / x!,  x = 0, 1, 2, …
Given P(X = 2) = 9 P(X = 4) + 90 P(X = 6):
e^{-λ} λ²/2! = 9 e^{-λ} λ⁴/4! + 90 e^{-λ} λ⁶/6!
1/2 = 9λ²/4! + 90λ⁴/6! = 3λ²/8 + λ⁴/8
4 = 3λ² + λ⁴
λ⁴ + 3λ² - 4 = 0
(λ² - 1)(λ² + 4) = 0 ⇒ λ² = 1 or λ² = -4
λ² = -4 gives λ = ±2i, which is inadmissible, so λ² = 1 and λ = 1 (λ > 0).
Mean = λ = 1, Variance = λ = 1
Standard Deviation = 1
1.7.3 Derive probability mass function of Poisson distribution as a limiting case of Binomial
distribution
Solution
We know that the binomial distribution is
P(X = x) = nCx p^x q^{n-x}
         = n!/[(n - x)! x!] p^x (1 - p)^{n-x}
Write λ = np, so that p = λ/n. Then
P(X = x) = [n(n - 1)(n - 2)…(n - x + 1)/x!] (λ/n)^x (1 - λ/n)^{n-x}
         = (λ^x/x!) [1(1 - 1/n)(1 - 2/n)…(1 - (x - 1)/n)] (1 - λ/n)^{n-x}
Letting n → ∞ with λ fixed:
lim_{n→∞} (1 - 1/n)(1 - 2/n)…(1 - (x - 1)/n) = 1
and
lim_{n→∞} (1 - λ/n)^{n-x} = lim_{n→∞} (1 - λ/n)^n · lim_{n→∞} (1 - λ/n)^{-x} = e^{-λ} · 1 = e^{-λ}
Hence
P(X = x) = e^{-λ} λ^x / x!,  x = 0, 1, 2, …
which is the Poisson distribution with parameter λ.
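The limiting behaviour is easy to see numerically; a short sketch (illustrative values, not from the notes):

```python
from scipy import stats

# Poisson as a limit of the binomial: p = lam / n with lam fixed.
lam, x = 2.0, 3
for n in (10, 100, 1000, 10000):
    print(n, stats.binom(n, lam / n).pmf(x))
print("limit", stats.poisson(lam).pmf(x))   # e^{-2} 2^3 / 3!
```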
1.8 GEOMETRIC DISTRIBUTION
Def: A discrete random variable X is said to follow the geometric distribution if it assumes
only positive integer values and its probability mass function is given by
P(X = x) = q^{x-1} p,  x = 1, 2, 3, …, where q = 1 - p.
Example:1.8.1
To find the MGF:
MX(t) = E[e^{tX}] = Σ_{x=1}^{∞} e^{tx} q^{x-1} p
      = (p/q) Σ_{x=1}^{∞} (e^{t} q)^x
      = (p/q) [(e^{t}q) + (e^{t}q)² + (e^{t}q)³ + …]
Let y = e^{t}q. Then
      = (p/q) y [1 + y + y² + …]
      = (p/q) y (1 - y)^{-1}
      = pe^{t} [1 - qe^{t}]^{-1}
MX(t) = pe^{t}/(1 - qe^{t})
* To find the Mean & Variance
M'X(t) = [(1 - qe^{t}) pe^{t} - pe^{t}(-qe^{t})]/(1 - qe^{t})²
       = pe^{t}/(1 - qe^{t})²
E(X) = M'X(0) = p/(1 - q)² = p/p² = 1/p
Mean = 1/p
M''X(t) = d/dt [pe^{t}(1 - qe^{t})^{-2}]
        = [(1 - qe^{t})² pe^{t} + 2pe^{t} qe^{t}(1 - qe^{t})]/(1 - qe^{t})⁴
E(X²) = M''X(0) = [p³ + 2p²q]/p⁴ = (p + 2q)/p²
Variance = E(X²) - [E(X)]² = (p + 2q)/p² - 1/p² = (p + 2q - 1)/p² = q/p²   [since p + q = 1]
Var(X) = q/p²
Note:
Another form of the geometric distribution is
P[X = x] = q^x p;  x = 0, 1, 2, …
MX(t) = p/(1 - qe^{t})
Mean = q/p,  Variance = q/p²
Example:1.8.2
If the MGF of X is (5 - 4e^{t})^{-1}, find the distribution of X and P(X = 5).
Solution
Let the geometric distribution be P(X = x) = q^x p, x = 0, 1, 2, …
The MGF of this geometric distribution is
MX(t) = p/(1 - qe^{t})   …(1)
Here MX(t) = (5 - 4e^{t})^{-1} = (1/5)/(1 - (4/5)e^{t})   …(2)
Comparing (1) and (2): p = 1/5, q = 4/5
P(X = x) = (1/5)(4/5)^x,  x = 0, 1, 2, 3, …
P(X = 5) = (1/5)(4/5)^5 = 4^5/5^6
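A numerical cross-check of this example (an added sketch; note the convention issue flagged in the comments — numpy's geometric variate counts trials, while the distribution here counts failures):

```python
import numpy as np

# P(X = 5) for the 'failures' form P(X = x) = (1/5)(4/5)^x, x = 0, 1, 2, ...
p, q = 1 / 5, 4 / 5
exact = p * q**5
print(exact, 4**5 / 5**6)          # same number, = 1024/15625

# Monte Carlo: numpy's geometric counts trials (support 1, 2, ...),
# so subtract 1 to get the number of failures before the first success.
rng = np.random.default_rng(1)
samples = rng.geometric(p, 1_000_000) - 1
print(np.mean(samples == 5))       # close to the exact value
```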
1.9 UNIFORM (RECTANGULAR) DISTRIBUTION
A continuous R.V. X is said to be uniformly distributed over (a, b) if
f(x) = 1/(b - a), a < x < b; 0, otherwise.
* To find the MGF
MX(t) = ∫ e^{tx} f(x) dx = ∫_a^b e^{tx} [1/(b - a)] dx
      = [1/(b - a)] [e^{tx}/t]_a^b
MX(t) = (e^{bt} - e^{at})/[(b - a)t]
* To find the Mean and Variance
E(X) = ∫ x f(x) dx = ∫_a^b x/(b - a) dx = [x²/2]_a^b /(b - a)
     = (b² - a²)/[2(b - a)] = (b + a)/2
Mean = μ'_1 = (a + b)/2
Putting r = 2 in μ'_r = ∫_a^b x^r/(b - a) dx, we get
μ'_2 = ∫_a^b x²/(b - a) dx = (a² + ab + b²)/3
Variance = μ'_2 - (μ'_1)² = (a² + ab + b²)/3 - [(b + a)/2]²
Variance = (b - a)²/12
Example
If X is uniformly distributed over (-α, α), α > 0, with
f(x) = 1/(2α), -α < x < α; 0 otherwise,
find α such that (i) P(X > 1) = 1/3 and (ii) P(|X| < 1) = P(|X| > 1).
Solution
(i) P(X > 1) = 1/3:
∫_1^α [1/(2α)] dx = 1/3
(1/2α)(α - 1) = 1/3 ⇒ 3α - 3 = 2α ⇒ α = 3
(ii) P(|X| < 1) = P(|X| > 1) = 1 - P(|X| < 1)
⇒ 2 P(|X| < 1) = 1 ⇒ P(-1 < X < 1) = 1/2
∫_{-1}^{1} [1/(2α)] dx = 1/2 ⇒ 2/(2α) = 1/2 ⇒ α = 2
Note:
1. The distribution function of the uniform distribution over (a, b) is
F(x) = 0,               -∞ < x < a
     = (x - a)/(b - a),  a ≤ x ≤ b
     = 1,                b < x < ∞
2. For the uniform distribution over (-a, a), f(x) = 1/(2a) for -a < x < a (0 otherwise), and
F(x) = 0,             x < -a
     = (x + a)/(2a),  -a ≤ x ≤ a
     = 1,             x > a
1.10 EXPONENTIAL DISTRIBUTION
A continuous R.V. X is said to follow the exponential distribution with parameter λ > 0 if
f(x) = λe^{-λx}, x > 0; 0 elsewhere.
To find the MGF
MX(t) = ∫ e^{tx} f(x) dx = ∫_0^∞ e^{tx} λe^{-λx} dx
      = λ ∫_0^∞ e^{-(λ-t)x} dx
      = λ [e^{-(λ-t)x}/(-(λ - t))]_0^∞
MX(t) = λ/(λ - t),  λ > t
To find the Mean and Variance
MX(t) = λ/(λ - t) = (1 - t/λ)^{-1}
      = 1 + t/λ + t²/λ² + … + t^r/λ^r + …
      = 1 + (t/1!)(1/λ) + (t²/2!)(2/λ²) + …
Mean = μ'_1 = coefficient of t/1! = 1/λ
μ'_2 = coefficient of t²/2! = 2/λ²
Variance = μ'_2 - (μ'_1)² = 2/λ² - 1/λ² = 1/λ²
Mean = 1/λ,  Variance = 1/λ²
Example: 1.10.1
Let X be a random variable with p.d.f.
f(x) = (1/3) e^{-x/3}, x > 0; 0 otherwise.
Find 1) P(X > 3)  2) the MGF of X.
Solution
This is the exponential distribution f(x) = λe^{-λx}, x > 0, with λ = 1/3.
P(X > 3) = ∫_3^∞ (1/3) e^{-x/3} dx = [-e^{-x/3}]_3^∞ = e^{-1}
P(X > 3) = e^{-1}
The MGF is
MX(t) = λ/(λ - t) = (1/3)/(1/3 - t)
MX(t) = 1/(1 - 3t),  t < 1/3
1.11 GAMMA DISTRIBUTION
A continuous R.V. X is said to follow the gamma distribution with parameter λ > 0 if
f(x) = e^{-x} x^{λ-1} / Γ(λ), 0 ≤ x < ∞; 0 elsewhere,
where λ is the parameter of the distribution.
Additive property of Gamma Variates
If X1, X2, X3, …, Xk are independent gamma variates with parameters λ1, λ2, …, λk
respectively, then X1 + X2 + X3 + … + Xk is also a gamma variate with parameter
λ1 + λ2 + … + λk.
Example :1.11.1
Customer demand for milk in a certain locality, per month, is known to be a general gamma R.V.
If the average demand is a liters and the most likely demand is b liters (b < a), what is the
variance of the demand?
Solution :
Let X represent the monthly customer demand for milk, with the general gamma density
f(x) = x^{k-1} e^{-x/θ} / (θ^k Γ(k)),  x > 0.
The average demand is the value of E(X) = kθ.
The most likely demand is the value of the mode of X, i.e., the value of x for which the density
function is maximum. Setting
f'(x) ∝ (k - 1) x^{k-2} e^{-x/θ} - (1/θ) x^{k-1} e^{-x/θ} = 0
gives x = (k - 1)θ, and f''(x) < 0 there, so
Mode = (k - 1)θ = b   …(1)
Mean = kθ = a   …(2)
From (1) and (2): θ = a - b and k = a/(a - b).
Variance = kθ² = a(a - b)
TUTORIAL QUESTIONS
1. It is known that the probability that an item produced by a certain machine will be
defective is 0.05. If the produced items are sent to the market in packets of 20, find the
number of packets containing at least, exactly, and at most 2 defective items in a
consignment of 1000 packets using (i) the binomial distribution, (ii) the Poisson
approximation to the binomial distribution.
2. The daily consumption of milk in excess of 20,000 gallons is approximately exponentially
distributed with mean 3000. The city has a daily stock of 35,000 gallons. What is the
probability that of two days selected at random, the stock is insufficient for both days?
3. The density function of a random variable X is given by f(x) = Kx(2 - x), 0 ≤ X ≤ 2.
Find K, the mean, the variance and the r-th moment.
4. A binomial variable X satisfies the relation 9P(X = 4) = P(X = 2) when n = 6. Find the
parameter p of the binomial distribution.
5. Find the M.G.F of the Poisson distribution.
6. If X and Y are independent Poisson variates such that P(X = 1) = P(X = 2) and
P(Y = 2) = P(Y = 3), find V(X - 2Y).
7. A discrete random variable has the following probability distribution:

X:     0   1    2    3    4    5     6     7     8
P(X):  a   3a   5a   7a   9a   11a   13a   15a   17a

Find the value of a, P(X < 3) and the c.d.f. of X.
WORKED OUT EXAMPLES
Example :1
Given the p.d.f. f(x) = 6x(1 - x), 0 < x < 1; 0 otherwise, find the distribution function F(x).
Solution
When x < 0: F(x) = ∫_{-∞}^{x} f(t) dt = 0
When 0 < x < 1:
F(x) = ∫_{-∞}^{0} f(t) dt + ∫_0^x f(t) dt = 0 + ∫_0^x 6t(1 - t) dt = 3x² - 2x³
When x > 1:
F(x) = ∫_{-∞}^{0} 0 dt + ∫_0^1 6t(1 - t) dt + ∫_1^x 0 dt = 1
Hence
F(x) = 0,            x < 0
     = 3x² - 2x³,    0 < x < 1
     = 1,            x > 1
Example :2
A random variable X has the following probability function:

Values of X:      0   1    2    3    4    5     6     7     8
Probability P(X): a   3a   5a   7a   9a   11a   13a   15a   17a

(i) Determine the value of a.
Since Σ p(xi) = 1, a + 3a + 5a + … + 17a = 81a = 1 ⇒ a = 1/81.
(iv) To find the distribution function of X using table 2, we get

X = x   F(x) = P(X ≤ x)
0       F(0) = p(0) = 1/81
1       F(1) = P(X ≤ 1) = p(0) + p(1) = 1/81 + 3/81 = 4/81
2       F(2) = P(X ≤ 2) = p(0) + p(1) + p(2) = 4/81 + 5/81 = 9/81
3       F(3) = P(X ≤ 3) = p(0) + p(1) + p(2) + p(3) = 9/81 + 7/81 = 16/81
4       F(4) = P(X ≤ 4) = p(0) + p(1) + … + p(4) = 16/81 + 9/81 = 25/81
5       F(5) = P(X ≤ 5) = p(0) + p(1) + … + p(5) = 25/81 + 11/81 = 36/81
6       F(6) = P(X ≤ 6) = p(0) + p(1) + … + p(6) = 36/81 + 13/81 = 49/81
7       F(7) = P(X ≤ 7) = p(0) + p(1) + … + p(7) = 49/81 + 15/81 = 64/81
8       F(8) = P(X ≤ 8) = p(0) + p(1) + … + p(8) = 64/81 + 17/81 = 81/81 = 1
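The arithmetic in this example has a pleasant structure (the partial sums of 1, 3, 5, … are perfect squares); a short sketch confirming it (added for illustration, not from the notes):

```python
from fractions import Fraction

# PMF p(x) proportional to the odd numbers 1, 3, 5, ..., 17 (Example 2).
weights = [2 * k + 1 for k in range(9)]        # 1, 3, 5, ..., 17
a = Fraction(1, sum(weights))                  # a = 1/81
pmf = [a * w for w in weights]

# Cumulative distribution function
F, total = [], Fraction(0)
for p in pmf:
    total += p
    F.append(total)
print(a)   # 1/81
print(F)   # 1/81, 4/81, 9/81, ..., 1  (perfect squares over 81)
```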
Example :3
The mean and SD of a binomial distribution are 5 and 2, determine the distribution.
Solution
Given Mean = np = 5   …(1)
SD = √(npq) = 2, i.e., npq = 4   …(2)
(2)/(1): q = 4/5, so p = 1 - q = 1/5
From (1): n = 25
The binomial distribution is
P(X = x) = 25Cx (1/5)^x (4/5)^{25-x},  x = 0, 1, 2, …, 25
Example :4
If X is a Poisson variate such that P(X = 2) = 9 P(X = 4) + 90 P(X = 6), find (i) the mean of X,
(ii) the variance of X.
Solution
P(X = x) = e^{-λ} λ^x / x!,  x = 0, 1, 2, …
Given:
e^{-λ} λ²/2! = 9 e^{-λ} λ⁴/4! + 90 e^{-λ} λ⁶/6!
1/2 = 9λ²/4! + 90λ⁴/6! = 3λ²/8 + λ⁴/8
λ⁴ + 3λ² - 4 = 0 ⇒ λ² = 1 or λ² = -4
λ² = -4 gives λ = ±2i, which is inadmissible, so λ = 1.
Mean = λ = 1, Variance = λ = 1
Standard Deviation = 1
UNIT II
TWO DIMENSIONAL RANDOM VARIABLES
Introduction
In the previous chapter we studied various aspects of the theory of a single R.V. In this
chapter we extend our theory to include two R.V's, one for each coordinate axis X and Y
of the XY plane.
DEFINITION : Let S be the sample space. Let X = X(s) & Y = Y(s) be two functions each
assigning a real number to each outcome s ∈ S. Then (X, Y) is a two dimensional random
variable.
2.1 Types of random variables
1. Discrete R.V.s
2. Continuous R.V.s
Discrete R.V.s (Two Dimensional Discrete R.V.s)
If the possible values of (X, Y) are finite, then (X, Y) is called a two dimensional discrete
R.V. and it can be represented by (xi, yj), i = 1, 2, …, m; j = 1, 2, …, n.
In the study of two dimensional discrete R.V.s we have the following important terms: the
joint probabilities pij = P[X = xi, Y = yj], the marginal probabilities pi. and p.j, and the
conditional distributions.
The set {xi, pij / p.j}, i = 1, 2, 3, … is called the conditional probability distribution of X
given Y = yj.
The conditional probability function of Y given X = xi is given by
P[Y = yj / X = xi] = P[X = xi, Y = yj] / P[X = xi] = pij / pi.
The set {yj, pij / pi.}, j = 1, 2, 3, … is called the conditional probability distribution of Y
given X = xi.
SOLVED PROBLEMS ON MARGINAL DISTRIBUTION
Example:2.1.1
From the following joint distribution of X and Y find the marginal distributions.
      X       0      1      2
Y
0           3/28   9/28   3/28
1           3/14   3/14    0
2           1/28    0      0

Solution
Adding along rows and columns:

      X       0      1      2     PY(y) = P(Y = y)
Y
0           3/28   9/28   3/28    15/28 = PY(0)
1           3/14   3/14    0      6/14  = PY(1)
2           1/28    0      0      1/28  = PY(2)
PX(x)     10/28=5/14  15/28  3/28

The marginal distribution of X:
PX(0) = P(X = 0) = p(0,0) + p(0,1) + p(0,2) = 3/28 + 3/14 + 1/28 = 5/14
PX(1) = P(X = 1) = p(1,0) + p(1,1) + p(1,2) = 9/28 + 3/14 + 0 = 15/28
PX(2) = P(X = 2) = p(2,0) + p(2,1) + p(2,2) = 3/28 + 0 + 0 = 3/28
Marginal probability function of X:
PX(x) = 5/14,  x = 0;   15/28, x = 1;   3/28, x = 2
Marginal probability function of Y:
PY(y) = 15/28, y = 0;   3/7,   y = 1;   1/28, y = 2
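The same marginals can be obtained mechanically by summing the joint table along its rows and columns; a minimal Python sketch (added for illustration, not part of the notes):

```python
from fractions import Fraction as Fr
import numpy as np

# Joint distribution: rows indexed by Y = 0,1,2, columns by X = 0,1,2.
P = np.array([[Fr(3, 28), Fr(9, 28), Fr(3, 28)],
              [Fr(3, 14), Fr(3, 14), Fr(0)],
              [Fr(1, 28), Fr(0),     Fr(0)]], dtype=object)

print(P.sum(axis=0))   # marginal of X: [5/14, 15/28, 3/28]
print(P.sum(axis=1))   # marginal of Y: [15/28, 3/7, 1/28]
print(P.sum())         # 1, so this is a valid joint distribution
```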
Joint probability distribution function:
F(x, y) = P[X ≤ x, Y ≤ y] = ∫_{-∞}^{y} ∫_{-∞}^{x} f(u, v) du dv
(ii) Conditional density of X given Y = y:
P(X = x / Y = y) = f(x / y) = f(x, y)/f(y),  f(y) > 0
Example :2.3.1
Show that the function
f(x, y) = (2/5)(2x + 3y), 0 < x < 1, 0 < y < 1; 0 otherwise
is a joint probability density function.
Solution
(i) f(x, y) ≥ 0 in the given region.
(ii) ∫_0^1 ∫_0^1 (2/5)(2x + 3y) dx dy = (2/5) ∫_0^1 [x² + 3xy]_0^1 dy
   = (2/5) ∫_0^1 (1 + 3y) dy = (2/5)[y + 3y²/2]_0^1
   = (2/5)(1 + 3/2) = (2/5)(5/2) = 1
Hence f(x, y) is a joint p.d.f.
5 2
8xy,
0 < x < 1,
0,
otherwise
0<y<x
(iii) f(y/x)
f ( y / x)
= f (x, y)
f (x)
= 8xy = 2y , 0 < y < x, 0 < x < 1
4x 3
x2
40
Result
Marginal pdf g
Marginal pdf y
F(y/x)
2y
4x , 0<x<1
4y, 0<y<x
2.4 REGRESSION
* Lines of regression
The line of regression of X on Y is given by
x - x̄ = r · (σx/σy)(y - ȳ)
The line of regression of Y on X is given by
y - ȳ = r · (σy/σx)(x - x̄)
* Angle between the two lines of regression:
tan θ = [(1 - r²)/r] · σx σy/(σx² + σy²)
* Regression coefficients
Regression coefficient of Y on X: bYX = r · σy/σx
Regression coefficient of X on Y: bXY = r · σx/σy
Correlation coefficient: r = ±√(bXY · bYX)
Example:2.4.1
1. From the following data, find
(i) The two regression equations.
(ii) The coefficient of correlation between the marks in Economics and Statistics.
(iii) The most likely marks in Statistics when the marks in Economics are 30.

Marks in Economics (X):  25  28  35  32  31  36  29  38  34  32
Marks in Statistics (Y): 43  46  49  41  36  32  31  30  33  39

Solution

X    Y    X - X̄ = X - 32   Y - Ȳ = Y - 38   (X - X̄)²   (Y - Ȳ)²   (X - X̄)(Y - Ȳ)
25   43        -7                5              49          25            -35
28   46        -4                8              16          64            -32
35   49         3               11               9         121             33
32   41         0                3               0           9              0
31   36        -1               -2               1           4              2
36   32         4               -6              16          36            -24
29   31        -3               -7               9          49             21
38   30         6               -8              36          64            -48
34   33         2               -5               4          25            -10
32   39         0                1               0           1              0
320  380        0                0             140         398            -93

Here X̄ = ΣX/n = 320/10 = 32 and Ȳ = ΣY/n = 380/10 = 38.
Coefficient of regression of Y on X:
bYX = Σ(X - X̄)(Y - Ȳ)/Σ(X - X̄)² = -93/140 = -0.6643
Coefficient of regression of X on Y:
bXY = Σ(X - X̄)(Y - Ȳ)/Σ(Y - Ȳ)² = -93/398 = -0.2337
Equation of the line of regression of X on Y:
X - X̄ = bXY (Y - Ȳ)
X - 32 = -0.2337(y - 38)
X = -0.2337y + 0.2337 × 38 + 32
X = -0.2337y + 40.8806
Equation of the line of regression of Y on X:
Y - Ȳ = bYX (X - X̄)
Y - 38 = -0.6643(x - 32)
Y = -0.6643x + 38 + 0.6643 × 32
Y = -0.6643x + 59.2576
Coefficient of correlation:
r² = bYX · bXY = (-0.6643) × (-0.2337) = 0.1552
r = -√0.1552 = -0.394  (the negative root is taken, since both regression coefficients are negative)
Now we have to find the most likely marks in Statistics (Y) when the marks in Economics (X) are 30:
y = -0.6643 × 30 + 59.2576 = 39.3286 ≈ 39
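The whole computation collapses to a few lines of numpy; a sketch reproducing the example above (added for illustration):

```python
import numpy as np

# Regression coefficients for the marks data, as computed above.
X = np.array([25, 28, 35, 32, 31, 36, 29, 38, 34, 32])
Y = np.array([43, 46, 49, 41, 36, 32, 31, 30, 33, 39])

dx, dy = X - X.mean(), Y - Y.mean()
byx = (dx * dy).sum() / (dx * dx).sum()   # -93/140 = -0.6643
bxy = (dx * dy).sum() / (dy * dy).sum()   # -93/398 = -0.2337
r = -np.sqrt(byx * bxy)                   # sign taken from the coefficients
print(byx, bxy, r)
print(byx * (30 - X.mean()) + Y.mean())   # predicted Y at X = 30 -> ~39.33
```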
2.6 CORRELATION
Types of Correlation
Positive Correlation (if the two variables deviate in the same direction)
Negative Correlation (if the two variables constantly deviate in opposite directions)
2.7 KARL PEARSON'S COEFFICIENT OF CORRELATION
The correlation coefficient between two random variables X and Y, usually denoted by
r(X, Y), is a numerical measure of the linear relationship between them and is defined as
r(X, Y) = Cov(X, Y)/(σX · σY),
where Cov(X, Y) = (1/n) Σ XY - X̄ Ȳ,  X̄ = ΣX/n,  Ȳ = ΣY/n.
Example :2.6.1
Find the correlation coefficient for the following data:

X: 65  66  67  67  68  69  70  72
Y: 67  68  65  68  72  72  69  71

Solution
Take U = X - 68 and V = Y - 68.

U = X - 68:  -3  -2  -1  -1   0   1   2   4    ΣU = 0
V = Y - 68:  -1   0  -3   0   4   4   1   3    ΣV = 8
U²:           9   4   1   1   0   1   4  16    ΣU² = 36
UV:           3   0   3   0   0   4   2  12    ΣUV = 24
V²:           1   0   9   0  16  16   1   9    ΣV² = 52

Now
Ū = ΣU/n = 0/8 = 0,  V̄ = ΣV/n = 8/8 = 1
Cov(U, V) = ΣUV/n - Ū V̄ = 24/8 - 0 = 3   …(1)
σU = √(ΣU²/n - Ū²) = √(36/8 - 0) = 2.121   …(2)
σV = √(ΣV²/n - V̄²) = √(52/8 - 1) = 2.345   …(3)
r(X, Y) = r(U, V) = Cov(U, V)/(σU · σV) = 3/(2.121 × 2.345)   (by 1, 2, 3)
= 0.6031
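Since the change of origin U = X - 68, V = Y - 68 does not affect r, the result is directly checkable (an added one-line sketch):

```python
import numpy as np

# Correlation coefficient for the data above; shifting the origin
# (U = X - 68, V = Y - 68) leaves r unchanged.
X = np.array([65, 66, 67, 67, 68, 69, 70, 72])
Y = np.array([67, 68, 65, 68, 72, 72, 69, 71])
print(np.corrcoef(X, Y)[0, 1])   # ~0.6031
```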
Example :2.6.2
Let X be a random variable with p.d.f. f(x) = 1/2, -1 ≤ x ≤ 1, and let Y = X². Show that X
and Y are uncorrelated.
Solution
E(X) = ∫_{-1}^{1} x f(x) dx = ∫_{-1}^{1} (x/2) dx = 0
E(X) = 0
E(Y) = E(X²) = ∫_{-1}^{1} x² f(x) dx = ∫_{-1}^{1} (x²/2) dx = [x³/6]_{-1}^{1} = 1/3
E(XY) = E(X · X²) = E(X³) = ∫_{-1}^{1} (x³/2) dx = [x⁴/8]_{-1}^{1} = 0
E(XY) = 0
Cov(X, Y) = E(XY) - E(X) E(Y) = 0 - 0 · (1/3) = 0
r(X, Y) = ρ(X, Y) = Cov(X, Y)/(σX σY) = 0
Note : Since E(X) and E(XY) are equal to zero, Cov(X, Y) = 0 and we need not find σX and σY.
2.8 TRANSFORMS OF TWO DIMENSIONAL RANDOM VARIABLE
Formula:
fUV(u, v) = fXY(x, y) |∂(x, y)/∂(u, v)|
Example : 1
If the joint pdf of (X, Y) is given by fXY(x, y) = x + y, 0 ≤ x, y ≤ 1, find the pdf of U = XY.
Solution
Given fXY(x, y) = x + y and U = XY. Let V = Y.
Then x = u/v and y = v.
∂x/∂u = 1/v, ∂x/∂v = -u/v², ∂y/∂u = 0, ∂y/∂v = 1
J = ∂(x, y)/∂(u, v) = (1/v)(1) - (-u/v²)(0) = 1/v,  |J| = 1/|v|   …(1)
fUV(u, v) = fXY(x, y) |J| = (x + y)(1/v) = (u/v + v)(1/v)   …(2)
The range of v : since 0 ≤ y ≤ 1 and v = y, we have 0 ≤ v ≤ 1.
The range of u : since 0 ≤ x ≤ 1 and x = u/v, we have 0 ≤ u ≤ v.   …(3)
Hence the p.d.f. of (U, V) is given by
fUV(u, v) = (u/v + v)(1/v),  0 ≤ u ≤ v, 0 ≤ v ≤ 1
Now the p.d.f. of U = XY is
fU(u) = ∫ fUV(u, v) dv = ∫_u^1 (u/v² + 1) dv
      = [-u/v + v]_u^1 = (1 - u) - (-1 + u) = 2(1 - u),  0 < u < 1
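The derived density of U = XY can be verified by simulation; a Monte Carlo sketch (added for illustration — rejection sampling works here because f(x, y) = x + y is bounded by 2 on the unit square):

```python
import numpy as np

# Monte Carlo check that U = XY has density 2(1 - u) when (X, Y) has
# joint density f(x, y) = x + y on the unit square (derived above).
rng = np.random.default_rng(0)
x, y, v = rng.random((3, 1_000_000))
keep = 2 * v <= x + y              # rejection sampling: accept w.p. f(x,y)/2
u = x[keep] * y[keep]

hist, edges = np.histogram(u, bins=10, range=(0, 1), density=True)
mids = (edges[:-1] + edges[1:]) / 2
print(np.c_[mids, hist, 2 * (1 - mids)])   # empirical vs theoretical 2(1 - u)
```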
TUTORIAL QUESTIONS
1. The joint pdf of the r.v.s X and Y is given by f(x, y) = 3(x + y), 0 < x < 1, 0 < y < 1,
x + y < 1, and 0 otherwise. Find (i) the marginal pdfs of X and Y and (ii) Cov(X, Y).
2. Obtain the correlation coefficient for the following data:

X: 68  64  75  50  64  80  75  40  55  64
Y: 62  58  68  45  81  60  48  48  50  70

3. The two lines of regression are 8X - 10Y + 66 = 0, 40X - 18Y - 214 = 0. The variance of X
is 9. Find (i) the mean values of X and Y, (ii) the correlation coefficient between X and Y.
4. If X1, X2, …, Xn are Poisson variates with parameter λ = 2, use the central limit theorem
to find P(120 ≤ Sn ≤ 160) where Sn = X1 + X2 + … + Xn and n = 75.
5. If the joint probability density function of a two dimensional random variable (X, Y) is
given by f(x, y) = x² + xy/3, 0 < x < 1, 0 < y < 2; 0 elsewhere, find (i) P(X > 1/2),
(ii) P(Y < X) and (iii) P(Y < 1/2 / X < 1/2).
6. Two random variables X and Y have joint density. Find Cov(X, Y).
7. If the equations of the two lines of regression of Y on X and X on Y are respectively
7x - 16y + 9 = 0 and 5y - 4x - 3 = 0, calculate the coefficient of correlation.
WORKEDOUT EXAMPLES
Example 1
The joint p.d.f. of the random variables X and Y is given by
f(x, y) = 8xy, 0 < x < 1, 0 < y < x; 0 otherwise.
Find f(y/x).
Solution
The marginal p.d.f. of X is f(x) = ∫_0^x 8xy dy = 4x³, 0 < x < 1.
f(y/x) = f(x, y)/f(x) = 8xy/4x³ = 2y/x², 0 < y < x, 0 < x < 1
Example 2
Let X be a random variable with p.d.f. f(x) = 1/2, -1 ≤ x ≤ 1, and let Y = X². Show that X
and Y are uncorrelated.
Solution
E(X) = ∫_{-1}^{1} x · (1/2) dx = 0
E(Y) = E(X²) = ∫_{-1}^{1} x² · (1/2) dx = 1/3
E(XY) = E(X³) = ∫_{-1}^{1} x³ · (1/2) dx = [x⁴/8]_{-1}^{1} = 0
Cov(X, Y) = E(XY) - E(X) E(Y) = 0, so r(X, Y) = ρ(X, Y) = 0.
Note : Since E(X) and E(XY) are zero, we need not find σX and σY.
Result
Marginal p.d.f. of X: g(x) = 4x³, 0 < x < 1
Conditional p.d.f.: f(y/x) = 2y/x², 0 < y < x
UNIT - III
RANDOM PROCESSES
Introduction
Random Variable: a function of the possible outcomes of an experiment, X(s); an outcome is
mapped into a number x.
Random Process: a function of the possible outcomes of an experiment and also of time, i.e.,
X(s, t); outcomes are mapped into waveforms which are functions of time t.
Classification of Random Processes
Depending on whether the state space (the values of X) and the time parameter t are
discrete or continuous, a random process may be:
(i)   discrete random sequence (discrete state, discrete time)
(ii)  continuous random sequence (continuous state, discrete time)
(iii) discrete random process (discrete state, continuous time)
(iv)  continuous random process (continuous state, continuous time)
A random process {X(t)} is stationary to order one if its first-order density does not change
with a shift in the time origin, i.e., f(x, t) = f(x, t + C) for any C. In that case the mean is
constant:
E[X(t1)] = ∫ x f(x, t1) dx = ∫ x f(x, t1 + C) dx = E[X(t2)]
Thus E[X(t1)] = E[X(t2)] = a constant.
A random process is stationary to order two if its second-order density satisfies
f(x1, x2; t1, t2) = f(x1, x2; t1 + C, t2 + C) for all x1, x2 and C.
For such a process E(X1²), E(X2²) and E(X1X2) do not change with time, where
X1 = X(t1), X2 = X(t2).
3.3.5 Strongly Stationary Process
A random process is called a strongly stationary process or Strict Sense
Stationary
Process (SSS Process) if all its finite dimensional distribution are invariance
under translation of time 't'.
fX(x1, x2; t1, t2) = fX(x1, x2; t1+C, t2+C)
fX(x1, x2, x3; t1, t2, t3) = fX(x1, x2, x3; t1+C,
t2+C, t3+C) In general
fX(x1, x2..xn; t1, t2tn) = fX(x1, x2..xn; t1+C, t2+C..tn+C) for any t1 and any
real number
C.
3.3.6 Jointly - Stationary in the Strict Sense
{X(t)} and Y{(t)} are said to be jointly stationary in the strict sense, if the
joint distribution of X(t) and Y(t) are invariant under translation of time.
Definition Mean:
μX(t) = E[X(t)],  -∞ < t < ∞
μX(t) is also called the mean function or ensemble average of the random process.
Autocorrelation
Let X(t1) and X(t2) be two given members of the random process {X(t)}. The auto
correlation is
RXX(t1, t2) = E{X(t1) X(t2)}
Auto covariance
CXX(t1, t2) = E{[X(t1) - E(X(t1))][X(t2) - E(X(t2))]}
            = RXX(t1, t2) - E[X(t1)] E[X(t2)]
Correlation Coefficient
The correlation coefficient of the random process {X(t)} is defined as
ρXX(t1, t2) = CXX(t1, t2)/√(Var X(t1) × Var X(t2))
f ( ) = 2
,0 < C < 2
0 ,otherwise
E[X(t)]
X (t )f ( ) d
= 2 A ( t + ) 2 d
0
2
= A sin ( t + )
0
2
= A Sin (2 + t ) Sin (t + 0)
2
A
2 [Sin t sin t]
= 0 constant
Since E[X(t)] = a constant, the process X(t) is a stationary random process.
Example:3.6.2 (a process which is not stationary)
Examine whether the Poisson process {X(t)}, given by the probability law
P{X(t) = n} = e^{-λt}(λt)^n/n!, n = 0, 1, 2, …,
is stationary.
Solution
The mean is given by
E[X(t)] = Σ_{n=0}^{∞} n Pn(t) = Σ_{n=0}^{∞} n e^{-λt}(λt)^n/n!
        = e^{-λt} Σ_{n=1}^{∞} (λt)^n/(n - 1)!
        = e^{-λt}(λt)[1 + λt + (λt)²/2! + …]
        = (λt) e^{-λt} e^{λt}
        = λt, which depends on t.
Hence the Poisson process is not a stationary process.
3.7 ERGODIC RANDOM PROCESS
Time Average
The time average of a random process {X(t)} is defined as
X̄T = (1/2T) ∫_{-T}^{T} X(t) dt
Ensemble Average
The ensemble average of a random process {X(t)} is the expected value of the random
variable X at time t:
Ensemble Average = E[X(t)]
Mean-Ergodic Random Process
{X(t)} is said to be mean ergodic if
lim_{T→∞} X̄T = μ, i.e., lim_{T→∞} (1/2T) ∫_{-T}^{T} X(t) dt = μ
and lim_{T→∞} Var(X̄T) = 0.
Markov Process
A random process {X(t)} is called a Markov process if
P[X(t_{n+1}) ≤ x_{n+1} / X(t_n) = x_n, X(t_{n-1}) = x_{n-1}, …, X(t_0) = x_0]
   = P[X(t_{n+1}) ≤ x_{n+1} / X(t_n) = x_n]
where t_0 ≤ t_1 ≤ t_2 ≤ … ≤ t_n ≤ t_{n+1}.
Examples of Markov Processes
1. The probability of raining today depends only on the previous weather conditions existing
for the last two days and not on past weather conditions.
2. A difference equation is Markovian.
Classification of Markov Processes
(i)   Continuous parameter Markov process
(ii)  Discrete parameter Markov process
(iii) Continuous parameter Markov chain
(iv)  Discrete parameter Markov chain
POISSON PROCESS
The Poisson process {X(t)} is a counting process whose probability distribution is
P[X(t) = n] = Pn(t) = e^{-λt}(λt)^n/n!,  n = 0, 1, 2, …
Second-order probability function of the Poisson process: for t2 > t1,
P[X(t1) = n1, X(t2) = n2] = P[X(t1) = n1] · P[X(t2) - X(t1) = n2 - n1]
  = [e^{-λt1}(λt1)^{n1}/n1!] · [e^{-λ(t2-t1)}{λ(t2 - t1)}^{n2-n1}/(n2 - n1)!]
  = e^{-λt2} λ^{n2} t1^{n1} (t2 - t1)^{n2-n1} / [n1! (n2 - n1)!],  n2 ≥ n1
  = 0, otherwise
Note: a process taking only the two values +1 and -1 (such as the semi-random telegraph
signal) is an example of a discrete random process; its mean and autocorrelation follow from
P{X(t) = 1} and P{X(t) = -1} for any t.
TUTORIAL QUESTIONS
1. The t.p.m of a Markov chain with three states 0, 1, 2 is P = and the initial state
distribution is . Find (i) P[X2 = 3] (ii) P[X3 = 2, X2 = 3, X1 = 3, X0 = 2].
2. Three boys A, B, C are throwing a ball to each other. A always throws the ball to B and B
always throws the ball to C, but C is just as likely to throw the ball to B as to A. Show that
the process is Markovian. Find the transition matrix and classify the states.
3. A housewife buys 3 kinds of cereals A, B, C. She never buys the same cereal in successive
weeks. If she buys cereal A, the next week she buys cereal B. However, if she buys B or C,
the next week she is 3 times as likely to buy A as the other cereal. How often does she buy
each of the cereals?
4. A man either drives a car or catches a train to go to the office each day. He never goes
2 days in a row by train, but if he drives one day, then the next day he is just as likely to
drive again as he is to travel by train. Now suppose that on the first day of the week the man
tossed a fair die and drove to work if a 6 appeared. Find 1) the probability that he takes a
train on the 3rd day, 2) the probability that he drives to work in the long run.
WORKED OUT EXAMPLES
Example 1
Show that the process X(t) = A cos(ωt + θ), where θ is uniformly distributed over (0, 2π)
with f(θ) = 1/2π, 0 < θ < 2π; 0 otherwise, has a constant mean.
Solution
E[X(t)] = ∫ X(t) f(θ) dθ = ∫_0^{2π} A cos(ωt + θ)(1/2π) dθ
        = (A/2π)[sin(ωt + θ)]_0^{2π}
        = (A/2π)[sin(2π + ωt) - sin(ωt + 0)]
        = (A/2π)[sin ωt - sin ωt] = 0, a constant
Since E[X(t)] is a constant, the process X(t) satisfies the mean condition for a stationary
random process.
Example 2 (a process which is not stationary)
Examine whether the Poisson process {X(t)}, given by P{X(t) = n} = e^{-λt}(λt)^n/n!,
is stationary.
E[X(t)] = Σ_{n=0}^{∞} n Pn(t) = Σ_{n=0}^{∞} n e^{-λt}(λt)^n/n!
        = e^{-λt} Σ_{n=1}^{∞} (λt)^n/(n - 1)!
        = e^{-λt}(λt)[1 + λt + (λt)²/2! + …]
        = (λt) e^{-λt} e^{λt} = λt, which depends on t.
Hence the Poisson process is not a stationary process.
UNIT - 4
CORRELATION AND SPECTRAL DENSITY
Introduction
The power spectrum of a time series x(t) describes how the variance of the data x(t) is
distributed over the frequency components into which x(t) may be decomposed. This
distribution of the variance may be described either by a measure or by a statistical
cumulative distribution function S(f) = the power contributed by frequencies from 0 upto
f. Given a band of frequencies [a, b) the amount of variance contributed to x(t) by
frequencies lying within the interval [a,b) is given by S(b) - S(a). Then S is called the
spectral distribution function of x.
The spectral density at a frequency f gives the rate of variance contributed by
frequencies in the immediate neighbourhood of f to the variance of x per unit frequency.
4.1 Auto Correlation of a Random Process
Let X(t1) and X(t2) be the two given random variables. Then auto correlation is
RXX (t1, t2) = E[X(t1) X(t2)]
Mean Square Value
Putting t1 = t2 = t in (1):
RXX(t, t) = E[X(t) X(t)] = E[X²(t)],
which is called the mean square value of the random process.
Auto Correlation Function
Definition: The auto correlation function of the random process {X(t)} is
RXX(τ) = E{X(t) X(t + τ)}
Note: RXX(τ) is also written R(τ) or RX(τ).
PROPERTY: 1
The mean square value of the random process may be obtained from the auto correlation
function RXX(τ) by putting τ = 0:
E[X²(t)] = RXX(0), which is known as the average power of the random process {X(t)}.
PROPERTY: 2
RXX(τ) is an even function of τ:
RXX(τ) = RXX(-τ)
PROPERTY: 3
If the process X(t) contains a periodic component, then RXX(τ) also contains a periodic
component of the same period.
PROPERTY: 4
If a random process {X(t)} has no periodic components, and E[X(t)] = X̄, then
lim_{|τ|→∞} RXX(τ) = X̄²
Example : 1
Check whether the following functions are valid auto correlation functions:
(i) 5 sin nτ (ii) 1/(1 + 9τ²).
Solution:
(i) Given RXX(τ) = 5 sin nτ.
RXX(-τ) = 5 sin(-nτ) = -5 sin nτ ≠ RXX(τ); the given function is not an even function of τ
and hence is not a valid auto correlation function.
(ii) Given RXX(τ) = 1/(1 + 9τ²).
RXX(-τ) = 1/(1 + 9τ²) = RXX(τ), an even function, hence a valid auto correlation function.
Example : 2
Find the mean and variance of a stationary random process whose auto correlation
function is given by
RXX(τ) = 18 + 2/(6 + τ²)
Solution
Given RXX(τ) = 18 + 2/(6 + τ²)
X̄² = lim_{|τ|→∞} RXX(τ) = lim_{|τ|→∞} [18 + 2/(6 + τ²)] = 18 + 0 = 18
E[X(t)] = X̄ = √18
We know that E[X²(t)] = RXX(0) = 18 + 2/6 = 55/3
Var{X(t)} = E[X²(t)] - {E[X(t)]}² = 55/3 - 18 = 1/3
Example : 3
Express the autocorrelation function of the process {X'(t)} in terms of the auto correlation
function of the process {X(t)}.
Solution
Consider RXX'(t1, t2) = E{X(t1) X'(t2)}
= E{X(t1) lim_{h→0} [X(t2 + h) - X(t2)]/h}
= lim_{h→0} E{[X(t1)X(t2 + h) - X(t1)X(t2)]/h}
= lim_{h→0} [RXX(t1, t2 + h) - RXX(t1, t2)]/h
RXX'(t1, t2) = ∂RXX(t1, t2)/∂t2   …(1)
Similarly, RX'X'(t1, t2) = ∂RXX'(t1, t2)/∂t1 = ∂²RXX(t1, t2)/∂t1 ∂t2   [by (1)]
Auto Covariance
The auto covariance of the process {X(t)}, denoted by CXX(t1, t2) or C(t1, t2), is defined as
CXX(t1, t2) = E{[X(t1) - E(X(t1))][X(t2) - E(X(t2))]}
            = RXX(t1, t2) - E[X(t1)] E[X(t2)]
Cross Correlation and Cross Covariance
The cross correlation of two processes {X(t)} and {Y(t)} is RXY(t1, t2) = E[X(t1) Y(t2)], and
the cross covariance is
CXY(t1, t2) = E{[X(t1) - E(X(t1))][Y(t2) - E(Y(t2))]}
The relation between the cross correlation and the cross covariance is:
CXY(t1, t2) = RXY(t1, t2) - E[X(t1)] E[Y(t2)]
Definition
Two random processes {X(t)} and {Y(t)} are said to be uncorrelated if
CXY(t1, t2) = 0 for all t1, t2.
The normalized cross covariance is
ρXY(t1, t2) = CXY(t1, t2)/√(Var(X(t1)) Var(Y(t2)))
PROPERTY : 3
If {X(t)} and {Y(t)} are two random processes, then
|RXY(τ)| ≤ [RXX(0) + RYY(0)]/2
Example
Two random processes {X(t)} and {Y(t)} are given by X(t) = A cos(ωt + θ),
Y(t) = A sin(ωt + θ), where θ is uniformly distributed with f(θ) = 1/2π, 0 ≤ θ ≤ 2π. Find
RXY(t, t + τ).
RXY(t, t + τ) = E[A sin(ω(t + τ) + θ) · A cos(ωt + θ)]
= A² ∫_0^{2π} sin(ωt + ωτ + θ) cos(ωt + θ)(1/2π) dθ
= (A²/2π) ∫_0^{2π} (1/2){sin(2ωt + ωτ + 2θ) + sin ωτ} dθ
= (A²/4π) [-cos(2ωt + ωτ + 2θ)/2 + θ sin ωτ]_0^{2π}
= (A²/4π) [(cos(2ωt + ωτ) - cos(2ωt + ωτ + 4π))/2 + 2π sin ωτ]
= (A²/4π) [0 + 2π sin ωτ]
RXY(t, t + τ) = (A²/2) sin ωτ   …(3)
4.5 SPECTRAL DENSITIES (POWER SPECTRAL DENSITY)
INTRODUCTION
(i) Fourier Transformation
(ii) Inverse Fourier Transform
(iii) Properties of Auto Correlation Function
(iv)Basic Trigonometric Formula
(v) Basic Integration
4.5.1 SPECTRAL REPRESENTATION
Let x(t) be a deterministic signal. The Fourier transform of x(t) is defined as
F[x(t)] = X(ω) = ∫_{-∞}^{∞} x(t) e^{-iωt} dt
and inversely
x(t) = (1/2π) ∫_{-∞}^{∞} X(ω) e^{iωt} dω.
Definition
The average power P(T) of x(t) over the interval (-T, T) is given by
P(T) = (1/2T) ∫_{-T}^{T} x²(t) dt = (1/2π) ∫_{-∞}^{∞} |XT(ω)|²/(2T) dω   …(1)
Definition
The average power PXX for the random process {X(t)} is given by
PXX = lim_{T→∞} (1/2T) ∫_{-T}^{T} E[X²(t)] dt
    = (1/2π) ∫_{-∞}^{∞} lim_{T→∞} E[|XT(ω)|²]/(2T) dω   …(2)
The power spectral density of a WSS process {X(t)} is
SXX(ω) = ∫_{-∞}^{∞} RXX(τ) e^{-iωτ} dτ,  or  SXX(f) = ∫_{-∞}^{∞} RXX(τ) e^{-i2πfτ} dτ
and, inversely,
RXX(τ) = (1/2π) ∫_{-∞}^{∞} SXX(ω) e^{iωτ} dω  (or)  RXX(τ) = ∫_{-∞}^{∞} SXX(f) e^{i2πfτ} df
The value of the spectral density function at zero frequency is equal to the total area
under the graph of the auto correlation function:
SXX(0) = ∫_{-∞}^{∞} RXX(τ) dτ   [taking f = 0 in SXX(f) = ∫ RXX(τ) e^{-i2πfτ} dτ]
TUTORIAL QUESTIONS
1. Find the ACF of {Y(t)} = A X(t) cos(ω0t + θ), where X(t) is a zero mean stationary random
process with ACF RXX(τ), A and ω0 are constants, and θ is uniformly distributed over (0, 2π)
and independent of X(t).
2. Find the ACF of the periodic time function X(t) = A sin ωt.
3. If X(t) is a WSS process and if Y(t) = X(t + a) - X(t - a), prove that
RYY(τ) = 2RXX(τ) - RXX(τ + 2a) - RXX(τ - 2a).
4. If X(t) = A sin(ωt + φ), where A and ω are constants and φ is a random variable uniformly
distributed over (-π, π), find the ACF of X(t).
5. Let X(t) and Y(t) be defined by X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt - A sin ωt,
where ω is a constant and A and B are independent random variables both having zero mean
and variance σ². Find the cross correlation of X(t) and Y(t). Are X(t) and Y(t) jointly
W.S.S. processes?
6. Two random processes X(t) and Y(t) are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ),
where A and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross
correlation of X(t) and Y(t) and verify that RXY(-τ) = RYX(τ).
7. If U(t) = X cos t + Y sin t and V(t) = Y cos t + X sin t, where X and Y are independent random
variables such that E(X) = 0 = E(Y), E[X²] = E[Y²] = 1, show that U(t) and V(t) are not jointly
W.S.S. but they are individually stationary in the wide sense.
8. Random processes X(t) and Y(t) are defined by X(t) = A cos(ωt + θ), Y(t) = B cos(ωt + θ),
where A, B and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross
correlation and show that X(t) and Y(t) are jointly W.S.S.
WORKEDOUT EXAMPLES
Example 1. Check whether the following functions are valid auto correlation functions:
(i) 5 sin nτ (ii) 1/(1 + 9τ²).
Solution:
(i) Given RXX(τ) = 5 sin nτ.
RXX(-τ) = 5 sin(-nτ) = -5 sin nτ ≠ RXX(τ); the function is not even in τ and hence is not a
valid auto correlation function.
(ii) Given RXX(τ) = 1/(1 + 9τ²).
RXX(-τ) = 1/(1 + 9τ²) = RXX(τ), an even function, hence a valid auto correlation function.
Example : 2
Find the mean and variance of a stationary random process whose auto correlation
function is given by
RXX(τ) = 18 + 2/(6 + τ²)
Solution
X̄² = lim_{|τ|→∞} RXX(τ) = 18 + 0 = 18 ⇒ E[X(t)] = √18
E[X²(t)] = RXX(0) = 18 + 2/6 = 55/3
Var{X(t)} = 55/3 - 18 = 1/3
Example : 3
Express the autocorrelation function of the process {X'(t)} in terms of the auto correlation
function of the process {X(t)}.
Solution
RXX'(t1, t2) = E{X(t1) X'(t2)} = E{X(t1) lim_{h→0} [X(t2 + h) - X(t2)]/h}
= lim_{h→0} E{[X(t1)X(t2 + h) - X(t1)X(t2)]/h}
= lim_{h→0} [RXX(t1, t2 + h) - RXX(t1, t2)]/h
RXX'(t1, t2) = ∂RXX(t1, t2)/∂t2   …(1)
Similarly, RX'X'(t1, t2) = ∂²RXX(t1, t2)/∂t1 ∂t2   [by (1)]
Example :4
Two random processes {X(t)} and {Y(t)} are given by X(t) = A cos(ωt + θ),
Y(t) = A sin(ωt + θ), where A and ω are constants and θ is a uniform random variable over
0 to 2π. Find the cross correlation function.
Solution
By definition, RXY(τ) = RXY(t, t + τ)
Now, RXY(t, t + τ) = E[X(t) · Y(t + τ)]
= E[A cos(ωt + θ) · A sin(ω(t + τ) + θ)]
= A² E[sin(ω(t + τ) + θ) cos(ωt + θ)]
Since θ is a uniformly distributed random variable, we have f(θ) = 1/2π, 0 ≤ θ ≤ 2π. Hence
= A² ∫_0^{2π} sin(ωt + ωτ + θ) cos(ωt + θ)(1/2π) dθ
= (A²/2π) ∫_0^{2π} (1/2){sin(2ωt + ωτ + 2θ) + sin ωτ} dθ
= (A²/4π) [-cos(2ωt + ωτ + 2θ)/2 + θ sin ωτ]_0^{2π}
= (A²/4π) [(cos(2ωt + ωτ) - cos(2ωt + ωτ + 4π))/2 + 2π sin ωτ]
= (A²/4π) [0 + 2π sin ωτ]
RXY(t, t + τ) = (A²/2) sin ωτ   …(3)
UNIT 5
LINEAR SYSTEM WITH RANDOM INPUTS
Introduction
5.1 LINEAR SYSTEM
If a system operator f satisfies
f[a1X1(t) + a2X2(t)] = a1 f[X1(t)] + a2 f[X2(t)],
then f is called a linear system.
5.2 TIME INVARIANT SYSTEM
Let Y(t) = f[X(t)]. If Y(t + h) = f[X(t + h)], then f is called a time invariant system, or X(t)
and Y(t) are said to form a time invariant system.
(a) A general single input - output linear system:
Input X(t) → [Linear System h(t)] → Output Y(t)
(b) A linear time invariant (LTI) system:
Input X(t) → [LTI System h(t)] → Output Y(t)
5.3 REPRESENTATION OF SYSTEM IN THE FORM OF CONVOLUTION
Y(t) = h(t) * X(t) = ∫_{-∞}^{∞} h(u) X(t - u) du = ∫_{-∞}^{∞} h(t - u) X(u) du
Property 1:
If Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then the system is a linear time - invariant system.
Property 2:
If the input to a time - invariant, stable linear system is a WSS process, then the output
will also be a WSS process, i.e., if {X(t)} is a WSS process then the output {Y(t)} is a WSS
process.
With RXY(τ) = E[X(t) Y(t + τ)]:
Property 3:
If {X(t)} is a WSS process and Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then
RXY(τ) = RXX(τ) * h(τ)
Property 4:
If {X(t)} is a WSS process and Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then
RYY(τ) = RXY(τ) * h(-τ)
Property 5:
If {X(t)} is a WSS process and Y(t) = ∫_{-∞}^{∞} h(u) X(t - u) du, then
RYY(τ) = RXX(τ) * h(τ) * h(-τ)
Property 6:
SXY(ω) = SXX(ω) H(ω)
Property 7:
SYY(ω) = SXX(ω) |H(ω)|²
Note:
Instead of taking RXY(τ) = E[X(t) Y(t + τ)] one may take RXY(τ) = E[X(t + τ) Y(t)], in which
case the roles of h(τ) and h(-τ) interchange:
a) RXY(τ) = RXX(τ) * h(-τ)
b) RYY(τ) = RXY(τ) * h(τ)
c) RYY(τ) = RXX(τ) * h(τ) * h(-τ)
REMARK :
(i) We have written H(ω) H*(ω) = |H(ω)|² because
H(ω) = F[h(τ)] and H*(ω) = F[h(-τ)]
(ii) Equation (c) gives a relationship between the spectral densities of the input and output
processes in the system.
(iii) System transfer function:
We call H(ω) = F{h(τ)} the power transfer function or system transfer function.
SOLVED PROBLEMS ON AUTO CROSS CORRELATION FUNCTIONS OF INPUT
AND OUTPUT
Example :5.4.1
Find the power spectral density of the random telegraph signal.
Solution
We know the auto correlation of the telegraph signal process X(t) is
RXX(τ) = e^{-2λ|τ|}
SXX(ω) = ∫_{-∞}^{∞} RXX(τ) e^{-iωτ} dτ
       = ∫_{-∞}^{0} e^{2λτ} e^{-iωτ} dτ + ∫_{0}^{∞} e^{-2λτ} e^{-iωτ} dτ   [|τ| = -τ for τ < 0, |τ| = τ for τ > 0]
       = ∫_{-∞}^{0} e^{(2λ-iω)τ} dτ + ∫_{0}^{∞} e^{-(2λ+iω)τ} dτ
       = [e^{(2λ-iω)τ}/(2λ - iω)]_{-∞}^{0} + [e^{-(2λ+iω)τ}/(-(2λ + iω))]_{0}^{∞}
       = 1/(2λ - iω) (1 - 0) - 1/(2λ + iω) (0 - 1)
       = 1/(2λ - iω) + 1/(2λ + iω)
       = (2λ + iω + 2λ - iω)/[(2λ - iω)(2λ + iω)]
SXX(ω) = 4λ/(4λ² + ω²)
Example : 5.4.2
A linear time invariant system has an impulse response h(t) = e^{-βt} U(t). Find the
power spectral density of the output Y(t) corresponding to the input X(t).
Solution:
Given X(t) - input, Y(t) - output.
SYY(ω) = |H(ω)|² SXX(ω)
Now H(ω) = ∫_{-∞}^{∞} h(t) e^{-iωt} dt
         = ∫_{0}^{∞} e^{-βt} e^{-iωt} dt
         = ∫_{0}^{∞} e^{-(β+iω)t} dt
         = [e^{-(β+iω)t}/(-(β + iω))]_{0}^{∞}
         = 1/(β + iω)
|H(ω)|² = 1/(β² + ω²)
SYY(ω) = SXX(ω)/(β² + ω²)
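The transfer function computed above is easy to verify numerically; a brief sketch (added here with illustrative values of β and ω):

```python
import numpy as np
from scipy.integrate import quad

# Numerical check of H(omega) = 1/(beta + i*omega) for h(t) = e^{-beta t} U(t),
# and hence |H|^2 = 1/(beta^2 + omega^2).  beta and omega are illustrative.
beta, omega = 2.0, 3.0
re, _ = quad(lambda t: np.exp(-beta * t) * np.cos(omega * t), 0, np.inf)
im, _ = quad(lambda t: -np.exp(-beta * t) * np.sin(omega * t), 0, np.inf)
H = re + 1j * im
print(abs(H) ** 2, 1 / (beta**2 + omega**2))   # should agree
```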
TUTORIAL QUESTIONS
1. State and prove the power spectral density of system response theorem.
2. Suppose that X(t) is the input to an LTI system with impulse response h1(t) and that Y(t)
is the input to another LTI system with impulse response h2(t). It is assumed that X(t) and
Y(t) are jointly wide sense stationary. Let V(t) and Z(t) denote the random processes at the
respective system outputs. Find the cross correlation of X(t) and Y(t).
3. The input to the RC filter is a white noise process with the given ACF. If the frequency
response is as given, find the auto correlation and the mean square value of the output
process Y(t).
4. A random process X(t) having the given ACF, where P and α are real positive constants, is
applied to the input of the system with impulse response h(t) = e^{-bt} for t > 0 and 0 for
t < 0, where b is a real positive constant. Find the ACF of the network's response Y(t) and
the cross correlation.
WORKED OUT EXAMPLES
Example: 1
Find the power spectral density of the random telegraph signal.
Solution
We know the auto correlation of the telegraph signal process X(t) is RXX(τ) = e^{-2λ|τ|}.
SXX(ω) = ∫_{-∞}^{∞} RXX(τ) e^{-iωτ} dτ
       = ∫_{-∞}^{0} e^{(2λ-iω)τ} dτ + ∫_{0}^{∞} e^{-(2λ+iω)τ} dτ
       = 1/(2λ - iω) + 1/(2λ + iω)
SXX(ω) = 4λ/(4λ² + ω²)
FORMULA SHEET
UNIT I : RANDOM VARIABLES
Discrete R.V.                                    Continuous R.V.
1) Σ p(xi) = 1                                   1) ∫ f(x) dx = 1
2) F(x) = P[X ≤ x] = Σ_{xi ≤ x} p(xi)            2) F(x) = P[X ≤ x] = ∫_{-∞}^{x} f(t) dt
3) Mean = E[X] = Σ xi p(xi)                      3) Mean = E[X] = ∫ x f(x) dx
4) E[X²] = Σ xi² p(xi)                           4) E[X²] = ∫ x² f(x) dx
5) Var[X] = E[X²] - (E[X])²                      5) Var[X] = E[X²] - (E[X])²
6) r-th moment = E[X^r] = Σ xi^r p(xi)           6) r-th moment = E[X^r] = ∫ x^r f(x) dx
7) M.G.F: MX(t) = E[e^{tX}] = Σ e^{tx} p(x)      7) M.G.F: MX(t) = E[e^{tX}] = ∫ e^{tx} f(x) dx
Further results:
4) E[aX + b] = a E[X] + b
5) Var[aX + b] = a² Var[X]
6) Var[aX ± bY] = a² Var[X] + b² Var[Y] (X, Y independent)
7) Standard Deviation = √Var[X]
8) f(x) = F'(x)
9) P(X > a) = 1 - P(X ≤ a)
10) P(A/B) = P(A ∩ B)/P(B), P(B) ≠ 0
11) Mean = μ'_1 = [d/dt MX(t)]_{t=0}
12) In general μ'_r (the r-th raw moment) = [d^r/dt^r MX(t)]_{t=0} = coefficient of t^r/r! in MX(t).
13) Limitations of M.G.F:
i) A random variable X may have no moments although its m.g.f exists.
ii) A random variable X can have its m.g.f and some or all moments, yet the m.g.f does not
generate the moments.
iii) A random variable X can have all or some moments, but the m.g.f does not exist except
perhaps at one point.
14) Properties of M.G.F:
i) If Y = aX + b, then MY(t) = e^{bt} MX(at).
ii) M_{cX}(t) = MX(ct), c a constant.
iii) If X and Y are independent, then M_{X+Y}(t) = MX(t) MY(t).
Standard distributions, their M.G.F, mean and variance:

Distribution   P(X = x) or f(x)                       M.G.F                      Mean       Variance
Binomial       nCx p^x q^{n-x}                        (q + pe^t)^n               np         npq
Poisson        e^{-λ} λ^x / x!                        e^{λ(e^t - 1)}             λ          λ
Geometric      q^{x-1} p (or q^x p)                   pe^t/(1 - qe^t)            1/p        q/p²
Uniform        f(x) = 1/(b - a), a < x < b;           (e^{bt} - e^{at})/[(b-a)t] (a + b)/2  (b - a)²/12
               0 otherwise
Exponential    f(x) = λe^{-λx}, x ≥ 0, λ > 0;         λ/(λ - t)                  1/λ        1/λ²
               0 otherwise
Gamma          f(x) = e^{-x} x^{λ-1}/Γ(λ),            (1 - t)^{-λ}               λ          λ
               0 ≤ x < ∞, λ > 0
Normal         f(x) = [1/(σ√2π)] e^{-(x-μ)²/2σ²}      e^{μt + σ²t²/2}            μ          σ²

17) Function of a random variable: fY(y) = fX(x) |dx/dy|
UNIT II : TWO DIMENSIONAL RANDOM VARIABLES
1) ΣΣ pij = 1 (discrete);  ∫∫ f(x, y) dx dy = 1 (continuous)
2) Conditional probability function of X given Y: P[X = xi / Y = yj] = P(x, y)/P(y)
   Conditional probability function of Y given X: P[Y = yj / X = xi] = P(x, y)/P(x)
   P[X ≤ a / Y = b] = P(X ≤ a, Y = b)/P(Y = b)
3) Conditional density functions: f(x/y) = f(x, y)/f(y),  f(y/x) = f(x, y)/f(x)
4) P(X < a, Y < b) = ∫_0^b ∫_0^a f(x, y) dx dy (for non-negative variables)
5) Marginal densities: fX(x) = ∫ f(x, y) dy,  fY(y) = ∫ f(x, y) dx
6) P(X + Y ≥ 1) = 1 - P(X + Y < 1)
7) Correlation coefficient (discrete): ρ(x, y) = Cov(X, Y)/(σX σY), where
   Cov(X, Y) = (1/n) Σ XY - X̄ Ȳ, σX = √((1/n) Σ X² - X̄²), σY = √((1/n) Σ Y² - Ȳ²)
8) Correlation coefficient (continuous): ρ(x, y) = Cov(X, Y)/(σX σY), where
   Cov(X, Y) = E[XY] - E[X] E[Y], σX = √Var(X), σY = √Var(Y),
   E[XY] = ∫∫ xy f(x, y) dx dy
9) Lines of regression:
   X on Y: x - x̄ = r(σx/σy)(y - ȳ);  Y on X: y - ȳ = r(σy/σx)(x - x̄)
   Regression coefficients: byx = r σy/σx, bxy = r σx/σy, r = ±√(bxy byx)
10) Regression curve of X on Y: E(x/y) = ∫ x f(x/y) dx
    Regression curve of Y on X: E(y/x) = ∫ y f(y/x) dy
14) Transformation of random variables:
    (One dimensional) fY(y) = fX(x) |dx/dy|
    (Two dimensional) fUV(u, v) = fXY(x, y) |∂(x, y)/∂(u, v)|
15) Central limit theorem (Liapounoff's form)
If X1, X2, …, Xn are independent random variables with means μ1, μ2, …, μn and variances
σ1², σ2², …, σn², then under certain conditions the sum Sn = X1 + X2 + … + Xn follows, as
n → ∞, a normal distribution with mean Σ μi and variance Σ σi².
16) Central limit theorem (Lindeberg-Levy's form)
If X1, X2, …, Xn is a sequence of independent identically distributed R.V.s with E[Xi] = μ and
Var[Xi] = σ², then Sn = X1 + X2 + … + Xn follows, as n → ∞, a normal distribution with mean
nμ and variance nσ².
Note: z = (Sn - nμ)/(σ√n) (for n variables), z = (X̄ - μ)/(σ/√n) (for the sample mean).
UNIT III : RANDOM PROCESSES
5) Time average: X̄T = (1/2T) ∫_{-T}^{T} X(t) dt
7) Mean ergodic:
Let X(t) be a random process with mean E[X(t)] = μ and time average X̄T. Then {X(t)} is
mean ergodic if X̄T → μ as T → ∞, i.e., lim_{T→∞} Var(X̄T) = 0.
Markov property:
P[X(t_{n+1}) ≤ x_{n+1} / X(t_n) = x_n, …, X(t_0) = x_0] = P[X(t_{n+1}) ≤ x_{n+1} / X(t_n) = x_n]
In terms of discrete states:
P[Xn = an / X_{n-1} = a_{n-1}, X_{n-2} = a_{n-2}, …] = P[Xn = an / X_{n-1} = a_{n-1}],
with n-step transition probabilities Pij(n).
Poisson process: P[X(t) = x] = e^{-λt}(λt)^x/x!,  x = 0, 1, 2, …
E[X(t)] = λt, Var[X(t)] = λt.
UNIT IV : CORRELATION AND SPECTRAL DENSITIES
1) SXX(ω) = ∫_{-∞}^{∞} RXX(τ) e^{-iωτ} dτ
2) RXX(τ) = (1/2π) ∫_{-∞}^{∞} SXX(ω) e^{iωτ} dω
3) CXX(τ) = RXX(τ) - E[X(t)] E[X(t + τ)]
4) Cross power spectrum to cross correlation:
RXY(τ) = (1/2π) ∫_{-∞}^{∞} SXY(ω) e^{iωτ} dω
5) General formulas:
i)   ∫ e^{ax} cos bx dx = e^{ax}(a cos bx + b sin bx)/(a² + b²)
ii)  ∫ e^{ax} sin bx dx = e^{ax}(a sin bx - b cos bx)/(a² + b²)
iii) ∫ x e^{ax} dx = e^{ax}(x/a - 1/a²)
iv)  sin θ = (e^{iθ} - e^{-iθ})/2i
v)   cos θ = (e^{iθ} + e^{-iθ})/2
UNIT V : LINEAR SYSTEMS WITH RANDOM INPUTS
1) Linear system: f[a1X1(t) + a2X2(t)] = a1 f[X1(t)] + a2 f[X2(t)]
2) Time invariant system: Let Y(t) = f[X(t)]. If Y(t + h) = f[X(t + h)], then f is called a
time invariant system.
3) Convolution form: Y(t) = ∫ h(u) X(t - u) du
4) H(ω) = ∫ h(t) e^{-iωt} dt (system transfer function)
5) Contour integral: ∫_{-∞}^{∞} e^{imx}/(x² + a²) dx = (π/a) e^{-ma} (one of the results)
6) F^{-1}[1/(a² + ω²)] = e^{-a|τ|}/(2a) (one of the results)
7) A system is stable if ∫_{-∞}^{∞} |h(t)| dt < ∞
QUESTION BANK (PART B)
UNIT I
3. Let the random variable X have the PDF f(x) = (1/2) e^{-x/2}, x > 0. Find the moment
generating function, mean and variance.
4. A die is tossed until 6 appears. What is the probability that it must be tossed more than
4 times?
5. A man draws 3 balls from an urn containing 5 white and 7 black balls. He gets Rs. 10 for
each white ball and Rs. 5 for each black ball. Find his expectation.
6. In a certain binary communication channel, owing to noise, the probability that a
transmitted zero is received as zero is 0.95 and the probability that a transmitted one is
received as one is 0.9. If the probability that a zero is transmitted is 0.4, find the
probability that (i) a one is received, (ii) a one was transmitted given that a one was received.
7. Find the MGF and the r-th moment for the distribution whose PDF is f(x) = k e^{-x}, x > 0.
Find also the standard deviation.
8. The first bag contains 3 white balls, 2 red balls and 4 black balls. The second bag contains
2 white, 3 red and 5 black balls, and the third bag contains 3 white, 4 red and 2 black balls.
One bag is chosen at random and from it 3 balls are drawn. Out of the 3 balls, 2 balls are
white and 1 is red. What are the probabilities that they were taken from the first bag, the
second bag and the third bag?
9. A random variable X has the PDF f(x) = 2x, 0 < x < 1. Find (i) P(X < 1/2),
(ii) P(1/4 < X < 1/2), (iii) P(X > 3/4 / X > 1/2).
10. If the density function of a continuous random variable X is given by
f(x) = ax, 0 ≤ x ≤ 1;  a, 1 ≤ x ≤ 2;  3a - ax, 2 ≤ x ≤ 3;  0, otherwise,
(1) find a, (2) find the cdf of X.
15. A box contains 5 red and 4 white balls. A ball from the box is taken out at random and
kept outside. If once again a ball is drawn from the box, what is the probability that the
drawn ball is red?
16. A discrete random variable X has moment generating function MX(t) = (1/4 + (3/4)e^t)^5.
Find E(X), Var(X) and P(X = 2).
17. The pdf of the samples of the amplitude of a speech waveform is found to decay
exponentially at rate α, so the following pdf is proposed: f(x) = C e^{-α|x|}, -∞ < x < ∞.
Find C and E(X).
18. Find the MGF of a binomial distribution and hence find the mean and variance. Find the
recurrence relation of central moments for a binomial distribution.
19. The number of monthly breakdowns of a computer is a RV having a Poisson distribution with
mean equal to 1.8. Find the probability that this computer will function for a month
(a) without a breakdown, (b) with only one breakdown, (c) with at least one breakdown.
20. Find the MGF and hence find the mean and variance of the binomial distribution.
21. State and prove the additive property of Poisson random variables.
22. If X and Y are two independent Poisson random variables, then show that the probability
distribution of X given X + Y follows a binomial distribution.
23. Find the MGF and hence find the mean and variance of a geometric distribution.
24. State and prove the memoryless property of the geometric distribution.
UNIT II
5. Two random variables have the joint p.d.f. f(x1, x2) = (2/3)(x1 + 2x2), 0 < x1 < 1,
0 < x2 < 1. Find the marginal densities.
6. Find the value of k, if f(x, y) = k xy for 0 < x, y < 1 is to be a joint density function.
Find P(X + Y < 1). Are X and Y independent?
7. If two random variables have the joint p.d.f. f(x, y) = (6/5)(x + y²), 0 < x < 1,
0 < y < 1, find P(0.2 < X < 0.5) and P(0.4 < Y < 0.6).
9. If f(x, y) = …, x, y ≥ 0, find the p.d.f. of X + Y.
10. Two random variables X and Y have joint density f(x, y) = 2 - x - y, 0 < x < 1, 0 < y < 1.
Find the marginal probability density functions of X and Y. Also find the conditional density
functions and the covariance between X and Y.
11. Let X and Y be two random variables each taking three values -1, 0 and 1 and having the
joint p.d.f.

       X    -1    0     1
Y
-1           0   0.1   0.1
 0          0.2  0.2   0.2
 1           0   0.1   0.1

Prove that X and Y have different expectations. Also prove that X and Y are uncorrelated and
find Var X and Var Y.
12. 20 dice are thrown. Find the approximate probability that the sum obtained is between 65
and 75 using the central limit theorem.
13. Examine whether the variables X and Y are independent whose joint density is
f(x, y) = x e^{-xy-x}, 0 < x, y < ∞.
14. Let X and Y be independent standard normal random variables. Find the pdf of Z = X/Y.
15. Let X and Y be independent uniform random variables over (0, 1). Find the PDF of Z = X + Y.
UNIT-III CLASSIFICATION OF RANDOM PROCESS
PART B
1. The process {X(t)} has the probability distribution
P[X(t) = n] = (at)^{n-1}/(1 + at)^{n+1}, n = 1, 2, …;  = at/(1 + at), n = 0.
Show that it is not stationary.
3. Let X(t) be a Poisson process with arrival rate λ. Find E[{X(t) - X(s)}²] for t > s.
4. Let {Xn ; n = 1, 2, …} be a Markov chain on the space S = {1, 2, 3} with one-step
transition probability matrix P = (0 1 0; 1 0 0; …).
5. Consider a random process X(t) defined by X(t) = U cos t + (V + 1) sin t, where U and V
are independent random variables for which E(U) = E(V) = 0, E(U²) = E(V²) = 1.
(1) Find the auto covariance function of X(t). (2) Is X(t) wide sense stationary? Explain
your answer.
6. Discuss the pure birth process and hence obtain its probabilities, mean and variance.
7. At the receiver of an AM radio, the received signal contains a cosine carrier signal at
the carrier frequency ω0 with a random phase θ that is uniformly distributed over (0, 2π).
The received carrier signal is X(t) = A cos(ω0t + θ). Show that the process is second order
stationary.
8. Assume that a computer system is in any one of the three states: busy, idle and under
repair, respectively denoted by 0, 1, 2. Observing its state at 2 pm each day, we get the
transition probability matrix
P = (0.6 0.2 0.2; 0.1 0.8 0.1; 0.6 0 0.4).
Find out the 3rd step transition probability matrix.
11. Show that the process X(t) = A cos λt + B sin λt (where A and B are random variables) is
wide sense stationary if E(A) = E(B) = 0, E(A²) = E(B²) and E(AB) = 0.
UNIT IV
CORRELATION AND SPECTRAL DENSITIES