Topic 5 - Intro To Random Processes
Random Processes
A RANDOM VARIABLE X is a rule for assigning to every outcome, ζ, of an experiment a number X(ζ).
Note: X denotes a random variable and X(ζ) denotes a particular value.
[Figure: a sample path of a random process plotted versus time]
Random Process
A random process can be either discrete-time or continuous-time.
A random process can be either discrete-valued or continuous-valued.
[Figure: a random process X(ζ, t) generated by a random event; the outcome ζ = 1 selects the sample path 1 + n(t), whose time average is 1, and the outcome ζ = 0 selects the sample path 0 + n(t), whose time average is 0]
Random (stochastic): there are many (possibly infinitely many) possible values at any time instant. Therefore, we can predict only the expected value of the signal at a desired time instant.
Examples: the stock market, speech, medical data, communication signals.
Stochastic Processes (Chapter 4a)
Random Processes
Recall that a random variable, X, is a rule for assigning to every outcome, ζ, of an experiment a number X(ζ).
Note: X denotes a random variable and X(ζ) denotes a particular value.
A random process X(t) is a rule for assigning to every ζ a function X(ζ, t).
Note: for notational simplicity we often omit the dependence on ζ.
Analytical description: X(t) = X(t, ζ), where ζ is an outcome of a random event.
Statistical description: for any integer n and any choice of (t1, t2, ..., tn), the joint PDF of (X(t1), X(t2), ..., X(tn)) is known.
For example, for $x = A\sin y$ with $y$ uniformly distributed, the PDF of $x$ is
$$f_X(x) = \frac{1}{\pi\sqrt{A^2 - x^2}}, \quad -A < x < A; \qquad 0 \ \text{elsewhere}$$   (5-1c)
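As an illustrative check (not part of the original example), a short simulation can confirm this density numerically; the amplitude A and the uniform phase below are assumptions made only for the sketch.

```python
import numpy as np

# Minimal sketch: empirically check the distribution of X = A*sin(Theta)
# with Theta uniform on [0, 2*pi), as in (5-1c).
rng = np.random.default_rng(0)
A = 2.0
x = A * np.sin(rng.uniform(0.0, 2.0 * np.pi, size=1_000_000))

# The CDF implied by f_X(x) = 1/(pi*sqrt(A^2 - x^2)) on (-A, A) is
# F_X(x) = 1/2 + arcsin(x/A)/pi; compare it with the empirical CDF.
grid = np.linspace(-0.99 * A, 0.99 * A, 21)
F_emp = np.array([np.mean(x <= g) for g in grid])
F_theory = 0.5 + np.arcsin(grid / A) / np.pi

print(np.max(np.abs(F_emp - F_theory)))   # should be small (~1e-3)
```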
$$C_{ij} = \min(t_i, t_j)$$   (5-2)
The joint distribution of any set of samples is denoted $F_{X_{t_1},\ldots,X_{t_k}}(x_1,\ldots,x_k)$.   (5-3)
The first- and second-order moments of a random process are
$$\mu_X(t) = E[X_t]$$   (5-4a)
$$R_X(t, s) = E[X_t X_s]$$   (5-4b)
$$\mathrm{Cov}(X_t, X_s) = E[(X_t - \mu_t)(X_s - \mu_s)]$$   (5-4c)
$$\rho_{t,s} = \frac{\mathrm{Cov}(X_t, X_s)}{\sqrt{\sigma_t^2\,\sigma_s^2}}$$   (5-4d)
Consider the time integral
$$Z = \int_{-T}^{T} X(t)\,dt.$$   (5-5a)
Its mean-square value is
$$E[|Z|^2] = E[ZZ^*] = \int_{-T}^{T}\!\int_{-T}^{T} E[X(t_1)X^*(t_2)]\,dt_1\,dt_2 = \int_{-T}^{T}\!\int_{-T}^{T} R_X(t_1, t_2)\,dt_1\,dt_2$$   (5-5b)
Ensemble averages of $X(t) = A\sin(\omega_0 t + \theta)$, with $\theta$ uniform on $[0, 2\pi)$:
$$m_x = \overline{x} = \int x(\theta) f(\theta)\,d\theta = \frac{1}{2\pi}\int_0^{2\pi} A\sin(\omega_0 t + \theta)\,d\theta = 0$$   (5-6a)
$$\overline{x^2} = \frac{1}{2\pi}\int_0^{2\pi} A^2\sin^2(\omega_0 t + \theta)\,d\theta = \frac{A^2}{2}$$   (5-6b)
Time averages of the same process:
$$\langle x(t)\rangle = \lim_{T\to\infty}\frac{1}{T}\int_0^{T} A\sin(\omega_0 t + \theta)\,dt = 0$$   (5-7a)
Variance (time-averaged power):
$$\langle x^2(t)\rangle = \lim_{T\to\infty}\frac{1}{T}\int_0^{T} A^2\sin^2(\omega_0 t + \theta)\,dt = \frac{A^2}{2}$$   (5-7b)
Since the time averages equal the ensemble averages, this process is ergodic in the mean and in the second moment.
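To make this concrete, here is a minimal numerical sketch (the values of A and ω₀ are arbitrary choices for the illustration) comparing the time average along one sample path with the ensemble average over many realizations of X(t) = A sin(ω₀t + Θ), Θ uniform on [0, 2π):

```python
import numpy as np

rng = np.random.default_rng(0)
A, w0 = 1.0, 2.0 * np.pi              # amplitude and angular frequency (assumed values)
t = np.linspace(0.0, 200.0, 200_001)  # long observation window

# Time average over one sample path (one fixed random phase).
theta = rng.uniform(0.0, 2.0 * np.pi)
x_path = A * np.sin(w0 * t + theta)
print(x_path.mean(), (x_path**2).mean())     # ~0 and ~A**2/2, as in (5-7a)-(5-7b)

# Ensemble average at one fixed time instant over many realizations.
thetas = rng.uniform(0.0, 2.0 * np.pi, size=100_000)
x_t0 = A * np.sin(w0 * 1.234 + thetas)
print(x_t0.mean(), (x_t0**2).mean())         # ~0 and ~A**2/2, as in (5-6a)-(5-6b)
```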
A process is (strictly) stationary if, for every shift $\tau$,
$$F_{X_{t_1},\ldots,X_{t_k}}(x_1,\ldots,x_k) = F_{X_{t_1+\tau},\ldots,X_{t_k+\tau}}(x_1,\ldots,x_k)$$   (5-8a)
As an example, consider
$$X(t) = A\cos(\omega_0 t + \theta_0)$$   (5-8b)
[Figure: PDF of X(t) at several time instants, plotted versus time t]
Properties of the autocorrelation function of a stationary process:
$$R(\tau) = R(-\tau) \qquad \text{(symmetry)}$$   (5-9a)
$$R(0) \ge |R(\tau)|$$   (5-9b)
$$R(0) = E[X^2(t)] \ge 0 \qquad \text{(average power)}$$   (5-9c)
Given two random variables X(t1) and X(t2), a measure of the linear relationship between them is E[X(t1)X(t2)]. For a random process, t1 and t2 range over all possible values, so E[X(t1)X(t2)] can change and is a function of t1 and t2. The autocorrelation function of a random process is thus defined by
$$R_X(t_1, t_2) = E[X(t_1)X(t_2)] = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} x_1 x_2\, f_{X(t_1),X(t_2)}(x_1, x_2)\,dx_1\,dx_2$$   (5-10)
The autocorrelation is the correlation (second moment) of a real random process with itself at two time instants separated by τ seconds.
[Figure: a sample path x(t) versus time t, with two samples taken τ seconds apart within an observation window of length T]
Autocorrelation of a WSS RP
[Figure: autocorrelation R(τ) of a WSS random process, with its maximum of 1 at τ = 0 and decaying toward 0 with increasing time lag τ]
Example: let
$$X(t) = a\cos(\omega_0 t + \varphi), \qquad \varphi \sim U(0, 2\pi).$$   (5-11a)
This gives
$$\mu_X(t) = E\{X(t)\} = aE\{\cos(\omega_0 t + \varphi)\} = a\cos\omega_0 t\,E\{\cos\varphi\} - a\sin\omega_0 t\,E\{\sin\varphi\} = 0,$$   (5-11b)
since $E\{\cos\varphi\} = \frac{1}{2\pi}\int_0^{2\pi}\cos\varphi\,d\varphi = 0 = E\{\sin\varphi\}$.
Similarly, the autocorrelation of this process can be computed; a worked version is sketched below.
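A sketch of that computation, using the same uniform phase φ as in (5-11a) and the identity cos A cos B = ½[cos(A − B) + cos(A + B)]:

$$R_X(t_1, t_2) = a^2 E\{\cos(\omega_0 t_1 + \varphi)\cos(\omega_0 t_2 + \varphi)\} = \frac{a^2}{2}\cos\omega_0(t_1 - t_2) + \frac{a^2}{2}E\{\cos(\omega_0(t_1 + t_2) + 2\varphi)\} = \frac{a^2}{2}\cos\omega_0(t_1 - t_2),$$

since the expectation of the second cosine over the uniform phase is zero. The mean is constant and the autocorrelation depends only on τ = t₁ − t₂, so the process is wide-sense stationary.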
A random process is cyclostationary with period T₀ if its joint densities are invariant to a shift of all time instants by T₀:
$$f_{X(t_1), X(t_2),\ldots,X(t_n)}(x_1, x_2,\ldots,x_n) = f_{X(t_1+T_0), X(t_2+T_0),\ldots,X(t_n+T_0)}(x_1, x_2,\ldots,x_n)$$   (5-12a)
So that
$$m_X(t + T_0) = m_X(t)$$   (5-12b)
$$R_X(t + T_0, \tau) = R_X(t, \tau)$$   (5-12c)
Consider the pulse train
$$X(t) = \sum_{n=-\infty}^{\infty} a_n\, h(t - nT)$$   (5-13a)
where the amplitudes $a_n$ are uncorrelated with
$$E[a_n a_m] = A\,\delta_{nm}.$$   (5-13b)
Consider the correlation function
$$E[X(t)X(t+\tau)] = \sum_n\sum_m E[a_n a_m]\,h(t-nT)\,h(t+\tau-mT) = A\sum_n h(t-nT)\,h(t+\tau-nT)$$   (5-13c)
which is easily seen to be periodic in t with period T, so the process is cyclostationary.
Question: what is E[X(t)]?
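For reference, a sketch of the answer, assuming in addition that the amplitudes share a common mean E[aₙ] = mₐ:

$$E[X(t)] = \sum_{n=-\infty}^{\infty} E[a_n]\,h(t - nT) = m_a \sum_{n=-\infty}^{\infty} h(t - nT),$$

which is again periodic in t with period T, consistent with the process being cyclostationary rather than stationary.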
The cross-correlation of two random processes X(t) and Y(t) is
$$R_{XY}(t_1, t_2) = E[X(t_1)Y(t_2)] = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} x\,y\, f_{X(t_1),Y(t_2)}(x, y)\,dx\,dy$$   (5-14b)
Properties of the cross-correlation (for jointly WSS processes):
$$R_{xy}(\tau) = R_{yx}(-\tau)$$   (5-15a)
$$|R_{xy}(\tau)| \le \sqrt{R_x(0)\,R_y(0)}$$   (5-15b)
$$|R_{xy}(\tau)| \le \tfrac{1}{2}\left[R_x(0) + R_y(0)\right]$$   (5-15c)
Uncorrelated:
$$R_{xy}(\tau) = E[x(t)]\,E[y(t+\tau)] = \mu_x\,\mu_y$$   (5-15d)
Orthogonal:
$$R_{xy}(\tau) = 0$$   (5-15e)
Definition (power spectral density):
$$S_x(\omega) = \lim_{T\to\infty} E\!\left[\frac{|X_T(\omega)|^2}{2T}\right]$$   (5-16a)
where $X_T(\omega)$ is the Fourier transform of the process truncated to $[-T, T]$.
For a WSS process, the autocorrelation and the PSD form a Fourier transform pair (Wiener–Khinchine):
$$R_x(\tau) \;\longleftrightarrow\; S_x(\omega)$$   (5-16b)
Average power:
$$P_x = R_x(0) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_x(\omega)\,d\omega = \int_{-\infty}^{\infty} S_x(f)\,df$$   (5-16c)
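As an illustration (the process, sampling rate, and segment length below are choices made only for this sketch), a minimal numerical example that estimates the PSD by averaging periodograms, in the spirit of (5-16a), and checks that integrating the PSD over frequency recovers the average power, as in (5-16c):

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 100.0          # sampling rate in Hz (assumed for this sketch)
N = 1024            # samples per segment
n_seg = 200         # number of independent segments to average

psd_acc = np.zeros(N)
power_acc = 0.0
for _ in range(n_seg):
    w = rng.standard_normal(N + 200)
    x = np.empty_like(w)
    x[0] = w[0]
    for k in range(1, len(w)):
        x[k] = 0.9 * x[k - 1] + w[k]        # AR(1) filter -> a simple WSS process
    x = x[200:]                              # drop the start-up transient
    X = np.fft.fft(x)
    psd_acc += (np.abs(X) ** 2) / (N * fs)   # periodogram, units of power per Hz
    power_acc += np.mean(x ** 2)

S_hat = psd_acc / n_seg                      # averaged periodogram estimate of S_x(f)
power_from_psd = np.sum(S_hat) * (fs / N)    # integrate S_x(f) over frequency
print(power_from_psd, power_acc / n_seg)     # the two agree, as in (5-16c)
```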
Gaussian Process
As noted earlier, a random process {X(t)} is said to be a Gaussian random process if all finite collections of samples of the process,
X1 = X(t1), X2 = X(t2), ..., Xk = X(tk),
are jointly Gaussian random variables for all k and all choices of t1, t2, ..., tk.
A Gaussian process that is weakly (wide-sense) stationary is also strictly stationary, because a jointly Gaussian distribution is uniquely characterized by its first two moments.
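A minimal sketch of drawing sample paths of a Gaussian process directly from this defining property; the squared-exponential covariance used here is only an illustrative assumption, not something taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 5.0, 200)

# Zero mean and a squared-exponential covariance C(t, s) = exp(-(t - s)^2 / (2 ell^2)).
ell = 0.5
C = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2.0 * ell**2))
C += 1e-10 * np.eye(len(t))                  # small jitter for numerical positive definiteness

# Defining property: any finite collection (X(t_1), ..., X(t_k)) is jointly Gaussian,
# so a sample path on this time grid is one draw from N(0, C).
paths = rng.multivariate_normal(np.zeros_like(t), C, size=3)
print(paths.shape)                           # (3, 200): three sample paths
```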
(5-17)
Start with Zn being white noise (purely random) with mean μ and second moment σ².
$$X_0 = 0$$   (5-18a)
$$X_n = X_{n-1} + Z_n = \sum_{k=1}^{n} Z_k$$   (5-18b)
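A minimal simulation sketch of (5-18); the zero-mean, unit-variance Gaussian steps are an assumption made only for the illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n_steps = 1000
n_paths = 5

# Z_n: white-noise steps (zero-mean, unit-variance Gaussian assumed for the sketch).
Z = rng.standard_normal((n_paths, n_steps))

# X_0 = 0 and X_n = X_{n-1} + Z_n, i.e. the running sum of the steps, as in (5-18a)-(5-18b).
X = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(Z, axis=1)], axis=1)

print(X[:, -1])   # final positions of the five sample paths; Var[X_n] grows like n*sigma^2
```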
Random Walk - 2
[Figure: a general sample path of a random walk]
The number of Poisson points in an interval of length t has PMF
$$P[N(t) = k] = \frac{(\lambda t)^k}{k!}\,e^{-\lambda t}, \qquad k = 0, 1, 2, \ldots$$   (5-19a)
[Figure: Poisson arrival times t1, ..., ti on the time axis; the counting process X(t); and the semi-random telegraph signal Y(t) = (-1)^X(t), which switches between +1 and -1 at each arrival]
Let p(k) denote the probability of k Poisson points in the interval (0, t]. Using the Poisson PMF, it is easy to see that the probabilities that the number of points is even or odd are
$$P[Y(t) = +1] = p(0) + p(2) + \cdots = e^{-\lambda t}\left[1 + \frac{(\lambda t)^2}{2!} + \cdots\right] = e^{-\lambda t}\cosh\lambda t$$
$$P[Y(t) = -1] = p(1) + p(3) + \cdots = e^{-\lambda t}\left[\lambda t + \frac{(\lambda t)^3}{3!} + \cdots\right] = e^{-\lambda t}\sinh\lambda t$$
so that the mean of the telegraph signal is
$$E[Y(t)] = e^{-\lambda t}\cosh\lambda t - e^{-\lambda t}\sinh\lambda t = e^{-2\lambda t}$$   (5-20)
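A minimal simulation sketch of the semi-random telegraph signal (the rate λ and time t below are arbitrary choices for the sketch) that checks E[Y(t)] against e^(-2λt):

```python
import numpy as np

rng = np.random.default_rng(4)
lam = 1.5           # Poisson rate (arbitrary value for the sketch)
t = 0.8             # time instant at which E[Y(t)] is evaluated
n_trials = 500_000

# N(t): number of Poisson points in (0, t]; Y(t) = (-1)**N(t) starts at +1
# and changes sign at every arrival.
N = rng.poisson(lam * t, size=n_trials)
Y = np.where(N % 2 == 0, 1.0, -1.0)

print(Y.mean())                  # simulated E[Y(t)]
print(np.exp(-2.0 * lam * t))    # theoretical value e^(-2*lambda*t) from (5-20)
```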
(5-21a)
(5-21b)
(5-21c)
(5-22)
The sample autocovariance at lag k is
$$\hat\gamma(k) = \frac{1}{n}\sum_{t=1}^{n-k}(x_t - \bar x)(x_{t+k} - \bar x)$$   (5-23)
and the sample autocorrelation is
$$\hat\rho(k) = \frac{\sum_{t=1}^{n-k}(x_t - \bar x)(x_{t+k} - \bar x)}{\sum_{t=1}^{n}(x_t - \bar x)^2} = \frac{\hat\gamma(k)}{\hat\gamma(0)}.$$   (5-24)
Equivalently,
$$\hat\rho(k) = \frac{\sum_{t=1}^{n-k}(x_t - \bar x)(x_{t+k} - \bar x)}{\sum_{t=1}^{n}(x_t - \bar x)^2} = \frac{\hat\gamma(k)}{\hat\gamma(0)}.$$   (5-25)
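A minimal sketch of computing the sample autocorrelation of (5-23)-(5-24) for a data record; the AR(1) test signal is only an illustrative choice:

```python
import numpy as np

def sample_autocorr(x, max_lag):
    """Sample autocorrelation rho_hat(k) = gamma_hat(k) / gamma_hat(0); see (5-23)-(5-24)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    gamma = np.array([np.sum(xc[: n - k] * xc[k:]) / n for k in range(max_lag + 1)])
    return gamma / gamma[0]

# Illustrative data: an AR(1) sequence whose true autocorrelation at lag k is 0.7**k.
rng = np.random.default_rng(5)
x = np.zeros(20_000)
for t in range(1, len(x)):
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()

print(sample_autocorr(x, 5))   # roughly [1.0, 0.7, 0.49, 0.34, 0.24, 0.17]
```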
Stochastic Continuity
(5-26)
Stochastic Continuity
(5-27)
Stochastic Continuity
Almost-Surely Continuous: X(t) is almost-surely continuous at t0 if
$$P\big[\{\zeta : \lim_{t\to t_0} X(t, \zeta) = X(t_0, \zeta)\}\big] = 1$$   (5-28)
Stochastic Convergence
A random sequence, or a discrete-time random process, is a sequence of random variables {X1(ζ), X2(ζ), ..., Xn(ζ), ...} = {Xn(ζ)}, ζ ∈ S.
For a specific ζ, {Xn(ζ)} is a sequence of numbers that might or might not converge. The notion of convergence of a random sequence can therefore be given several interpretations.
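As an illustration (not tied to a specific equation in the slides), a sketch of one common case, the sample mean of i.i.d. variables, where the mean-square error E[|Xn − μ|²] shrinks like 1/n:

```python
import numpy as np

rng = np.random.default_rng(6)
mu = 0.5            # mean of the underlying U(0,1) variables
n_max = 10_000
n_realizations = 500

# Each row is one realization of the random sequence X_n = (1/n) * (U_1 + ... + U_n).
U = rng.uniform(0.0, 1.0, size=(n_realizations, n_max))
X = np.cumsum(U, axis=1) / np.arange(1, n_max + 1)

# Mean-square error E[|X_n - mu|^2] across realizations shrinks roughly like 1/n,
# illustrating mean-square convergence (and hence convergence in probability).
for n in (10, 100, 1000, 10000):
    mse = np.mean((X[:, n - 1] - mu) ** 2)
    print(n, mse, 1.0 / (12.0 * n))   # Var of U(0,1) is 1/12, so theory is 1/(12 n)
```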
Stochastic Convergence
(5-29)
Mean-square Convergence
A random sequence Xn converges to the random variable X in the mean-square sense if
$$\lim_{n\to\infty} E\big[|X_n - X|^2\big] = 0$$   (5-30)
Convergence in Probability
A random sequence Xn converges to X in probability if, for every ε > 0,
$$\lim_{n\to\infty} P\big[|X_n - X| > \varepsilon\big] = 0$$   (5-31)
Convergence in Distribution
A random sequence Xn converges to X in distribution if F_{Xn}(x) → F_X(x) at every point x where F_X(x) is continuous.
Remarks
(5-32)
(5-33)
(5-34)