
Topic 5: Stochastic/Random Processes Introduction

(Read Chapter 9 and some sections of Chapter 10 in Papoulis) - 2 weeks

A general random, or stochastic, process can be described as:


Collection of time functions (signals) corresponding to various outcomes
of random experiments (events).
Collection of random variables observed at different times.

Rather than consider fixed random variables X, Y, etc., or even
sequences of i.i.d. random variables, we consider sequences X0, X1,
X2, …, where Xt represents some random quantity at time t.
In general, the value Xt might depend on the quantity Xt-1 at time t-1, or
even on the value Xs at other times s < t.
Examples of random processes in communications:
Channel noise,
Information generated by a source,
Interference.

Examples in other fields: daily stream flow, hourly rainfall during storm
events, stock indexes.
Note: we will use the terms stochastic process and random process interchangeably.

Random Processes
A RANDOM VARIABLE X is a rule for assigning to every
outcome, ζ, of an experiment a number X(ζ).
Note: X denotes a random variable and X(ζ) denotes a particular value.

A RANDOM PROCESS X(t) is a rule for assigning to every ζ a
function X(t, ζ).
Note: for notational simplicity we often omit the dependence on ζ.
An example of a stochastic process X(t) is shown below.
[Figure: several sample paths of X(t) versus time; for each fixed t, X(t) is a random variable, and each individual trace is a sample path.]

Random Process
A random process can be either discrete-time or continuous-time.
A random process can be either discrete-valued or continuous-valued.
[Figure: a random event ζ selects one sample function of the random process X(ζ, t). For ζ = 1 the sample function is 1 + n(t), with time average 1; for ζ = 0 it is 0 + n(t), with time average 0. The ensemble average across outcomes is 0.5.]


3

Classification of Stochastic Processes


Four classes of stochastic processes:

Discrete-state process → chain
Discrete-time process → stochastic sequence {Xn | n ∈ T} (e.g.,
probing a system every 10 ms)
Continuous time and discrete state → quantized signal
Continuous time and continuous state → white noise

Types of Random Processes

Random Process for a Continuous Sample Space

Stochastic/Random Processes Are Important


Signals (processes):
Deterministic signal: one possible value for any time instance;
therefore, we can predict the exact value of the signal for a desired
time instance.
Random (stochastic) signal: many (possibly infinitely many) values for
any time instance; therefore, we can predict only the expected value of
the signal for a desired time instance.
Examples: stock market, speech, medical data, communication signals.
Most real-life signals (observations) are contaminated by random noise.
Noise contamination may lead to unpredictable changes in the parameters
(e.g., amplitude and phase) of the signal.
Many information-bearing signals are random (the data bits are random).

Stochastic Processes

Chapter 4a

(a bit more formally)

A deterministic model predicts a single outcome from a given
set of circumstances.
A stochastic model predicts a set of possible outcomes
weighted by their likelihoods or probabilities.
A stochastic process is a special stochastic model that deals
with a class of random variables.
A stochastic process {X(t), t ∈ T} is a collection of random
variables X(t) indexed by t. That is, for each t ∈ T, X(t) is a
random variable, with t often interpreted as time; the values
of X(t) are referred to as states of the process (especially if the
values of X(t) are discrete).
e.g.,

X(t) = the received signal voltage


X(t) = number of customers in a supermarket at time t;
X(t) = the quantity of a commodity in inventory at time t.

Stochastic Processes-2

Chapter 4a

The set T is called the index set of the process. When T is


countable, the stochastic process is said to be a discrete-time
process; if T is an interval of the real line, it is said to be a
continuous-time process.
The set of possible values that the random variable X(t) can assume
is called the state space of the stochastic process.
If the state space is countable, X(t) is a discrete-state process.
Otherwise it is a continuous-state process.

Thus, a stochastic process is a family of random variables that
describes the evolution through time of some (physical) process.
Stochastic processes are classified by the type of state space, by the
index set T, and by the dependence relations among the random variables
X(t) and their distributions, giving names such as Gaussian,
Markov, and Poisson processes.
9

Random Processes
Recall that a random variable, X, is a rule for assigning to every
outcome, ζ, of an experiment a number X(ζ).
Note: X denotes a random variable and X(ζ) denotes a particular value.
A random process X(t) is a rule for assigning to every ζ a function X(t, ζ).
Note: for notational simplicity we often omit the dependence on ζ.
Analytical description: X(t) = X(t, ζ), where ζ is an outcome of a random event.
Statistical description: for any integer n and any choice of (t1, t2, …, tn),
the joint PDF of (X(t1), X(t2), …, X(tn)) is known.

10

Example: Analytical Description ---Random Phase Sine Wave


Let X(t) = A cos(2πf0t + Θ), where Θ is a random variable uniformly
distributed on [0, 2π). A is a fixed amplitude.
A complete statistical description of X(t0) is obtained as follows.
Let Y = 2πf0t0 + Θ (which is a uniform RV).
Then we need to transform from y to x:

pX(x) dx = pY(y1) dy + pY(y2) dy                              (5-1a)

We need both y1 and y2 because, for a given x, the equation x = A cos y
has two solutions in [0, 2π).
Since

|dx/dy| = |A sin y| = √(A² − x²)                              (5-1b)

and pY is uniform on [2πf0t0, 2πf0t0 + 2π), we get

pX(x) = 1 / (π √(A² − x²))   for −A < x < A                   (5-1c)
pX(x) = 0                    elsewhere

11
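A quick numerical check of (5-1c) is sketched below (not part of the original slides; the amplitude A = 2 and the sample size are illustrative assumptions). It draws X = A cos(Θ) with Θ uniform on [0, 2π) and compares a histogram of the samples with 1/(π√(A² − x²)).

```python
import numpy as np

# Illustrative check of Eq. (5-1c): X = A*cos(Theta), Theta ~ Uniform[0, 2*pi).
rng = np.random.default_rng(0)
A = 2.0                                     # assumed amplitude
theta = rng.uniform(0.0, 2.0 * np.pi, size=200_000)
x = A * np.cos(theta)                       # samples of X(t0) for a fixed t0

# Empirical density on a grid strictly inside (-A, A)
edges = np.linspace(-0.99 * A, 0.99 * A, 61)
hist, edges = np.histogram(x, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Theoretical PDF from (5-1c)
p_theory = 1.0 / (np.pi * np.sqrt(A**2 - centers**2))

print("max abs error:", np.max(np.abs(hist - p_theory)))   # small (a few percent)
```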

Example: Statistical Description---Gaussian Random Process


Suppose a random process X(t) has the property that, for
any n and (t0, t1, …, tn), the joint density function of {X(ti)}
is a jointly Gaussian vector with zero mean
and covariance

σij² = min(ti, tj)                                            (5-2)

This gives a complete statistical description of the random
process X(t).
The Gaussian random process is unique in that the first-
and second-order statistics completely describe the
process.

12
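A minimal sampling sketch for this example (an illustration, not from the slides): the covariance min(ti, tj) in (5-2) is the covariance of a standard Wiener process, so sample functions can be drawn by forming the covariance matrix on a time grid and using a multivariate normal generator.

```python
import numpy as np

# Sample a zero-mean Gaussian process with Cov[X(ti), X(tj)] = min(ti, tj)  (Eq. 5-2).
rng = np.random.default_rng(1)
t = np.linspace(0.01, 1.0, 100)             # time grid (t = 0 excluded: zero variance)
K = np.minimum.outer(t, t)                  # covariance matrix K[i, j] = min(ti, tj)

paths = rng.multivariate_normal(mean=np.zeros(len(t)), cov=K, size=5)
print(paths.shape)                          # (5, 100): five sample functions

# Empirical check that Var[X(t)] grows like t, as the covariance implies
many = rng.multivariate_normal(np.zeros(len(t)), K, size=20_000)
print(np.allclose(many.var(axis=0), t, atol=0.05))   # True
```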

Random Processes Basic Concepts


[Figure: a sample path x(t) versus time t, with the univariate PDF fX(x) of its amplitude sketched alongside.]

The univariate probability density function describes the general distribution of
the magnitude of the random process, but it gives no information about the time or
frequency content of the process.
We will use the multivariate PDF/PMF, as well as other quantities such as the
correlation function and the power spectral density, to describe the random process.

13

Probability Distribution of a Random Process


For any stochastic process with index set I, its probability
distribution is uniquely determined by its finite-dimensional
distributions.
The k-dimensional distribution function of a process is defined by

F_{Xt1, …, Xtk}(x1, …, xk) = P(Xt1 ≤ x1, …, Xtk ≤ xk)         (5-3)

for any t1, …, tk ∈ I and any real numbers x1, …, xk.


The distribution function tells us everything we need to know about
the process {Xt }.
However, in many cases the joint distribution function is not
available and so other metrics are used to describe a random
process.
14

Moments of Stochastic Process


We can describe a stochastic process via its moments, i.e.,
E(Xt), E(Xt²), E(Xt Xs), etc. We often use the first two moments.

The mean function of the process is E(Xt) = μt.               (5-4a)
The variance function of the process is Var(Xt) = σt².        (5-4b)

The covariance function between Xt and Xs is

Cov(Xt, Xs) = E((Xt − μt)(Xs − μs))                           (5-4c)

The correlation function between Xt and Xs is

ρ(Xt, Xs) = Cov(Xt, Xs) / √(σt² σs²)                          (5-4d)

These moments are often functions of time, but not always
(stationarity).

15

Example: Consider the complex RV

Z = ∫_{−T}^{T} X(t) dt.                                       (5-5a)

Then the second moment of Z is given by

E[|Z|²] = E[Z Z*] = ∫_{−T}^{T} ∫_{−T}^{T} E{X(t1) X*(t2)} dt1 dt2
        = ∫_{−T}^{T} ∫_{−T}^{T} RXX(t1, t2) dt1 dt2           (5-5b)
16

Stationary Random Processes


Means of collecting statistics:
Sample records, which are individual realizations of the underlying
process.
Ensemble averaging: properties of the process are obtained by
averaging over a collection or ensemble of sample records, using
values at corresponding times.
Time averaging: properties are obtained by averaging over a single
record in time.
Stationary processes: ensemble averages do not vary with time.
Ergodic process: a stationary process in which averages from a single
record are the same as those obtained by averaging over the
ensemble.
Strict stationarity: the joint PDF is time-invariant.
(Second-order) wide-sense stationary (WSS): the first and second
moments are time-invariant.

17

Ergodic Random Process


(ergodic ⇒ stationary)
Definition: A random process is ergodic if all time averages of any sample
function are equal to the corresponding ensemble averages.
For example, for ergodic processes, we can use ensemble statistics to compute:
DC values
RMS values
Ergodic processes are always stationary; stationary processes are not necessarily
ergodic.
Example: X(t) = A sin(ω0t + θ), where ω0 = 2πf0.
A and ω0 are constants; θ is a uniformly distributed RV on [−π, π); t is time.

Mean (ensemble statistics):

mx = ⟨x⟩ = ∫ x(θ) f(θ) dθ = ∫_{−π}^{π} (1/2π) A sin(ω0t + θ) dθ = 0        (5-6a)

⟨x²⟩ = ∫_{−π}^{π} (1/2π) A² sin²(ω0t + θ) dθ = A²/2                        (5-6b)
18

Example: Ergodic Process


Mean (time average), T large:

⟨x(t)⟩ = lim_{T→∞} (1/T) ∫_0^T A sin(ω0t + θ) dt = 0                       (5-7a)

Variance:

⟨x²(t)⟩ = lim_{T→∞} (1/T) ∫_0^T A² sin²(ω0t + θ) dt = A²/2                 (5-7b)

The ensemble and time averages are the
same, at least for the moments
considered, so the process is ergodic.
19
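The sketch below (an illustration with assumed A, f0, and record length) compares the time averages of one realization with ensemble averages over many random phases; both come out near 0 and A²/2, consistent with (5-6) and (5-7).

```python
import numpy as np

# Ergodicity check for X(t) = A*sin(w0*t + theta), theta ~ Uniform[-pi, pi).
rng = np.random.default_rng(2)
A, w0 = 1.5, 2.0 * np.pi * 5.0              # assumed amplitude and frequency
t = np.linspace(0.0, 20.0, 200_000)         # long record for the time averages

theta_one = rng.uniform(-np.pi, np.pi)      # one realization -> time averages
x_one = A * np.sin(w0 * t + theta_one)
print("time avg of x      :", x_one.mean())        # ~ 0
print("time avg of x^2    :", (x_one**2).mean())   # ~ A^2/2 = 1.125

thetas = rng.uniform(-np.pi, np.pi, size=100_000)  # ensemble at a fixed time t0
x_ens = A * np.sin(w0 * 1.234 + thetas)
print("ensemble avg       :", x_ens.mean())        # ~ 0
print("ensemble avg of x^2:", (x_ens**2).mean())   # ~ A^2/2
```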

Strictly Stationary Processes


A process is said to be strictly stationary if (Xt1, …, Xtk) has the same
joint distribution as (Xt1+τ, …, Xtk+τ). That is, if

F_{Xt1, …, Xtk}(x1, …, xk) = F_{Xt1+τ, …, Xtk+τ}(x1, …, xk)                (5-8a)

If {Xt} is a strictly stationary process and E(Xt²) < ∞, then the mean
function is a constant and the variance function is also a constant.
Moreover, for a strictly stationary process with the first two moments
finite, the covariance function and the correlation function depend
only on the time difference t − s.
This means that the process behaves similarly (follows the same
PDF) regardless of when you measure it.
Is the random process from the coin-tossing experiment stationary?

20

Wide Sense Stationary (WSS) Random Process


Strict stationarity is often too strong a condition in practice. It is often
difficult to assess based on an observed time series x1, …, xk.
We often use a weaker sense of stationarity in terms of the moments of
the process.
A process is said to be nth-order wide-sense stationary if all its joint
moments up to order n exist and are time-invariant, i.e., independent of
the time origin.
For example, a second-order wide-sense stationary process will have
constant mean and variance, with the covariance and the correlation
being functions of the time difference alone.
A strictly stationary process with the first two moments finite is also
second-order weakly stationary.

21

Example of First-Order Stationarity

X(t) = A cos(ω0t + θ0)                                                     (5-8b)

Recall the PDF of x(t) from (5-1c):

fX(x) = 1 / (π √(A² − x²))   for −A < x < A
fX(x) = 0                    elsewhere

Note: there is NO dependence on time, so the process is first-order stationary.
This result applies to problems in which there is a random start-up
phase of an unsynchronized oscillator.
22

Example: Non-stationary Random Process


[Figure: a non-stationary random process X(t) whose PDF changes with time t.]

23

Wide-Sense Stationary (WSS)


Usually, t1 = t and t2 = t + τ, so that t2 − t1 = τ.
A random process is wide-sense stationary if:
The mean E[x(t)] = constant
The autocorrelation only depends on the time difference,
RXX(t1, t2) = RXX(t2 − t1) = RXX(τ)
Some properties of the autocorrelation function:

RX(0) = E[x²(t)] = second moment = "signal power"                          (5-9a)
RX(τ) = RX(−τ)       symmetry                                              (5-9b)
RX(0) ≥ |RX(τ)|      peak at zero offset                                   (5-9c)
24

Autocorrelation of a WSS Random Process

Given two random variables X(t1) and X(t2), a measure of linear relationship between
them is specified by E[X(t1)X(t2)]. For a random process, t1 and t2 go through all
possible values, and therefore, E[X(t1)X(t2)] can change and is a function of t1 and t2.
The autocorrelation function of a random process is thus defined by

Rx(t1, t2) = E[x(t1) x(t2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 f(x1, x2) dx1 dx2      (5-10)

(For a WSS process this reduces to Rx(t2 − t1).)

Correlation or second moment of a real random process with itself at two time
instants, shown below separated by τ seconds.
[Figure: a sample path x(t) versus time t, with two sampling instants separated by τ within a record of length T.]
25

Autocorrelation of WSS RP
[Figure: a typical autocorrelation function R(τ) versus time lag τ; R(0) = 1 and R(τ) decays toward 0 as |τ| grows.]
Typically, the autocorrelation of a random process eventually decays to
zero at large τ.

26

Example: Random Sine Wave (fixed amplitude and random phase)

X(t) = a cos(ω0t + φ),   φ ~ U(0, 2π).                                     (5-11a)

This gives

μX(t) = E{X(t)} = a E{cos(ω0t + φ)}
      = a cos(ω0t) E{cos φ} − a sin(ω0t) E{sin φ} = 0,                     (5-11b)

since E{cos φ} = (1/2π) ∫_0^{2π} cos φ dφ = 0 = E{sin φ}.

Similarly,

RXX(t1, t2) = a² E{cos(ω0t1 + φ) cos(ω0t2 + φ)}
            = (a²/2) E{cos(ω0(t1 − t2)) + cos(ω0(t1 + t2) + 2φ)}
            = (a²/2) cos(ω0(t1 − t2)).                                     (5-11c)
27
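A numerical check of (5-11c) (illustrative, with assumed a, ω0, t1, and τ): average x(t1)x(t1 + τ) over many independent phases and compare with (a²/2) cos(ω0τ).

```python
import numpy as np

# Check Eq. (5-11c): R_XX(tau) = (a^2/2) * cos(w0 * tau) for the random-phase sinusoid.
rng = np.random.default_rng(3)
a, w0 = 1.0, 2.0 * np.pi                     # assumed amplitude and frequency
t1, tau = 0.7, 0.3                           # two arbitrary time instants
phi = rng.uniform(0.0, 2.0 * np.pi, size=500_000)   # one random phase per realization

x1 = a * np.cos(w0 * t1 + phi)
x2 = a * np.cos(w0 * (t1 + tau) + phi)

R_est = np.mean(x1 * x2)                     # ensemble average over realizations
R_theory = 0.5 * a**2 * np.cos(w0 * tau)
print(R_est, R_theory)                       # agree to a few decimal places
```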

Cyclostationary Random Process


(Chapter 10 Papoulis)

A random process X(t) is cyclostationary if both the mean, mX(t),
and the autocorrelation function, RX(t, τ), are periodic in t with
some period T0; i.e., if

f_{X(t1), X(t2), …, X(tn)}(x1, x2, …, xn) = f_{X(t1+T0), X(t2+T0), …, X(tn+T0)}(x1, x2, …, xn)      (5-12a)

so that

mX(t + T0) = mX(t)                                                         (5-12b)
RX(t + T0, τ) = RX(t, τ)                                                   (5-12c)

for all t and τ.


28

Example: Cyclostationary Random Process


An important example of a cyclostationary random process is the random pulse
amplitude modulated (PAM) signal shown below, where the an are independent
random (data) samples that are transmitted every T seconds and h(t) is the pulse
that is amplitude modulated and shapes the bandwidth of the signal:

X(t) = Σ_{n=−∞}^{∞} an h(t − nT)                                           (5-13a)

Consider the correlation function

E[X(t)X(t + τ)] = E[(Σ_n an h(t − nT)) (Σ_m am h(t + τ − mT))]
                = Σ_n Σ_m E[an am] h(t − nT) h(t + τ − mT)                  (5-13b)

Since the an are independent, zero-mean random variables, E[an am] = 0 unless n = m.
Letting E[an²] = A, we can simplify the above expression to

E[X(t)X(t + τ)] = A Σ_n h(t − nT) h(t + τ − nT)                            (5-13c)

which is easily seen to be periodic in t with period T.
Question: what is E[X(t)]?
29
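The following sketch evaluates (5-13c) directly for an assumed rectangular pulse h(t) of width T and unit-variance symbols (these choices are illustrative, not from the slides); it shows that R(t, τ) depends on t but repeats with period T, and, for zero-mean symbols, E[X(t)] = 0, which answers the question above.

```python
import numpy as np

# Time-varying autocorrelation of a PAM signal, Eq. (5-13c), for an assumed
# rectangular pulse h(t) of width T and i.i.d. symbols with E[a_n^2] = A = 1.
T = 1.0

def h(t):
    return np.where((t >= 0.0) & (t < T), 1.0, 0.0)

def R(t, tau, A=1.0, n_terms=50):
    n = np.arange(-n_terms, n_terms + 1)
    return A * np.sum(h(t - n * T) * h(t + tau - n * T))   # Eq. (5-13c)

tau = 0.25
for t in (0.1, 0.8, 0.1 + T, 0.8 + T):
    print(f"R({t:4.2f}, {tau}) = {R(t, tau):.3f}")   # R(t, tau) == R(t + T, tau)
# Note: if E[a_n] = 0, then E[X(t)] = sum_n E[a_n] h(t - nT) = 0 for all t.
```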

Jointly Wide Sense Stationary Processes

Continuous-time random processes X(t) and Y(t) are jointly wide-
sense stationary if X(t) and Y(t) are both wide-sense stationary
and the cross-correlation depends only on the time difference
between the two random variables: RXY(t, t + τ) = RXY(τ).
Random sequences Xn and Yn are jointly wide-sense stationary
if Xn and Yn are both wide-sense stationary and the cross-
correlation depends only on the index difference between the
two random variables: RXY[m, m + k] = RXY[k].

30

Cross Correlation of Two Random Processes


Definition of the cross-correlation between X(t) and Y(t):

Rxy(t1, t2) = E[X(t1) Y(t2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 y2 fxy(x1, y2) dx1 dy2      (5-14a)

Definition of jointly stationary:

Rxy(t1, t2) = Rxy(t2 − t1) = Rxy(τ)                                        (5-14b)

31

Jointly Stationary Properties


Properties:

Rxy(τ) = Ryx(−τ)                                                           (5-15a)
|Rxy(τ)| ≤ √(Rx(0) Ry(0))                                                  (5-15b)
|Rxy(τ)| ≤ (1/2)[Rx(0) + Ry(0)]                                            (5-15c)

Uncorrelated:

Rxy(τ) = E[x(t)] E[y(t + τ)] = μx μy                                       (5-15d)

Orthogonal:

Rxy(τ) = 0                                                                 (5-15e)

Independent: X(t1) and Y(t2) are independent (their joint
distribution is the product of the individual distributions).
32

Power Spectral Density


(much more later)

Definition:

Sx(ω) = lim_{T→∞} E[ |XT(ω)|² / T ]                                        (5-16a)

Relationship to the autocorrelation function: Rx(τ) and Sx(ω) form a
Fourier-transform pair,

Rx(τ) ↔ Sx(ω)                                                              (5-16b)

Power of a random process:

Px = Rx(0) = (1/2π) ∫_{−∞}^{∞} Sx(ω) dω = ∫_{−∞}^{∞} Sx(f) df              (5-16c)
33
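A discrete-time illustration of these relations (assumed segment length and white-noise input, not from the slides): averaging |X_T(f)|²/T over many segments gives a flat spectrum for white noise, and the average of the PSD over frequency equals the power R_x(0), the discrete analogue of (5-16c).

```python
import numpy as np

# Averaged periodogram of white noise: S_x is flat, and its average over
# frequency equals the power R_x(0) (discrete analogue of Eq. 5-16c).
rng = np.random.default_rng(6)
n_seg, seg_len = 400, 1024                   # assumed number/length of segments
x = rng.standard_normal((n_seg, seg_len))    # white noise, unit variance

X = np.fft.fft(x, axis=1)
psd = np.mean(np.abs(X)**2, axis=0) / seg_len   # |X_T(f)|^2 / T, averaged over segments

print("PSD mean, std over frequency:", psd.mean(), psd.std())    # ~ 1 (flat)
print("power from PSD:", psd.mean(), " vs  R_x(0) =", x.var())   # both ~ 1
```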

Partial Autocorrelation Function


Often we want to investigate the dependency/association between
Xt and Xt+k adjusting for their dependency on Xt+1, Xt+2, …, Xt+k-1.
The conditional correlation Corr(Xt, Xt+k | Xt+1, Xt+2, …, Xt+k-1) is
usually referred to as the partial correlation in time series analysis.
Partial autocorrelation is usually useful for identifying
autoregressive models (more later in the course).

34

Gaussian Process
As noted earlier
A random process {X(t)} is said to be a Gaussian random
process if all finite collections of the random process,
X1 = X(t1), X2 = X(t2), …, Xk = X(tk),
are jointly Gaussian random variables for all k and all
choices of t1, t2, …, tk.
For a Gaussian process, wide-sense stationarity implies strict
stationarity, because the normal distribution is uniquely
characterized by its first two moments.

35

White Noise Processes


A process {Xt} is called a white noise process if it is a sequence of
uncorrelated random variables from a fixed distribution with
constant mean (usually assumed to be 0) and constant variance σ².
A white noise process is stationary, with an autocorrelation function
given by a delta function (nonzero only at lag 0).
A white noise process is Gaussian if its joint distribution is normal.

36

White Noise Process and LTI Systems


W(t) is said to be wide-sense stationary (w.s.s.) white noise if E[W(t)]
= constant and

RWW(t1, t2) = q δ(t1 − t2) = q δ(τ).                                       (5-17)

The power spectral density of white noise is flat as a function of
frequency: SW(ω) = q [W/Hz] for all ω.
If W(t) is also a Gaussian process, then all of its samples are
independent RVs.
[Figure: white noise W(t) passed through an LTI system with impulse
response h(t) produces colored noise N(t) = h(t) ∗ W(t).]
What is the power in a white noise process?
37
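A small discrete-time illustration (assumed FIR impulse response, not from the slides): white Gaussian noise has an impulse-like sample autocorrelation, while the LTI filter output ("colored" noise) is correlated over several lags.

```python
import numpy as np

# White noise through an LTI filter: the output N = h * W is "colored".
rng = np.random.default_rng(4)
N = 200_000
w = rng.standard_normal(N)                  # white noise, unit variance
h = np.array([1.0, 0.8, 0.5, 0.2])          # assumed FIR impulse response h[n]
n_col = np.convolve(w, h, mode="same")      # colored noise N(t) = h(t) * W(t)

def acorr(x, max_lag=5):
    """Sample autocovariance at lags 0..max_lag."""
    x = x - x.mean()
    return np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(max_lag + 1)])

print("white  :", np.round(acorr(w), 3))      # ~ [1, 0, 0, 0, 0, 0]
print("colored:", np.round(acorr(n_col), 3))  # nonzero at several lags
```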

Random Walk (Chapter 10)

Intuitive definition: a series of movements whose direction and size are
randomly decided (e.g., the path taken by a drunk person).

Start with Zn being white noise (purely random) with mean μ and variance σ².

Xn is a (discrete-time) random walk if

X0 = 0                                                                     (5-18a)
Xn = Xn−1 + Zn = Σ_{k=1}^{n} Zk                                            (5-18b)

So the process Xn is the sum of all of the increments.

It can be shown that the random walk is not stationary, since

E[Xn] = nμ   and   Var[Xn] = nσ²

But the first-order differences are stationary, since

Xn − Xn−1 = Zn

Random walks are often applied to model diffusion processes, thermal
noise, and stock variations.

38
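A short simulation (assumed μ, σ, and path count) illustrating the non-stationarity: the mean and variance of Xn grow linearly with n, while the first differences have constant mean and variance.

```python
import numpy as np

# Random walk X_n = Z_1 + ... + Z_n (Eq. 5-18b): mean and variance grow with n.
rng = np.random.default_rng(5)
mu, sigma, n_steps, n_paths = 0.1, 1.0, 500, 20_000   # assumed increment statistics
Z = rng.normal(mu, sigma, size=(n_paths, n_steps))
X = np.cumsum(Z, axis=1)                     # X[:, n-1] holds X_n for each path

for n in (10, 100, 500):
    print(n, X[:, n - 1].mean(), X[:, n - 1].var())   # ~ n*mu and ~ n*sigma^2

# First differences X_n - X_{n-1} = Z_n are stationary (constant mean and variance).
d = np.diff(X, axis=1)
print(d.mean(), d.var())                     # ~ mu, ~ sigma^2
```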

Random Walk -2
Below is a general figure of a random walk.

39

Autocorrelation Function of Poisson Processes


Recall that the Poisson random variable X(t) is defined by

P[X(t) = n(0, t) = k, i.e., k arrivals occur in the interval (0, t) seconds]
    = e^{−λt} (λt)^k / k!,   k = 0, 1, 2, …                                (3-19a)

To determine the autocorrelation function RXX(t1, t2), we let t2 > t1.
Since X(t1) and X(t2) − X(t1) don't overlap in time, they are independent RVs.
Recall that the mean of X(t) is λt and the second moment is E[X²(t)] = λt + (λt)², so

E{X(t1) [X(t2) − X(t1)]} = λt1 · λ(t2 − t1)                                (5-19a)

We can use the identity

X(t1) X(t2) = X(t1) [X(t1) + X(t2) − X(t1)] = [X(t1)]² + X(t1) [X(t2) − X(t1)]    (5-19b)

Now we use the independence of X(t1) and X(t2) − X(t1) to calculate

RXX(t1, t2) = E[X²(t1)] + E{X(t1) [X(t2) − X(t1)]} = λt1 + (λt1)² + λt1 λ(t2 − t1)
            = λt1 + λ²t1t2      for t2 > t1
            = λt2 + λ²t1t2      for t1 > t2   [by symmetry]
            = λ²t1t2 + λ min(t1, t2)                                        (5-19c)

Notice that the Poisson process X(t) is not a wide-sense stationary process.

40
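A simulation check of (5-19c) (illustrative λ, t1, t2): generate X(t1) and the independent increment X(t2) − X(t1) as Poisson variables and compare the sample mean of X(t1)X(t2) with λ²t1t2 + λ min(t1, t2).

```python
import numpy as np

# Check R_XX(t1, t2) = lam^2*t1*t2 + lam*min(t1, t2)  (Eq. 5-19c) by simulation.
rng = np.random.default_rng(7)
lam, t1, t2, n = 3.0, 1.0, 2.5, 500_000      # assumed rate and time instants (t2 > t1)

X_t1 = rng.poisson(lam * t1, size=n)                  # arrivals in (0, t1)
X_t2 = X_t1 + rng.poisson(lam * (t2 - t1), size=n)    # add the independent increment

R_est = np.mean(X_t1 * X_t2)
R_theory = lam**2 * t1 * t2 + lam * min(t1, t2)
print(R_est, R_theory)                       # both ~ 25.5 for these parameters
```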

Example: The Random Telegraph Wave

Define a binary-level process, the random telegraph signal,
Y(t) = (−1)^{X(t)},
where the transition points are driven by the Poisson (arrival) process X(t).
Note that Y(t) = 1 if the number of points in the interval (0, t) is even.

[Figure: Poisson arrival times ti on the time axis, the counting process X(t),
and the telegraph wave Y(t) switching between +1 and −1 at each arrival.]

Let p(k) denote the probability of k points in the interval. Using the Poisson PMF,
it is easy to see that the probability that the number of points is even or odd is given by

P[Y(t) = 1]  = p(0) + p(2) + … = e^{−λt} (1 + (λt)²/2! + …) = e^{−λt} cosh(λt)
P[Y(t) = −1] = p(1) + p(3) + … = e^{−λt} (λt + (λt)³/3! + …) = e^{−λt} sinh(λt)

⇒ E[Y(t)] = e^{−λt} (cosh(λt) − sinh(λt)) = e^{−λt} e^{−λt} = e^{−2λt}       (5-20)
41

Example: The Random Telegraph Wave-2


To determine RYY(t1, t2), we note that if τ = t1 − t2 > 0 and Y(t2) = 1, then Y(t1) = 1
if the number of points in the interval (t2, t1) is even. Thus

P[Y(t1) = 1 | Y(t2) = 1] = e^{−λτ} cosh(λτ),   τ = t1 − t2                  (5-21a)

Multiplying by P[Y(t2) = 1] gives

P[Y(t1) = 1, Y(t2) = 1] = e^{−λτ} cosh(λτ) · e^{−λt2} cosh(λt2)             (5-21b)

Going through all the combinations, we find that

RYY(t1, t2) = e^{−2λ|t1 − t2|}                                              (5-21c)

42
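A simulation check of (5-21c) (illustrative λ, t1, t2): build the Poisson counts at the two instants from independent increments, form Y = (−1)^X, and compare the sample mean of Y(t1)Y(t2) with e^{−2λ|t1 − t2|}.

```python
import numpy as np

# Check R_YY(t1, t2) = exp(-2*lam*|t1 - t2|)  (Eq. 5-21c) for Y(t) = (-1)**X(t).
rng = np.random.default_rng(8)
lam, t1, t2, n = 2.0, 0.8, 1.3, 500_000      # assumed rate and time instants (t2 > t1)

X_t1 = rng.poisson(lam * t1, size=n)
X_t2 = X_t1 + rng.poisson(lam * (t2 - t1), size=n)    # independent increment
Y1, Y2 = (-1.0)**X_t1, (-1.0)**X_t2

print(np.mean(Y1 * Y2), np.exp(-2.0 * lam * abs(t1 - t2)))   # both ~ exp(-2) ~ 0.135
```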

Estimation of the Mean of a Random Process


Often the joint PDF/PMF is not available.
One way to estimate the statistics of the underlying RP is to use the
sample functions.
Given a single realization {xt} of a stationary process {Xt}, a natural
estimator of the mean E(Xt) = μ is the sample mean

x̄ = (1/n) Σ_{t=1}^{n} xt                                                   (5-22)

which is the time average of n observations.

It can be shown that the sample mean is an unbiased and consistent
estimator of μ.

43

Estimation of the Autocovariance Function


Given a single realization {xt} of a stationary process {Xt}, the
sample autocovariance function, given by

γ̂(k) = (1/n) Σ_{t=1}^{n−k} (xt − x̄)(xt+k − x̄)                             (5-23)

is an estimate of the autocovariance function.

Similarly, the autocorrelation can be estimated by

ρ̂(k) = Σ_{t=1}^{n−k} (xt − x̄)(xt+k − x̄) / Σ_{t=1}^{n} (xt − x̄)² = γ̂(k) / γ̂(0).      (5-24)

44

Sample Autocorrelation Function


For a given time series {xt}, the sample autocorrelation function is
given by

ρ̂(k) = Σ_{t=1}^{n−k} (xt − x̄)(xt+k − x̄) / Σ_{t=1}^{n} (xt − x̄)² = γ̂(k) / γ̂(0).      (5-25)

The sample autocorrelation function is non-negative definite.


The sample autocovariance and autocorrelation functions have the
same properties as the autocovariance and autocorrelation function
of the entire process.

45
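A small sketch of (5-25) on a simulated series (the AR(1) test signal xt = 0.7 xt−1 + zt is an assumed example, chosen because its true autocorrelation is 0.7^k): the sample autocorrelation tracks the true values.

```python
import numpy as np

# Sample autocorrelation, Eq. (5-25), computed for a simulated AR(1) series
# x_t = phi*x_{t-1} + z_t, whose true autocorrelation is rho(k) = phi**k.
rng = np.random.default_rng(9)
n, phi = 50_000, 0.7
z = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + z[t]

def sample_rho(x, k):
    """rho_hat(k) = sum (x_t - xbar)(x_{t+k} - xbar) / sum (x_t - xbar)^2."""
    xm = x - x.mean()
    return np.sum(xm[:len(x) - k] * xm[k:]) / np.sum(xm**2)

for k in range(5):
    print(k, round(sample_rho(x, k), 3), round(phi**k, 3))   # estimate vs. 0.7**k
```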

Complex Process and Vector Processes


Definitions

The complex process Z(t) = X(t) + jY(t) is specified in terms of the


joint statistics of the real processes X(t) and Y(t).

The vector process (n-dimensional process) is a family of n


stochastic processes.

46

Stochastic Continuity

47

Stochastic Mean-Square Continuity

(5-26)

48

Stochastic Mean-Square Continuity -2

49

Stochastic Continuity

(5-27)

50

Stochastic Continuity
Almost-Surely Continuous

(5-28)

51

Stochastic Convergence
A random sequence or a discrete-time random process is a
sequence of random variables {X1(ζ), X2(ζ), …, Xn(ζ), …} =
{Xn(ζ)}.
For a specific ζ, {Xn(ζ)} is a sequence of numbers that might or
might not converge. The notion of convergence of a random
sequence can be given several interpretations.

52

Sure Convergence (Convergence Everywhere)


The sequence of random variables {Xn(ζ)} converges surely to the
random variable X(ζ) if the sequence of functions Xn(ζ) converges to
X(ζ) as n → ∞ for all ζ, i.e.,
Xn(ζ) → X(ζ) as n → ∞ for all ζ.

53

Stochastic Convergence

54

Almost-sure Convergence (Convergence with probability 1)

(5-29)

55

Almost-sure Convergence (Convergence with probability 1)

56

Mean-square Convergence

(5-30)

57

Convergence in Probability

(5-31)

Convergence in Distribution

59

Remarks

Convergence with probability one applies to the


individual realizations of the random process.
Convergence in probability does not.
The weak law of large numbers is an example of
convergence in probability.
The strong law of large numbers is an example of
convergence with probability 1.
The central limit theorem is an example of convergence
in distribution.

60

Weak Law of Large Numbers (WLLN)

(5-32)

61

Strong Law of Large Numbers (SLLN)

(5-33)

62

The Central Limit Theorem

(5-34)

63

Venn Diagram of Relation of Types of Convergence

Note that even sure


convergence may not imply
mean square convergence.

64
