Lecture13_RandomProcesses

Lecture 13 of ECE 3614 covers the significance of randomness in communication systems, emphasizing the modeling of signals and noise as random processes. It explains the concepts of random variables, sample functions, and the importance of time and ensemble averages in describing random processes. The lecture also introduces stationary and ergodic random processes, autocorrelation, and the relationship between power spectral density and random processes.


ECE 3614: Introduction to Communications Systems
Lecture 13: Random Processes

Lingjia Liu, Ph.D.

Associate Professor
Electrical and Computer Engineering
Virginia Tech, Blacksburg, VA
Importance of “Randomness”

● Random variables and processes let us talk about quantities and signals which are unknown in advance
● The data sent through a communication system is modeled as random
(otherwise no information is sent!)
● The noise, interference, and fading introduced by the channel can all
be modeled as random processes
● Even the measure of performance (probability of bit error) is
expressed in terms of a probability
● Reading: Section 4.3
Random Processes

● A random variable has a single value
● We are concerned with signals which change with time
● Random variables model unknown events
● Random processes model unknown signals
● Definition: A random process is an indexed set of functions of
some parameter (usually time) that has certain statistical
properties
● A random process can also be thought of as an indexed set of
random variables
● If $X(t)$ is a random process, then $X(1)$, $X(1.5)$, and $X(37.5)$ are all random variables for specific times $t$
Random Processes

● A specific instance of a random process is termed a sample function
● The value of a random process X(t1) is a random variable.
● Thus, a random process is an indexed set of random variables
that have a specific cross-correlation and distribution that are
determined by the underlying function
● We deal with ensemble averages and time averages
– An ensemble average is the expected value of all possible sample functions, each sampled at a specific time $t_0$
– A time average is the mean of a specific sample function over all time
Time vs. Ensemble Averages

● Time Average: hold the random variable constant and average over all time

$\overline{X(t)} = \lim_{T \to \infty} \dfrac{1}{T} \int_{-T/2}^{T/2} X(t)\,dt$

● Ensemble Average: hold time constant and average over all values of the random variable

$\overline{X(t)} = \int_{-\infty}^{\infty} x\, f_X(x)\,dx$, where the random variable is $x = X(t)$
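As a numerical sketch (not part of the lecture), the two averages can be estimated for a concrete process. Here I assume the random-phase sine $X(t) = A\sin(\omega_0 t + \theta)$ introduced later in this lecture, with assumed values $A = 1$ and $f_0 = 1$; for this process both averages come out near zero.

```python
import math
import random

random.seed(1)
A, w0 = 1.0, 2.0 * math.pi   # assumed amplitude and angular frequency (f0 = 1)

def sample_function(theta, t):
    """One sample function of the random-phase sine X(t) = A sin(w0*t + theta)."""
    return A * math.sin(w0 * t + theta)

# Time average: fix ONE realization of theta, average over a long time span.
theta0 = random.uniform(0.0, 2.0 * math.pi)
ts = [n * 0.01 for n in range(100_000)]          # t in [0, 1000)
time_avg = sum(sample_function(theta0, t) for t in ts) / len(ts)

# Ensemble average: fix ONE time t1, average over many realizations of theta.
t1 = 0.37
thetas = [random.uniform(0.0, 2.0 * math.pi) for _ in range(100_000)]
ens_avg = sum(sample_function(th, t1) for th in thetas) / len(thetas)
# For this process both averages are approximately zero.
```

That the two averages agree is exactly the ergodicity property discussed later in the lecture.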
Description of a Random Process

● To completely describe a random process we require an N-dimensional pdf

$f\left(X(t_1), X(t_2), X(t_3), \ldots, X(t_N)\right)$, where $N \to \infty$
● Most random processes of interest can be described more simply.
Example: Gaussian Random Process
[Figure: four sample functions of a Gaussian random process, plotted as voltage vs. time sample (0 to 100). Thermal noise is a Gaussian random process; the value at any time sample (e.g., t = 65) is a Gaussian random variable.]
Example: Sine with random phase

$x(t) = A\sin(\omega_0 t + \theta)$
● Let $A$ and $\omega_0$ be known
● $\theta$ is a random variable uniformly distributed on $[0, 2\pi)$
● $x_1(t) = A\sin(\omega_0 t + \pi/5)$ is a sample function
● The value at any time t1 is a random variable with distribution

$f(x) = \begin{cases} \dfrac{1}{\pi\sqrt{A^2 - x^2}} & |x| < A \\ 0 & \text{otherwise} \end{cases}$
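A quick numerical check of this density (a sketch, assuming $A = 1$): sampling $X(t_1)$ over many random phases, every sample must satisfy $|x| \le A$, and the empirical CDF should match $F(x) = 1/2 + \arcsin(x/A)/\pi$, the CDF corresponding to the density above.

```python
import math
import random

random.seed(2)
A = 1.0                                   # assumed amplitude
samples = [A * math.sin(random.uniform(0.0, 2.0 * math.pi))
           for _ in range(200_000)]       # values of X(t1) over random phases

# Every sample must lie in [-A, A].
in_range = max(abs(x) for x in samples) <= A

# Empirical CDF at x = A/2 vs. the CDF of the density above:
# F(x) = 1/2 + arcsin(x/A)/pi.
frac = sum(1 for x in samples if x <= 0.5 * A) / len(samples)
theory = 0.5 + math.asin(0.5) / math.pi
```

The density piles up probability near $\pm A$, where the sine spends most of its time, which is what the histogram in the next figure shows.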
Example (cntd.)
[Figure: four sample functions of the random-phase sine over 0 to 5 s, with $f_0 = 1$, $A = 1$. The value at t = 4 is the random variable x = X(t = 4); its probability distribution over $-1 \le x \le 1$ is shown alongside.]
Stationary Random Processes

● A stationary random process has statistical properties (joint pdfs) which do not change with time
$f\left(X(t_1), X(t_2), \ldots, X(t_N)\right) = f\left(X(t_1 + t_0), X(t_2 + t_0), \ldots, X(t_N + t_0)\right)$

– First order: $f_X(x_1)$, where $x_1 = x(t_1)$, does not depend on the value of $t_1$
– Second order: $f_{X_1 X_2}(x_1, x_2)$, where $x_1 = x(t_1)$, $x_2 = x(t_2)$, does not depend on the values of $t_1$ and $t_2$ but only on the difference $\tau = t_1 - t_2$
● A wide sense stationary (WSS) process has a mean and autocorrelation
function which do not change with time (this is usually sufficient for the
analysis of communication systems)
1. $E\left[x(t_1)\right] = \overline{X}$
2. $E\left[x(t_1)\,x(t_2)\right] = E\left[x(t)\,x(t+\tau)\right] = R_X(\tau)$
Ergodic Random Processes

● A random process is ergodic if the time average always converges to the statistical average.
– i.e., we can use time averages of a sample function to estimate the ensemble averages
– In real life we cannot obtain a sufficient number of sample functions, so we rely on time averages of a single sample function.
● Unless specified, we will assume that all random processes are WSS
and ergodic.
● Note that all ergodic processes are stationary, but not all stationary
processes are ergodic.
Description of Random Processes

● Knowing the pdf of individual samples of the random process is not sufficient. We also need to know how individual samples are related to each other (ideally the joint pdf)
● Two tools which are simpler but still give valuable information are
available to describe this relationship:
– Autocorrelation function
– Power spectral density function
Autocorrelation

● Autocorrelation measures how a random process changes with time.
● Intuitively, $X(1)$ and $X(1.1)$ will be more strongly related than $X(1)$ and $X(100000)$ (although it is possible to construct counterexamples). The autocorrelation function quantifies this.
● Definition:

$R_X(t_1, t_2) = E\left[x(t_1)\,x(t_2)\right] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\, f_X(x_1, x_2)\,dx_1\,dx_2$
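For the random-phase sine $x(t) = A\sin(\omega_0 t + \theta)$ from the earlier example, this ensemble average works out to $R_X(t_1, t_2) = (A^2/2)\cos(\omega_0(t_1 - t_2))$, a function of the time difference only. A minimal Monte-Carlo check (a sketch with assumed values, not lecture code):

```python
import math
import random

random.seed(3)
A, w0 = 1.0, 2.0 * math.pi   # assumed amplitude and angular frequency (f0 = 1)

def R_estimate(t1, t2, trials=200_000):
    """Ensemble average of x(t1)*x(t2) over the uniform random phase theta."""
    acc = 0.0
    for _ in range(trials):
        th = random.uniform(0.0, 2.0 * math.pi)
        acc += A * math.sin(w0 * t1 + th) * A * math.sin(w0 * t2 + th)
    return acc / trials

R0 = R_estimate(0.2, 0.2)    # tau = 0:     (A^2/2) cos(0)      = 0.5
Rq = R_estimate(0.2, 0.45)   # tau = -0.25: (A^2/2) cos(pi/2)   = 0
```

Sliding $t_1$ and $t_2$ together leaves the estimate unchanged, which is the WSS property discussed on the next slide.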
Autocorrelation

● A wide-sense stationary (WSS) process is one in which the first and second ensemble averages are stationary, i.e., independent of time
● For a WSS process, the autocorrelation does not depend on the exact values of $t_1$ and $t_2$; rather, it depends only on the difference $\tau = t_1 - t_2$

$R_X(t_1, t_2) = R_X(\tau) = E\left[X(t)\,X(t+\tau)\right]$

● Note that the signal power is $R_X(0)$
– Recall that power is simply the mean squared value
Properties of WSS Processes

● $R_X(0) = \overline{x^2(t)}$, i.e., $R_X(0)$ is the power
● $R_X(\tau) = R_X(-\tau)$: $R_X(\tau)$ is an even function of the time offset $\tau$
● $R_X(0) \ge |R_X(\tau)|$: the correlation peaks at zero time offset
Spectra of Random Processes

● We are interested in random processes whose properties do not change with time
– Stationary random processes
● If a random process is stationary, it must last forever (there is no
time when its properties change so the probabilities cannot go to
zero)
● A signal which lasts forever (i.e., infinite in time) is a power
signal
● Thus, we can describe the spectral properties through the
power spectral density
Spectra of Random Processes

● The power spectral density $S_X(f)$ of a random process can be determined from the autocorrelation function $R_X(\tau)$ through the Fourier Transform.
● In other words, they form a Fourier Transform pair:

$S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j2\pi f\tau}\,d\tau \quad\Longleftrightarrow\quad R_X(\tau) = \int_{-\infty}^{\infty} S_X(f)\, e^{j2\pi f\tau}\,df$
● The Fourier Transform of an individual sample function is not particularly helpful
● However, if we average the magnitude squared of the Fourier Transform of all possible sample functions, we approach the Power Spectral Density
For more details, please read Section 4.3
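The averaging idea in the last bullet can be sketched directly (an assumed toy example, not lecture code): average $|\mathrm{DFT}|^2$ over many sample functions of a random-phase sine placed exactly at DFT bin $k_0$. The averaged spectrum then peaks at that bin regardless of the phase of any individual sample function.

```python
import cmath
import math
import random

random.seed(4)
Nt, M = 64, 100          # DFT length and number of sample functions (assumed)
A, k0 = 1.0, 8           # sine placed exactly at DFT bin k0 (assumed)
avg_power = [0.0] * Nt   # running average of |X_k|^2 / Nt over sample functions

for _ in range(M):
    theta = random.uniform(0.0, 2.0 * math.pi)
    x = [A * math.sin(2.0 * math.pi * k0 * n / Nt + theta) for n in range(Nt)]
    for k in range(Nt):
        Xk = sum(x[n] * cmath.exp(-2j * math.pi * k * n / Nt) for n in range(Nt))
        avg_power[k] += abs(Xk) ** 2 / (M * Nt)

# The averaged spectrum peaks at bin k0 (and its mirror Nt - k0).
peak = max(range(Nt // 2), key=lambda k: avg_power[k])
```

A single sample function's DFT carries an arbitrary phase; averaging the magnitude squared is what removes that dependence.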
Filtering

● If we filter a random process X(t), the output Y(t) is also a random process
● The power spectral density of the resulting random process SY(f) can
be determined as
$S_Y(f) = S_X(f)\,\left|H(f)\right|^2$

where H(f) is the transfer function of the filter
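A time-domain sanity check of this relation (a sketch with assumed values): for discrete white noise, $S_X(f)$ is flat at $\sigma^2$, so integrating $S_Y(f) = S_X(f)|H(f)|^2$ over frequency predicts an output power of $\sigma^2 \sum_n h[n]^2$ for an FIR filter $h$ (by Parseval).

```python
import random

random.seed(5)
h = [0.5, 0.5]                         # assumed 2-tap FIR averaging filter
x = [random.gauss(0.0, 1.0) for _ in range(200_000)]   # white noise, sigma^2 = 1

# Convolve the noise with h (steady-state samples only).
y = [h[0] * x[n] + h[1] * x[n - 1] for n in range(1, len(x))]

# Output power vs. the prediction sigma^2 * sum(h^2) = integral of SX |H|^2.
out_power = sum(v * v for v in y) / len(y)
predicted = sum(c * c for c in h)      # 0.5
```

The filter also colors the noise: adjacent output samples become correlated, which shows up as a non-flat $S_Y(f)$.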


Gaussian Random Processes
● A GRP is a collection of N random variables with distribution
$f_{\mathbf{x}}(\mathbf{x}) = \dfrac{1}{(2\pi)^{N/2}\,(\det \mathbf{C})^{1/2}}\, e^{-\frac{1}{2}(\mathbf{x}-\mathbf{m})^T \mathbf{C}^{-1} (\mathbf{x}-\mathbf{m})}$

● $\mathbf{x}$ is an $N$-dimensional Gaussian random variable, $\mathbf{m}$ is the mean vector, and $\mathbf{C}$ is the covariance matrix.
● Gaussian Random Processes have several special properties:
– If a Gaussian random process is wide-sense stationary, then it is
also stationary.
– Any sample point from a Gaussian random process is a Gaussian
random variable
– If the input to a linear system is a Gaussian random process, then
the output is also a Gaussian random process
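The density above is straightforward to sample from: factor $\mathbf{C} = \mathbf{L}\mathbf{L}^T$ (Cholesky) and set $\mathbf{x} = \mathbf{m} + \mathbf{L}\mathbf{z}$ with $\mathbf{z}$ standard normal. A 2-D sketch with assumed values for $\mathbf{m}$ and $\mathbf{C}$:

```python
import math
import random

random.seed(6)
# Assumed example: N = 2, mean vector m and covariance matrix C.
m = [0.0, 1.0]
C = [[1.0, 0.6],
     [0.6, 1.0]]

# 2x2 Cholesky factor L with C = L L^T.
l11 = math.sqrt(C[0][0])
l21 = C[1][0] / l11
l22 = math.sqrt(C[1][1] - l21 * l21)

N = 200_000
xs = []
for _ in range(N):
    z1, z2 = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    xs.append((m[0] + l11 * z1,              # x = m + L z is N(m, C)
               m[1] + l21 * z1 + l22 * z2))

# The empirical cross-covariance should approach C[0][1] = 0.6.
cov01 = sum((a - m[0]) * (b - m[1]) for a, b in xs) / N
```

Because $\mathbf{x}$ is a linear map of $\mathbf{z}$, this construction is also a small illustration of the last property above: linear operations on Gaussian inputs give Gaussian outputs.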
Complex Random Processes
● Complex random processes are useful for describing bandpass
signals in complex baseband notation. A complex random process is
represented as
– $g(t) = x(t) + j\,y(t)$
where $x(t)$ and $y(t)$ are real random processes
● Complex random processes are similar to real random processes with
minor differences in definitions, e.g.,

$R_g(t_1, t_2) = E\left[g^*(t_1)\,g(t_2)\right]$
● A WSS complex random process is one in which

$R_g(t_1, t_2) = R_g(\tau)$
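For a concrete check (a sketch, assuming the complex exponential $g(t) = A e^{j(\omega_0 t + \theta)}$ with uniform random phase): the conjugate product $g^*(t_1)\,g(t_2)$ equals $A^2 e^{j\omega_0(t_2 - t_1)}$ in every realization, since the random phase cancels, so $R_g$ depends only on the time difference and the process is WSS.

```python
import cmath
import math
import random

random.seed(7)
A, w0 = 1.0, 2.0 * math.pi    # assumed amplitude and angular frequency
t1, tau = 0.3, 0.25
N = 50_000
acc = 0.0 + 0.0j
for _ in range(N):
    th = random.uniform(0.0, 2.0 * math.pi)
    g1 = A * cmath.exp(1j * (w0 * t1 + th))
    g2 = A * cmath.exp(1j * (w0 * (t1 + tau) + th))
    acc += g1.conjugate() * g2     # theta cancels in every single term
Rg = acc / N
theory = A**2 * cmath.exp(1j * w0 * tau)
```

Note the conjugate in the definition is what makes the phase cancel; without it the average over $\theta$ would be zero.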
Summary

● Random variables can be used to form models of a communications system
● Random processes are used to model random signals which represent the
waveforms in a communication system.
● Random processes are simply collections of indexed random variables
● Sampling random processes results in random variables. We will typically
work with random variables.
● However, it is important for you to understand that these random variables
come from random processes.
● Thus, it is important that you have a good understanding of random
variables (and in more advanced study random processes) to fully grasp
communications systems
