Course 09-2 - Discrete Time Random Signals
Until now, we have assumed that signals are deterministic, i.e., each value of a sequence is uniquely determined. In many situations, the processes that generate signals are so complex as to make a precise description of the signal extremely difficult or undesirable. A random (or stochastic) signal is instead characterized by a set of probability density functions.
A continuous-time random signal (or random process) is a signal x(t) whose value at each time point is a random variable. Random signals appear often in real life. Examples include:
1. The noise heard from a radio receiver that is not tuned to an operating channel.
2. The noise heard from a helicopter rotor.
3. Electrical signals recorded from a human brain through electrodes in contact with the skull (these are called electroencephalograms, or EEGs).
4. Mechanical vibrations sensed in a vehicle moving on rough terrain.
5. Angular motion of a boat in the sea caused by waves and wind.
6. Television signals.
7. Radar signals.
Stochastic Processes
A random process is an indexed family of random variables characterized by a set of probability distribution functions: a sequence x[n], $-\infty < n < \infty$. Each individual sample x[n] is assumed to be an outcome of some underlying random variable $X_n$. The difference between a single random variable and a random process is that for a random variable the outcome of a random sampling experiment is mapped into a number, whereas for a random process the outcome is mapped into a sequence.
Probability density function of x[n]: $p(x_n, n)$. Joint distribution of x[n] and x[m]: $p(x_n, n, x_m, m)$.
E.g., $x_1[n] = A_n \cos(\omega n + \phi_n)$, where $A_n$ and $\phi_n$ are random variables for all $-\infty < n < \infty$; then $x_1[n]$ is a random process.
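As an illustration (not part of the original notes), the following minimal NumPy sketch draws sample sequences of a random process of this form; the distributions chosen for $A_n$ and $\phi_n$ are arbitrary assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_sequence(n_samples=64, w=0.2 * np.pi):
    """Draw one realization x1[n] = A_n * cos(w*n + phi_n).

    A_n and phi_n are random variables; here we (arbitrarily)
    take A_n ~ N(0, 1) and phi_n ~ Uniform[0, 2*pi), drawn
    independently at every n.
    """
    n = np.arange(n_samples)
    A = rng.standard_normal(n_samples)            # random amplitudes
    phi = rng.uniform(0.0, 2 * np.pi, n_samples)  # random phases
    return A * np.cos(w * n + phi)

# Each call maps a new experimental outcome to a whole sequence,
# illustrating the difference from a single random variable.
x_a = sample_sequence()
x_b = sample_sequence()
print(x_a[:5])
print(x_b[:5])
```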
The samples x[n] and x[m] are statistically independent if their joint pdf factors:
$$p(x_n, n, x_m, m) = p(x_n, n)\, p(x_m, m)$$
Stationary
A random process is stationary if
$$p(x_{n+k}, n+k, x_{m+k}, m+k) = p(x_n, n, x_m, m)$$
for all k. That is, the joint distribution of x[n] and x[m] depends only on the time difference $m - n$.
Stationary (continued)
In particular, the above definition also applies in the case m = n. Hence, a stationary random process should also satisfy
$$p(x_{n+k}, n+k) = p(x_n, n)$$
That is, the pdf of a stationary process does not vary with the time index n; the statistics of x[n] are shift invariant.
In many applications of discrete-time signal processing, random processes serve as models for signals in the sense that a particular signal can be considered a sample sequence of a random process. Although such signals are unpredictable, making a deterministic approach to signal representation inappropriate, certain average properties of the ensemble can be determined, given the probability law of the process.
Statistics: Expectation
Mean (expectation):
$$m_{x_n} = \mathcal{E}\{x_n\} = \int_{-\infty}^{\infty} x_n\, p(x_n, n)\, dx_n$$
In general the mean depends on n.

If $x_n$ and $y_m$ are independent, then
$$\mathcal{E}\{x_n y_m\} = \mathcal{E}\{x_n\}\, \mathcal{E}\{y_m\}$$

Mean square:
$$\mathcal{E}\{|x_n|^2\} = \int_{-\infty}^{\infty} |x_n|^2\, p(x_n, n)\, dx_n$$

Variance:
$$\sigma_{x_n}^2 = \mathrm{var}\{x_n\} = \mathcal{E}\{|x_n - m_{x_n}|^2\}$$
Autocorrelation:
$$\phi_{xx}[n, m] = \mathcal{E}\{x_n\, x_m^*\}$$

Autocovariance:
$$\gamma_{xx}[n, m] = \mathcal{E}\{(x_n - m_{x_n})(x_m - m_{x_m})^*\} = \phi_{xx}[n, m] - m_{x_n}\, m_{x_m}^*$$
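As a concrete illustration (added here, not in the original notes), this NumPy sketch estimates these ensemble averages by averaging over many realizations; the single-random-phase sinusoid used as the process is an assumed example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ensemble of realizations of a random-phase sinusoid;
# the distribution chosen is an assumption for the demo.
N, R = 64, 5000                           # sequence length, realizations
n = np.arange(N)
w = 0.2 * np.pi
phi = rng.uniform(0, 2 * np.pi, (R, 1))   # one random phase per realization
X = np.cos(w * n + phi)                   # shape (R, N)

mean_n = X.mean(axis=0)                   # estimate of m_{x_n} for each n
# Autocorrelation phi_xx[n, m] = E{x_n x_m} estimated by averaging
# over the ensemble (real-valued process, so no conjugate needed).
phi_nm = X.T @ X / R                      # shape (N, N)

print(np.max(np.abs(mean_n)))               # ~0: the mean is (approx.) zero
print(phi_nm[10, 14], 0.5 * np.cos(w * 4))  # depends only on m - n = 4
```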
Stationary Process
According to the definition of a stationary process, the autocorrelation of a stationary process depends only on the time difference $m - n$. Hence, for a stationary process we have
$$m_x = m_{x_n} = \mathcal{E}\{x_n\} \quad \text{(independent of } n\text{)}$$
$$\sigma_x^2 = \mathcal{E}\{|x_n - m_x|^2\} \quad \text{(independent of } n\text{)}$$
$$\phi_{xx}(n+k, n) = \phi_{xx}(k) = \mathcal{E}\{x_{n+k}\, x_n^*\}$$
Wide-sense Stationary
In many instances, we encounter random processes that are not stationary in the strict sense that the pdf remains the same for all time. Instead, only the statistics (usually up to second order) are invariant with time. Relaxing the definition, if the following equations hold, we call the process wide-sense stationary (w.s.s.).
$$m_x = m_{x_n} = \mathcal{E}\{x_n\} \quad \text{(independent of } n\text{)}$$
$$\sigma_x^2 = \mathcal{E}\{|x_n - m_x|^2\} \quad \text{(independent of } n\text{)}$$
$$\phi_{xx}(n+k, n) = \phi_{xx}(k) = \mathcal{E}\{x_{n+k}\, x_n^*\}$$
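As a worked check (not in the original notes), consider the classic random-phase sinusoid $x[n] = A\cos(\omega_0 n + \theta)$ with $A$ and $\omega_0$ fixed and $\theta$ uniform on $[0, 2\pi)$; both w.s.s. conditions can be verified directly:

```latex
% x[n] = A cos(w0*n + theta), theta ~ Uniform[0, 2*pi)
\begin{align*}
\mathcal{E}\{x[n]\}
  &= \frac{A}{2\pi}\int_0^{2\pi}\cos(\omega_0 n + \theta)\,d\theta = 0
  \quad\text{(independent of } n\text{)}\\
\phi_{xx}(n+k,\,n)
  &= \frac{A^2}{2\pi}\int_0^{2\pi}\cos\bigl(\omega_0 (n+k) + \theta\bigr)
     \cos(\omega_0 n + \theta)\,d\theta\\
  &= \frac{A^2}{2}\cos(\omega_0 k)
  \quad\text{(depends only on the lag } k\text{)}
\end{align*}
```

The cross term $\cos(\omega_0(2n+k) + 2\theta)$ integrates to zero over a full period of $\theta$, which is what removes the dependence on n.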
Example of a stationary signal
The following random signal is w.s.s. As we see, a w.s.s. signal looks more or less the same over different time intervals: although its detailed form varies, its overall (or macroscopic) shape does not.
Example of a nonstationary signal
An example of a random signal that is not stationary is a seismic wave during an earthquake. As we see, the amplitude of the wave shortly before the beginning of the earthquake is small. At the start of the earthquake the amplitude grows suddenly, sustains its amplitude for a certain time, then decays.
Magnitude of the Fourier transform of the previous w.s.s. signal segment. Taking the Fourier transform of a random signal directly conveys little information. Later, we will see the power density spectrum, which better captures the frequency-domain behavior of random signals.
Time Averages
For any single sample sequence x[n], define its time average to be
$$\langle x[n] \rangle = \lim_{L\to\infty} \frac{1}{2L+1} \sum_{n=-L}^{L} x[n]$$
and similarly the time-averaged autocorrelation
$$\langle x[n+m]\, x[n] \rangle = \lim_{L\to\infty} \frac{1}{2L+1} \sum_{n=-L}^{L} x[n+m]\, x[n]$$
Ergodic Process
The time average is defined for a single sequence sampled from the random process. A stationary random process for which time averages equal ensemble averages is called an ergodic process:
$$\langle x[n] \rangle = m_x$$
$$\langle x[n+m]\, x[n] \rangle = \phi_{xx}[m]$$
It is common to assume that a given sequence is a sample sequence of an ergodic random process, so that averages can be computed from a single sequence.
In practice, we cannot compute the limits above; instead, we compute the quantities on the right-hand side over a finite number of samples:
$$\hat{m}_x = \frac{1}{L} \sum_{n=0}^{L-1} x[n]$$
$$\hat{\sigma}_x^2 = \frac{1}{L} \sum_{n=0}^{L-1} \bigl(x[n] - \hat{m}_x\bigr)^2$$
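For instance, a minimal NumPy sketch (illustrative, not from the notes) of these finite-sum estimates on a single realization; the moving-average process used here is an assumed example with known statistics ($\sigma_x^2 = 1/3$, $\phi_{xx}[1] = 2/9$).

```python
import numpy as np

rng = np.random.default_rng(2)

# Single long realization of an assumed ergodic process: here a
# zero-mean moving average of white noise (choice is illustrative).
L = 100_000
v = rng.standard_normal(L + 2)
x = (v[2:] + v[1:-1] + v[:-2]) / 3.0       # simple MA process

m_hat = x.mean()                            # finite-sum estimate of m_x
var_hat = np.mean((x - m_hat) ** 2)         # estimate of sigma_x^2

def phi_hat(x, m):
    """Time-average estimate of phi_xx[m] from one sequence."""
    return np.mean(x[m:] * x[:len(x) - m]) if m > 0 else np.mean(x * x)

print(m_hat)                # ~0
print(var_hat)              # ~ 3/9 = 1/3 for this MA process
print(phi_hat(x, 1))        # ~ 2/9, the true lag-1 autocorrelation
```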
Properties of Correlation and Covariance Sequences
Definition: for a w.s.s. process,
$$\phi_{xx}[m] = \mathcal{E}\{x_{n+m}\, x_n^*\}, \qquad \phi_{xy}[m] = \mathcal{E}\{x_{n+m}\, y_n^*\}$$
Property 1:
$$\gamma_{xx}[m] = \phi_{xx}[m] - |m_x|^2, \qquad \gamma_{xy}[m] = \phi_{xy}[m] - m_x\, m_y^*$$
Property 2:
$$\phi_{xx}[0] = \mathcal{E}\{|x_n|^2\} \ \text{(mean square)}, \qquad \gamma_{xx}[0] = \sigma_x^2 \ \text{(variance)}$$
Property 3:
$$\phi_{xx}[-m] = \phi_{xx}^*[m], \qquad \gamma_{xx}[-m] = \gamma_{xx}^*[m]$$
$$\phi_{xy}[-m] = \phi_{yx}^*[m], \qquad \gamma_{xy}[-m] = \gamma_{yx}^*[m]$$
(In the real case, $\phi_{xx}[-m] = \phi_{xx}[m]$ and $\gamma_{xx}[-m] = \gamma_{xx}[m]$.)
Property 4:
$$|\phi_{xx}[m]| \le \phi_{xx}[0], \qquad |\gamma_{xx}[m]| \le \gamma_{xx}[0]$$
Property 5: if $y_n = x_{n - n_0}$, then
$$\phi_{yy}[m] = \phi_{xx}[m], \qquad \gamma_{yy}[m] = \gamma_{xx}[m]$$
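To make the properties concrete, here is a small NumPy spot-check (an illustration added to these notes, not part of the original) of Properties 1 and 4 using time-average estimates; the filtered-noise process with nonzero mean is an assumed example.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed example: a real w.s.s. process with nonzero mean,
# built by filtering white noise and adding a constant offset.
L = 200_000
x = 2.0 + np.convolve(rng.standard_normal(L), [1.0, 0.5], mode="valid")

m_hat = x.mean()

def phi_hat(m):                       # estimate of phi_xx[m]
    return np.mean(x[m:] * x[:len(x) - m]) if m else np.mean(x * x)

def gamma_hat(m):                     # estimate of gamma_xx[m]
    y = x - m_hat
    return np.mean(y[m:] * y[:len(y) - m]) if m else np.mean(y * y)

for m in range(4):
    # Property 1: gamma_xx[m] ~= phi_xx[m] - m_x^2
    print(m, round(gamma_hat(m), 4), round(phi_hat(m) - m_hat**2, 4))

# Property 4: the zero lag dominates
assert all(abs(phi_hat(m)) <= phi_hat(0) for m in range(1, 10))
```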
Since autocorrelation and autocovariance sequences are (aperiodic) one-dimensional sequences, their Fourier transforms exist and are bounded. Let the Fourier transforms of the autocorrelation and autocovariance sequences be
$$\phi_{xx}[m] \;\leftrightarrow\; \Phi_{xx}(e^{j\omega}), \qquad \gamma_{xx}[m] \;\leftrightarrow\; \Gamma_{xx}(e^{j\omega})$$
$$\phi_{xy}[m] \;\leftrightarrow\; \Phi_{xy}(e^{j\omega}), \qquad \gamma_{xy}[m] \;\leftrightarrow\; \Gamma_{xy}(e^{j\omega})$$

Consequently, denote $P_{xx}(\omega) = \Phi_{xx}(e^{j\omega})$; it is called the power spectral density, power density spectrum, or power spectrum of the random process x.
Hence, we have
$$\mathcal{E}\{|x[n]|^2\} = \frac{1}{2\pi} \int_{-\pi}^{\pi} P_{xx}(\omega)\, d\omega$$
The total area under the power density over $[-\pi, \pi]$ is the total average power of the signal. $P_{xx}(\omega)$ is always real-valued, since $\phi_{xx}[m]$ is conjugate symmetric. For real-valued random processes, $P_{xx}(\omega) = \Phi_{xx}(e^{j\omega})$ is both real and even.
In addition, from
$$\mathcal{E}\{|x[n]|^2\} = \frac{1}{2\pi} \int_{-\pi}^{\pi} P_{xx}(\omega)\, d\omega,$$
$P_{xx}(\omega)$ can be treated as the density, at frequency $\omega$, of the total average power. Integrating this density from $-\pi$ to $\pi$ then yields the total average power of a w.s.s. random signal. This is why $P_{xx}(\omega)$ is called the power spectral density.
Power spectral density of the previous w.s.s. signal segment, i.e., the Fourier transform of its autocorrelation function. It looks smoother and better behaved than the direct Fourier transform.
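The sketch below (an added illustration, not from the notes) estimates a PSD this way: it computes a time-average autocorrelation estimate and takes its Fourier transform. The MA(1) process and the lag cutoff M are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(4)

# Estimate the power spectral density as the Fourier transform
# of a time-average autocorrelation estimate (a "correlogram").
L, M = 20_000, 64                        # signal length, max lag kept
x = np.convolve(rng.standard_normal(L), [1.0, -0.8], mode="valid")

# Biased autocorrelation estimate for lags 0..M
phi = np.array([np.dot(x[m:], x[:len(x) - m]) / len(x)
                for m in range(M + 1)])

# P_xx(w) = phi[0] + 2 * sum_{m>=1} phi[m] cos(w*m)
# (real and even, since the process is real; see the properties above)
w = np.linspace(-np.pi, np.pi, 256)
P_xx = phi[0] + 2 * np.sum(
    phi[1:, None] * np.cos(np.outer(np.arange(1, M + 1), w)), axis=0)

# True PSD of this MA(1) process is |1 - 0.8 e^{-jw}|^2; compare a point:
print(P_xx[128], abs(1 - 0.8 * np.exp(-1j * w[128])) ** 2)
```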
Consider a linear system with impulse response h[n]. If x[n] is a stationary random signal with mean $m_x$, then the output y[n] is also a stationary random signal with mean
$$m_y = \mathcal{E}\{y[n]\} = \sum_{k=-\infty}^{\infty} h[k]\, m_x = m_x \sum_{k=-\infty}^{\infty} h[k] = m_x\, H(e^{j0})$$
If x[n] is a real and stationary random signal, the autocorrelation function of the output process is
$$\phi_{yy}[n, n+m] = \mathcal{E}\{y[n]\, y[n+m]\} = \sum_{k=-\infty}^{\infty} \sum_{r=-\infty}^{\infty} h[k]\, h[r]\, \mathcal{E}\{x[n-k]\, x[n+m-r]\}$$
Since x[n] is stationary, $\mathcal{E}\{x[n-k]\, x[n+m-r]\}$ depends only on the time difference $m + k - r$. Therefore,
$$\phi_{yy}[n, n+m] = \sum_{k=-\infty}^{\infty} h[k] \sum_{r=-\infty}^{\infty} h[r]\, \phi_{xx}[m + k - r] = \phi_{yy}[m]$$
The output autocorrelation depends only on the lag m, so the output is again stationary. Generally, for an LTI system with a wide-sense stationary input, the output is also wide-sense stationary.
By substituting $l = r - k$,
$$\phi_{yy}[m] = \sum_{l=-\infty}^{\infty} \phi_{xx}[m - l]\, c_{hh}[l]$$
where
$$c_{hh}[l] = \sum_{k=-\infty}^{\infty} h[k]\, h[l + k]$$
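A quick numerical check of this relation (an added illustration, not from the notes): for a unit-variance white input, $\phi_{xx}[m] = \delta[m]$, so the theory predicts $\phi_{yy}[m] = c_{hh}[m]$. The short filter h is an assumed example.

```python
import numpy as np

rng = np.random.default_rng(5)

# Check that the output autocorrelation equals the input
# autocorrelation convolved with c_hh[l], the deterministic
# autocorrelation of h[n].
h = np.array([1.0, 0.5, -0.25])           # assumed example filter
L = 400_000
x = rng.standard_normal(L)                # white input: phi_xx[m] = delta[m]
y = np.convolve(x, h, mode="valid")

def phi_hat(s, m):
    return np.mean(s[m:] * s[:len(s) - m]) if m else np.mean(s * s)

c_hh = np.correlate(h, h, mode="full")    # c_hh[l] for l = -2..2

# With phi_xx[m] = delta[m], theory gives phi_yy[m] = c_hh[m].
for m in range(3):
    print(m, round(phi_hat(y, m), 4), c_hh[len(h) - 1 + m])
```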
In the frequency domain,
$$\Phi_{yy}(e^{j\omega}) = C_{hh}(e^{j\omega})\, \Phi_{xx}(e^{j\omega})$$
where $C_{hh}(e^{j\omega})$ is the Fourier transform of $c_{hh}[l]$. For real h, $c_{hh}[l]$ is the deterministic autocorrelation of h[n]:
$$c_{hh}[l] = h[l] * h[-l]$$
Thus
$$C_{hh}(e^{j\omega}) = H(e^{j\omega})\, H^*(e^{j\omega}) = |H(e^{j\omega})|^2$$
We therefore have the following relation between the input and output power spectra:
$$\Phi_{yy}(e^{j\omega}) = |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})$$
$$\mathcal{E}\{|x[n]|^2\} = \phi_{xx}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, d\omega = \text{total average power of the input}$$
$$\mathcal{E}\{|y[n]|^2\} = \phi_{yy}[0] = \frac{1}{2\pi} \int_{-\pi}^{\pi} |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})\, d\omega = \text{total average power of the output}$$
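The following sketch (added for illustration, not from the notes) verifies the output-power formula numerically for a unit-variance white input, where $\Phi_{xx}(e^{j\omega}) = 1$ and the frequency-domain integral reduces to $\frac{1}{2\pi}\int |H|^2\, d\omega$. The three-tap filter is an assumed example.

```python
import numpy as np

rng = np.random.default_rng(6)

# Check E{|y[n]|^2} = (1/2pi) * integral of |H|^2 * Phi_xx dw
# for unit-variance white input (Phi_xx = 1).
h = np.array([0.5, 1.0, 0.5])              # assumed example filter
x = rng.standard_normal(500_000)
y = np.convolve(x, h, mode="valid")

power_time = np.mean(y ** 2)               # phi_yy[0] from the signal

# Evaluate H(e^{jw}) on a uniform grid over [-pi, pi); the mean of
# |H|^2 over this grid equals (1/2pi) * integral of |H|^2 dw.
w = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
H = np.sum(h[:, None] * np.exp(-1j * np.outer(np.arange(len(h)), w)), axis=0)
power_freq = np.mean(np.abs(H) ** 2)

print(power_time, power_freq)              # both ~ 0.25 + 1 + 0.25 = 1.5
```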
We have seen that $P_{xx}(\omega) = \Phi_{xx}(e^{j\omega})$ can be viewed as a density. Key property: the area over a band of frequencies, $\omega_a < |\omega| < \omega_b$, is proportional to the power of the signal in that band. To show this, we can use the linear-system property above. Consider an ideal band-pass filter: let $H(e^{j\omega})$ be the frequency response of the ideal band-pass filter for the band $\omega_a < |\omega| < \omega_b$. Note that $|H(e^{j\omega})|^2$ and $\Phi_{xx}(e^{j\omega})$ are both even functions. Hence,
$$\mathcal{E}\{|y[n]|^2\} = \frac{1}{2\pi}\int_{\omega_a}^{\omega_b} |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})\, d\omega + \frac{1}{2\pi}\int_{-\omega_b}^{-\omega_a} |H(e^{j\omega})|^2\, \Phi_{xx}(e^{j\omega})\, d\omega$$
Since the ideal band-pass filter has unit gain over its band and both integrands are even, this equals
$$\frac{1}{\pi}\int_{\omega_a}^{\omega_b} \Phi_{xx}(e^{j\omega})\, d\omega,$$
so the area of $\Phi_{xx}(e^{j\omega})$ over the band is proportional to the power of the signal in that band.
White Noise
A white-noise process is one whose power spectrum is constant over all frequencies:
$$\Phi_{xx}(e^{j\omega}) = \sigma_x^2$$
The average power of white noise is therefore
$$\phi_{xx}[0] = \frac{1}{2\pi}\int_{-\pi}^{\pi} \Phi_{xx}(e^{j\omega})\, d\omega = \frac{1}{2\pi}\int_{-\pi}^{\pi} \sigma_x^2\, d\omega = \sigma_x^2$$
White noise is also useful in the representation of random signals whose power spectra are not constant with frequency.
A random signal y[n] with power spectrum $\Phi_{yy}(e^{j\omega})$ can be modeled as the output of a linear time-invariant system with a white-noise input:
$$\Phi_{yy}(e^{j\omega}) = |H(e^{j\omega})|^2\, \sigma_x^2$$
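For example, the sketch below (an added illustration, not from the notes) shapes white noise with a first-order recursive filter to produce a "colored" signal; the filter coefficient 0.9 is an assumed choice, and the zero-lag power is checked against the known AR(1) value $\sigma_x^2 / (1 - a^2)$.

```python
import numpy as np

rng = np.random.default_rng(7)

# Shape white noise with an LTI filter so the output PSD is
# |H(e^{jw})|^2 * sigma_x^2.
sigma_x = 1.0
x = sigma_x * rng.standard_normal(300_000)   # white-noise input

# y[n] = x[n] + 0.9 * y[n-1]  =>  H(e^{jw}) = 1 / (1 - 0.9 e^{-jw})
a = 0.9
y = np.empty_like(x)
prev = 0.0
for n in range(len(x)):                      # simple IIR recursion
    prev = x[n] + a * prev
    y[n] = prev

# Check zero-lag power against theory: phi_yy[0] = sigma^2 / (1 - a^2)
print(np.mean(y[1000:] ** 2), sigma_x**2 / (1 - a**2))
```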
Cross-correlation
The cross-correlation between the input and output of an LTI system:
$$\phi_{xy}[m] = \mathcal{E}\{x[n]\, y[n+m]\} = \mathcal{E}\Big\{x[n] \sum_{k=-\infty}^{\infty} h[k]\, x[n+m-k]\Big\} = \sum_{k=-\infty}^{\infty} h[k]\, \phi_{xx}[m-k]$$
That is, the cross-correlation between the input and the output is the convolution of the impulse response with the input autocorrelation sequence.
Cross-correlation (continued)
Taking the Fourier transform of both sides of the above equation, we have
$$\Phi_{xy}(e^{j\omega}) = H(e^{j\omega})\, \Phi_{xx}(e^{j\omega})$$
This result has a useful application when the input is white noise with variance $\sigma_x^2$. In that case $\phi_{xx}[m] = \sigma_x^2\, \delta[m]$, so
$$\phi_{xy}[m] = \sigma_x^2\, h[m], \qquad \Phi_{xy}(e^{j\omega}) = \sigma_x^2\, H(e^{j\omega})$$
These equations serve as the basis for estimating the impulse response or frequency response of an LTI system, provided it is possible to observe the output of the system in response to a white-noise input.
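A minimal sketch of that identification procedure (an added illustration, not from the notes): probe an "unknown" system with white noise and recover its impulse response from the cross-correlation via $h[m] = \phi_{xy}[m] / \sigma_x^2$. The four-tap filter playing the role of the unknown system is an assumed example.

```python
import numpy as np

rng = np.random.default_rng(8)

# Estimate an unknown impulse response from a white-noise probe,
# using phi_xy[m] = sigma_x^2 * h[m].
h_true = np.array([0.2, 1.0, -0.4, 0.1])   # the "unknown" system

sigma_x = 1.0
L = 500_000
x = sigma_x * rng.standard_normal(L)       # white-noise input
y = np.convolve(x, h_true)[:L]             # observed output

# Cross-correlation estimate phi_xy[m] = <x[n] y[n+m]> for m >= 0
M = 6
phi_xy = np.array([np.mean(x[:L - m] * y[m:]) for m in range(M)])

h_est = phi_xy / sigma_x**2                # h[m] = phi_xy[m] / sigma_x^2
print(np.round(h_est, 3))                  # ~ [0.2, 1.0, -0.4, 0.1, 0, 0]
```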