
Principles of Communications

Meixia Tao

Dept. of Electronic Engineering


Shanghai Jiao Tong University
Chapter 2: Signal, Random Process, and Spectra
Selected from Chapter 2.1-2.6, Chapter 5.1-5.3 of Fundamentals
of Communications Systems, Pearson Prentice Hall 2005, by
Proakis & Salehi
Signal and Noise in
Communication Systems

Information Source → Transmitter → Channel → Receiver → Output Signal
(uncertain)                            ↑
                                     Noise

Meixia Tao @ SJTU 2


Topics to be Covered

2.1 Signals

2.2 Review of probability and random variables

2.3 Random Processes: basic concepts

2.4 Gaussian and White Processes


What is a Signal?
 In communication systems, a signal is any function that
carries information; it is also called an information-bearing signal


Classification of Signals (1/4)
 Continuous-time signal vs. discrete-time signal

 Continuous-valued signal vs. discrete-valued signal

o Continuous-time and continuous-valued: analog signal
o Discrete-time and discrete-valued: digital signal
o Discrete-time and continuous-valued: sampled signal
o Continuous-time and discrete-valued: quantized signal


Classification of Signals (2/4)



Classification of Signals (3/4)
 Deterministic signal vs. random signal



Classification of Signals (4/4)
 Energy signal vs. power signal

 Energy:  Ex = ∫_{−∞}^{∞} |x(t)|² dt = lim_{T→∞} ∫_{−T/2}^{T/2} |x(t)|² dt

 Power:  Px = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} |x(t)|² dt

 A signal is an energy signal iff Ex is finite and nonzero

 A signal is a power signal iff Px is finite and nonzero


Fourier Transform



Topics to be Covered

2.1 Signals

2.2 Review of probability and random variables

2.3 Random Processes: basic concepts

2.4 Gaussian and White Processes


Example 1
 Consider a binary communication system

P(0) = 0.3    P01 = P(receive 1 | sent 0) = 0.01
              P00 = P(receive 0 | sent 0) = 1 − P01 = 0.99

P(1) = 0.7    P10 = P(receive 0 | sent 1) = 0.1
              P11 = P(receive 1 | sent 1) = 1 − P10 = 0.9

 What is the probability that the output of this channel is 1?

 Assuming that we have observed a 1 at the output, what is
the probability that the input to the channel was a 1?
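Both questions can be answered numerically with the law of total probability and Bayes' rule; a quick sketch using only the channel parameters given above:

```python
# Binary channel of Example 1: priors P(0) = 0.3, P(1) = 0.7,
# crossover probabilities P01 = 0.01 and P10 = 0.1.
p0, p1 = 0.3, 0.7
p01, p10 = 0.01, 0.1   # P(receive 1 | sent 0), P(receive 0 | sent 1)

# Law of total probability: P(receive 1) = P(0)*P01 + P(1)*P11
p_rx1 = p0 * p01 + p1 * (1 - p10)

# Bayes' rule: P(sent 1 | receive 1) = P(1)*P11 / P(receive 1)
p_tx1_given_rx1 = p1 * (1 - p10) / p_rx1

print(p_rx1)            # ≈ 0.633
print(p_tx1_given_rx1)  # ≈ 0.995
```

The high posterior reflects that almost all received 1s come from transmitted 1s.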



Conditional Probability

 Consider two events A and B

 Conditional probability: P(A|B) = P(AB)/P(B), for P(B) > 0
 Joint probability:

P(AB) = P(B) P(A|B) = P(A) P(B|A)

 A and B are said to be statistically independent iff
P(AB) = P(A) P(B), i.e. P(A|B) = P(A)


Law of Total Probability
 Let A1, A2, …, An be mutually exclusive events
with A1 ∪ A2 ∪ … ∪ An = S (the whole sample space)

 For any event B, we have

P(B) = Σᵢ P(Aᵢ) P(B|Aᵢ)


Bayes’ Theorem
 Let A1, A2, …, An be mutually exclusive events such
that A1 ∪ … ∪ An = S, and B is an arbitrary event with nonzero
probability. Then

P(Aᵢ|B) = P(Aᵢ) P(B|Aᵢ) / Σⱼ P(Aⱼ) P(B|Aⱼ)

 Apply it to Example 1 (the binary channel)


Random Variables (r.v.)
 A r.v. is a mapping from the sample space S to the set of
real numbers.

 A r.v. may be
 Discrete-valued: range is finite (e.g. {0,1}) or countably
infinite (e.g. {1,2,3, …})
 Continuous-valued: range is uncountably infinite (e.g. an
interval of the real line)


 The cumulative distribution function (CDF) of a r.v. X is

FX(x) = P(X ≤ x)

 Key properties of CDF

1. 0 ≤ FX(x) ≤ 1, with FX(−∞) = 0 and FX(+∞) = 1
2. FX(x) is a non-decreasing function of x
3. P(x1 < X ≤ x2) = FX(x2) − FX(x1)
Probability Density Function (PDF)
 The PDF of a r.v. X is defined as

fX(x) = d/dx FX(x)   or   FX(x) = ∫_{−∞}^{x} fX(y) dy

 Key properties of PDF

1. fX(x) ≥ 0
2. ∫_{−∞}^{∞} fX(x) dx = 1
3. P(x1 < X ≤ x2) = FX(x2) − FX(x1) = ∫_{x1}^{x2} fX(x) dx

[Figure: FX(x) rising from 0 to 1, and fX(x) with the shaded area
between x1 and x2 equal to P(x1 < X ≤ x2); fX(x)dx is the area of a
strip of width dx.]
Some useful distribution:
Bernoulli Distribution
 A discrete r.v. taking two possible values, X = 1 or X = 0,
with probability mass function (pmf)

P(X = 1) = p,   P(X = 0) = 1 − p

 Often used to model binary data


Some useful distribution:
Binomial Distribution
 A discrete r.v. formed as the sum of n independent Bernoulli
r.v.s, i.e.

Y = X1 + X2 + … + Xn,   where Xi ~ Bernoulli(p)

 The pmf is given by

P(Y = k) = C(n, k) pᵏ (1 − p)ⁿ⁻ᵏ,   k = 0, 1, …, n

where C(n, k) = n! / (k!(n − k)!)

 That is, the probability that Y = k is the probability that k of
the Xi are equal to 1 and n − k are equal to 0
Example
 Suppose that we transmit a 31-bit long sequence with
error correction capability up to 3 bit errors
 If the probability of a bit error is p = 0.001, what is the
probability that this sequence is received in error?
The sequence is wrongly decoded only if more than 3 errors occur:

P(error) = Σ_{k=4}^{31} C(31, k) pᵏ (1 − p)³¹⁻ᵏ

 If no error correction is used, the error probability is

P(error) = 1 − (1 − p)³¹


Example
 Assume 10,000 bits are transmitted over a channel in
which the error probability is p = 0.001
 What is the probability that the total number of errors is
less than 3?
 Solution:
 n = 10,000, p = 0.001

P(fewer than 3 errors) = Σ_{k=0}^{2} C(n, k) pᵏ (1 − p)ⁿ⁻ᵏ


Some useful distribution:
Uniform Distribution
 A continuous r.v. taking values between a and b with equal
probability
 The probability density function (pdf) is given by

fX(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise

 The random phase of a sinusoid is often modeled as a
uniform r.v. between 0 and 2π


Statistical Averages
 Consider a discrete r.v. which takes on the possible values
x1, x2, …, xM with respective probabilities P1, P2, …, PM.
 The mean (均值) or expected value (期望值) of X is

mX = E[X] = Σ_{i=1}^{M} xi Pi

 If X is continuous, then

mX = E[X] = ∫_{−∞}^{∞} x fX(x) dx

 This is the first moment (一阶矩) of X.


 The nth moment of X

E[Xⁿ] = ∫_{−∞}^{∞} xⁿ fX(x) dx

 Let n = 2; we have the mean-square value (均方值) of X

E[X²] = ∫_{−∞}^{∞} x² fX(x) dx


 The nth central moment is

E[(X − mX)ⁿ] = ∫_{−∞}^{∞} (x − mX)ⁿ fX(x) dx

 At n = 2, we have the variance (方差)

σX² = E[(X − mX)²]
    = E[X² − 2mX X + mX²]
    = E[X²] − mX²

 σX is called the standard deviation (标准偏差)

 It is the average distance from the mean, a measure of the
concentration of X around the mean


Some useful distribution:
Gaussian Distribution
 The Gaussian or normal distribution (正态分布) is a continuous
r.v. with pdf

fX(x) = (1/√(2πσX²)) exp(−(x − mX)²/(2σX²))

[Figure: bell-shaped fX(x) centered at mX.]

 A Gaussian r.v. is completely determined by its mean and
variance, and hence is usually denoted as

X ~ N(mX, σX²)

 By far the most important distribution in communications
The Q-Function
 The Q-function is a standard form for expressing error
probabilities that have no closed form

Q(x) = ∫_x^∞ (1/√(2π)) exp(−u²/2) du

 The Q-function is the area under the tail of a Gaussian pdf
with mean 0 and variance 1

 Extremely important in error probability analysis!!!


More about Q-Function
 Q-function is monotonically decreasing
 Some features:

Q(0) = 1/2,   Q(−x) = 1 − Q(x),   Q(∞) = 0

 Craig's alternative form of the Q-function (IEEE MILCOM'91):

Q(x) = (1/π) ∫_0^{π/2} exp(−x²/(2sin²θ)) dθ,   x ≥ 0

 Upper bound:

Q(x) ≤ (1/2) exp(−x²/2),   x ≥ 0

 If we have a Gaussian variable X ~ N(μ, σ²), then

Pr(X > x) = Q((x − μ)/σ)
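Python's standard library has no Q-function, but it can be built from the complementary error function via the common identity Q(x) = erfc(x/√2)/2 (an identity, not something specific to these slides):

```python
from math import erfc, sqrt

def Q(x: float) -> float:
    """Gaussian tail probability Q(x) = P(N(0,1) > x), via erfc."""
    return 0.5 * erfc(x / sqrt(2.0))

# Features: Q(0) = 1/2 and Q(-x) = 1 - Q(x)
print(Q(0.0))            # 0.5
print(Q(-1.0) + Q(1.0))  # 1.0

# Tail of a general Gaussian X ~ N(mu, sigma^2): P(X > x) = Q((x - mu)/sigma)
mu, sigma = 1.0, 2.0
print(Q((3.0 - mu) / sigma))  # Q(1) ≈ 0.1587
```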
Joint Distribution
 Consider two r.v.s X and Y; the joint distribution function is
defined as

FXY(x, y) = P(X ≤ x, Y ≤ y)

and the joint PDF is

fXY(x, y) = ∂²FXY(x, y)/∂x∂y

 Key properties of the joint distribution:

∫_{−∞}^{∞} ∫_{−∞}^{∞} fXY(x, y) dx dy = 1

P(x1 < X ≤ x2, y1 < Y ≤ y2) = ∫_{y1}^{y2} ∫_{x1}^{x2} fXY(x, y) dx dy


 Marginal distribution

FX(x) = P(X ≤ x, −∞ < Y < ∞) = ∫_{−∞}^{x} ∫_{−∞}^{∞} fXY(α, β) dβ dα

FY(y) = ∫_{−∞}^{y} ∫_{−∞}^{∞} fXY(α, β) dα dβ

 Marginal density

fX(x) = ∫_{−∞}^{∞} fXY(x, β) dβ

 X and Y are said to be independent iff

FXY(x, y) = FX(x) FY(y)
fXY(x, y) = fX(x) fY(y)


Correlation (1/2)
 Correlation (相关) of the two r.v.s X and Y is defined as

RXY = E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy fXY(x, y) dx dy

 Correlation of the two centered r.v.s X − E[X] and Y − E[Y] is
called the covariance (协方差) of X and Y:

Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y]

 If Cov(X, Y) = 0, i.e. E[XY] = E[X]E[Y], then X and Y are called
uncorrelated.
Correlation (2/2)
 The covariance of X and Y normalized w.r.t. σX σY is
referred to as the correlation coefficient of X and Y:

ρXY = Cov(X, Y) / (σX σY)

 If X and Y are independent, then they are uncorrelated.

 The converse is not true (except in the Gaussian case)
Joint Gaussian Random Variables
 X1, X2, …, Xn are jointly Gaussian iff their joint pdf is

f(x) = (2π)^{−n/2} |C|^{−1/2} exp(−(1/2)(x − m)ᵀ C⁻¹ (x − m))

 x is a column vector
 m is the vector of the means
 C is the covariance matrix


Two-Variate Gaussian PDF
 Given two r.v.s X1 and X2 that are jointly Gaussian with means
m1, m2, variances σ1², σ2², and correlation coefficient ρ

 Then

f(x1, x2) = 1/(2πσ1σ2√(1 − ρ²)) ·
  exp{ −1/(2(1 − ρ²)) [ (x1 − m1)²/σ1² − 2ρ(x1 − m1)(x2 − m2)/(σ1σ2) + (x2 − m2)²/σ2² ] }


 For uncorrelated X1 and X2, i.e. ρ = 0, the joint pdf factors as
f(x1, x2) = f(x1) f(x2), so X1 and X2 are also independent

If X1 and X2 are Gaussian and uncorrelated,
then they are independent.
Some Properties of Jointly Gaussian r.v.s

 If n random variables are jointly Gaussian,
any subset of them is also jointly Gaussian. In particular, all
individual r.v.s are Gaussian
 Jointly Gaussian r.v.s are completely characterized by the
mean vector and the covariance matrix, i.e. the second-
order properties
 Any linear combination of X1, X2, …, Xn is a Gaussian
r.v.


Law of Large Numbers
 Consider a sequence of r.v.s X1, X2, …, Xn
 Let Y = (1/n) Σ_{i=1}^{n} Xi

 If the Xi's are uncorrelated with the same mean mX and
variance σ² < ∞

 Then Y → mX (in probability) as n → ∞:

the average converges to the expected value
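A quick simulation of this statement; a sketch using uniform(0, 1) draws, whose true mean is 0.5:

```python
import random

random.seed(1)

def sample_mean(n: int) -> float:
    """Average of n i.i.d. uniform(0,1) draws; E[X] = 0.5."""
    return sum(random.random() for _ in range(n)) / n

# The deviation of the sample mean from the true mean shrinks as n grows.
for n in (10, 1000, 100000):
    print(n, abs(sample_mean(n) - 0.5))
```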



Central Limit Theorem
 If X1, X2, …, Xn are i.i.d random variables with common
mean mX and common variance σX²
 Then, as n → ∞,

Z = (1/√n) Σ_{i=1}^{n} (Xi − mX)/σX converges to a N(0, 1) r.v.:

the sum of many i.i.d random variables converges to
a Gaussian random variable

 Thermal noise results from the random movement of many
electrons; it is well modeled by a Gaussian distribution.
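A small Monte Carlo illustration; a sketch in which sums of uniform r.v.s, normalized to unit variance, should land within ±1 about 68.3% of the time, just as a standard normal does:

```python
import random

random.seed(2)

def z(n: int) -> float:
    """Normalized sum of n i.i.d. uniform(-1,1) r.v.s (mean 0, var 1/3)."""
    s = sum(random.uniform(-1.0, 1.0) for _ in range(n))
    return s / (n / 3.0) ** 0.5   # divide by sqrt(n * variance)

samples = [z(50) for _ in range(20000)]

# For N(0,1), P(|Z| < 1) ≈ 0.683; the normalized sums come close.
frac = sum(abs(v) < 1.0 for v in samples) / len(samples)
print(frac)   # ≈ 0.68
```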



Example

Some useful distribution:
Rayleigh Distribution

 The pdf is

f(r) = (r/σ²) exp(−r²/(2σ²)),   r ≥ 0

[Figure: Rayleigh pdf, peaking near r = σ and decaying for larger r.]

 Rayleigh distributions are frequently used to model fading


for non-line of sight (NLOS) signal transmission
 Very important for mobile and wireless communications



Exercise
 Let h = a + jb, where a and b are i.i.d Gaussian random
variables with mean 0 and variance σ²
 Show that the magnitude of h follows a Rayleigh distribution

and its phase follows a uniform distribution over [0, 2π)
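The exercise can be checked by simulation (a sketch; the sample size and tolerances are arbitrary choices): for a Rayleigh magnitude, E[|h|] = σ√(π/2), and the phase should be symmetric and uniform over (−π, π].

```python
import math, random

random.seed(3)
sigma = 1.0
N = 50000

mags, phases = [], []
for _ in range(N):
    a = random.gauss(0.0, sigma)   # real part
    b = random.gauss(0.0, sigma)   # imaginary part
    mags.append(math.hypot(a, b))  # |h| = sqrt(a^2 + b^2)
    phases.append(math.atan2(b, a))

mean_mag = sum(mags) / N
mean_phase = sum(phases) / N

print(mean_mag, sigma * math.sqrt(math.pi / 2))  # both ≈ 1.25
print(mean_phase)                                # ≈ 0 (symmetric phase)
```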



Topics to be Covered

2.1 Signals

2.2 Review of probability and random variables

2.3 Random Processes: basic concepts

2.4 Gaussian and White Processes


Random Process
 A random process (stochastic process, or random signal)
is the evolution of random variables over time



Description of Random Process
 X(t): random process
 x(t): a sample function of the random process
 X(t1), X(t2): values of the random process at t1, t2

[Figure: sample functions x1(t), x2(t), …, xn(t), one per outcome in the
sample space S; X(t1) and X(t2) are the ensemble values at times t1 and t2.]


Example
 Uniformly choose a phase θ between 0 and 2π, and generate a
sinusoid with a fixed amplitude and frequency but with the
random phase θ.
 In this case, the random process is X(t) = A cos(2πf0t + θ)


Statistics of Random Processes
 An infinite collection of random variables, one for each time t

 Joint pdf: fX(x1, …, xn; t1, …, tn)


First Order Statistics
 fX(x; t) = first-order density of X(t)

 Mean:

mX(t0) = E[X(t0)] = ∫_{−∞}^{∞} x fX(x; t0) dx

 Variance:

σX²(t0) = E[(X(t0) − mX(t0))²]


Second-Order Statistics
 fX(x1, x2; t1, t2) = second-order density of X(t)

 Auto-correlation function (correlation within a process):

RX(t1, t2) = E[X(t1)X(t2)]


Example 1
 Consider X(t) = A cos(2πf0t + θ), where θ is uniform in [0, 2π)

 Mean: E[X(t)] = 0

 Auto-correlation: RX(t1, t2) = (A²/2) cos(2πf0(t2 − t1))
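The two results can be verified by averaging over many realizations of the random phase; a Monte Carlo sketch in which the sample times t1, t2 are arbitrary choices:

```python
import math, random

random.seed(4)
A, f0 = 1.0, 1.0
t1, t2 = 0.3, 0.8              # arbitrary sample times; tau = t2 - t1
N = 200000

m = r = 0.0
for _ in range(N):
    theta = random.uniform(0.0, 2.0 * math.pi)   # random phase
    x1 = A * math.cos(2.0 * math.pi * f0 * t1 + theta)
    x2 = A * math.cos(2.0 * math.pi * f0 * t2 + theta)
    m += x1                     # for the ensemble mean at t1
    r += x1 * x2                # for the autocorrelation R(t1, t2)
m /= N
r /= N

r_theory = 0.5 * A**2 * math.cos(2.0 * math.pi * f0 * (t2 - t1))
print(m)             # ≈ 0
print(r, r_theory)   # close agreement
```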



Example 2
 Consider , where

 Find its mean and auto-correlation function



Stationary Processes (平稳过程)
 A stochastic process is said to be stationary if for any
n, any (t1, …, tn), and any Δ:

fX(x1, …, xn; t1, …, tn) = fX(x1, …, xn; t1 + Δ, …, tn + Δ)   (1)

 First-order statistics are then independent of t:

E{X(t)} = ∫_{−∞}^{∞} x fX(x) dx = mX   (2)

 Second-order statistics only depend on the gap t2 − t1:

RX(t1, t2) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 fX(x1, x2; t2 − t1) dx1 dx2
           = RX(t2 − t1) = RX(τ),  where τ = t2 − t1   (3)


Wide-Sense Stationary (广义平稳)
 A random process is said to be WSS when:

E{X(t)} = ∫_{−∞}^{∞} x fX(x) dx = mX   (2)

RX(t1, t2) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 fX(x1, x2; t2 − t1) dx1 dx2
           = RX(t2 − t1) = RX(τ),  where τ = t2 − t1   (3)

 A random process is strictly stationary (严格平稳) when the full
distributional invariance (1) holds


Wide-Sense Stationary
 Example 1: X(t) = A cos(2πf0t + θ), θ uniform in [0, 2π)

From the previous example, mX = 0 and RX(τ) = (A²/2) cos(2πf0τ), so

X(t) is WSS

 Example 2: , where
Is Y(t) WSS?



Averages and Ergodic
 Ensemble averaging

X̄(t) = E[X(t)] = ∫_{−∞}^{∞} x p(x; t) dx

RX(t1, t2) = E[X(t1)X(t2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 p(x1, x2; t1, t2) dx1 dx2

 Time averaging

<X(t)> ≜ lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) dt

<X(t)X(t − τ)> ≜ lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) x(t − τ) dt

 If ensemble average = time average, X(t) is said to be ergodic
(各态历经)

So what?
Random process ⊇ wide-sense stationary ⊇ strictly stationary ⊇ ergodic

 Applications
 Signal: WSS
 Noise: (strictly) stationary
 Time-varying channel: ergodic => ergodic capacity
Frequency Domain Characteristics of
Random Process

Power Spectral Density (功率谱密度): how the power of a random
process is distributed over frequency


Power Spectral Density
 Given a deterministic signal x(t)
 Power: P = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} |x(t)|² dt
 Truncate x(t) to get an energy signal:

xT(t) = x(t) for |t| ≤ T/2, and 0 otherwise

 According to Parseval's theorem,

∫_{−∞}^{∞} |xT(t)|² dt = ∫_{−∞}^{∞} |XT(f)|² df

 Thus,

P = lim_{T→∞} (1/T) ∫_{−∞}^{∞} |XT(f)|² df

so Sx(f) = lim_{T→∞} |XT(f)|²/T is the power spectral density of x(t)
PSD of Random Process
 Consider x(t) as a sample function of a random process X(t)
 Average power of X(t):

P = lim_{T→∞} (1/T) E[ ∫_{−T/2}^{T/2} X(t)² dt ]

 Power spectral density of X(t):

SX(f) = lim_{T→∞} E[ |XT(f)|² ] / T


Exercise
 Given a binary semi-random signal X(t) = Σn an g(t − nTb)

 g(t) is a rectangular pulse shaping function with width Tb

 an is a random variable that takes +1 or −1 with equal probability,
and it is independent for different n
 Sample functions of X(t) are ±1 rectangular waveforms with
symbol duration Tb

[Figure: two sample functions x(t, ei) and x(t, ei+1).]

 Find its power spectral density


PSD of WSS Process
 Wiener-Khinchin theorem


SX(f) ↔ RX(τ):

SX(f) = ∫_{−∞}^{∞} RX(τ) exp(−j2πfτ) dτ

RX(τ) = ∫_{−∞}^{∞} SX(f) exp(j2πfτ) df

 Property:

RX(0) = ∫_{−∞}^{∞} SX(f) df = total power
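A discrete-time sanity check of the total-power property (a sketch: for a finite sample, Parseval's relation makes the time-domain power equal the average of the periodogram over the DFT bins, the discrete analogue of RX(0) = ∫SX(f)df):

```python
import cmath, random

random.seed(5)
N = 256
x = [random.gauss(0.0, 1.0) for _ in range(N)]   # one noise sample path

def dft(seq):
    """Plain O(N^2) discrete Fourier transform."""
    n = len(seq)
    return [sum(seq[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]

X = dft(x)
power_time = sum(v * v for v in x) / N           # (1/N) sum x[n]^2
power_freq = sum(abs(v) ** 2 for v in X) / N**2  # average periodogram bin

print(power_time, power_freq)   # equal up to rounding error
```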



Example
 For the random process X(t) = A cos(2πf0t + θ)
 We had RX(τ) = (A²/2) cos(2πf0τ)

 Hence SX(f) = (A²/4)[δ(f − f0) + δ(f + f0)]


Exercise
 Given a binary random signal X(t) = Σn an g(t − nTb − τ)

 g(t) is a rectangular pulse shaping function with width Tb

 an is a random variable that takes +1 or −1 with equal probability,
and it is independent for different n
 τ is a random time delay uniformly distributed within [0, Tb)
 A typical sample function of X(t) is a ±1 rectangular waveform
shifted by the random delay τ

[Figure: a sample function x(t) with symbol duration Tb and random
offset τ.]


 Find the autocorrelation function

 Find its power spectral density



Random Process Transmission Through
Linear Systems
 Consider a linear system (channel) with impulse response h(t):

X(t) → h(t) → Y(t)

Y(t) = X(t) * h(t) = ∫_{−∞}^{∞} h(τ) X(t − τ) dτ

 Mean of the output Y(t):

mY(t) = E[Y(t)] = ∫_{−∞}^{∞} h(τ) E[X(t − τ)] dτ

If X(t) is WSS: mY = mX ∫_{−∞}^{∞} h(τ) dτ = mX H(0)


Random Process Transmission Through
Linear Systems
 Autocorrelation of Y(t):

RY(t1, t2) = ∫∫ h(α) h(β) RX(t1 − α, t2 − β) dα dβ

If X(t) is WSS: RY(τ) = RX(τ) * h(τ) * h(−τ)

If the input is a WSS process, the output is also a
WSS process


Relation Among the Input-Output PSD

 Autocorrelation of Y(t): RY(τ) = RX(τ) * h(τ) * h(−τ)

 PSD of Y(t):

SY(f) = |H(f)|² SX(f)
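A numeric illustration of this input-output PSD relation (a sketch with an arbitrary 3-tap FIR filter): white input noise has a flat PSD SX(f) = σ², so the output power ∫|H(f)|² SX(f) df reduces to σ² Σ h² by Parseval, which the time-average output power should match.

```python
import random

random.seed(6)

h = [0.5, 0.3, 0.2]     # example FIR taps (arbitrary choice)
sigma2 = 1.0            # input noise power; flat PSD S_X(f) = sigma2
N = 200000

x = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(N)]
# Discrete convolution y[n] = sum_k h[k] * x[n-k]
y = [sum(h[k] * x[n - k] for k in range(len(h))) for n in range(len(h), N)]

power_pred = sigma2 * sum(c * c for c in h)   # integral of |H|^2 S_X
power_meas = sum(v * v for v in y) / len(y)

print(power_meas, power_pred)   # close agreement
```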



Deterministic vs. Random

Deterministic signal → h(t):   Y(f) = H(f) X(f)

WSS random signal → h(t):   SY(f) = |H(f)|² SX(f)


Example

 A differentiator has H(f) = j2πf

 Let the random process X(t) pass through the above
system; then

SY(f) = |j2πf|² SX(f) = (2πf)² SX(f)


Topics to be Covered

2.1 Signals

2.2 Review of probability and random variables

2.3 Random Processes: basic concepts

2.4 Gaussian and White Processes


Gaussian Process
 Definition:
 X(t) is a Gaussian process if, for all n and all (t1, t2, …, tn),
the sample values X(t1), X(t2), …, X(tn) have a joint
Gaussian density function

 Properties:
 If it is wide-sense stationary, it is also strictly stationary
 If the input to a linear system is a Gaussian process, the
output is also a Gaussian process


Noise
 Often modeled as Gaussian and stationary with zero mean

 White noise:

Sn(f) = N0/2      Rn(τ) = (N0/2) δ(τ)

White noise is completely uncorrelated at distinct times!

N0 = kT ≈ 4.14 × 10⁻²¹ W/Hz at T = 300 K, i.e. about −174 dBm/Hz
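The quoted numbers follow directly from N0 = kT; a quick check, where T = 300 K is the assumption that matches the 4.14 × 10⁻²¹ figure:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # assumed temperature in kelvin

N0 = k * T                                  # noise density in W/Hz
N0_dBm_per_Hz = 10 * math.log10(N0 / 1e-3)  # convert W/Hz -> dBm/Hz

print(N0)              # ≈ 4.14e-21 W/Hz
print(N0_dBm_per_Hz)   # ≈ -173.8, usually rounded to -174 dBm/Hz
```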
Bandlimited Noise
White noise → filter of bandwidth B Hz → bandlimited white noise n(t)

Q1. At what rate can we sample the noise to
get uncorrelated realizations?

Q2. What is the power of each sample?


Band-pass Noise

White noise → bandpass filter → band-pass noise n(t)

 Canonical form of a band-pass noise process:

n(t) = nc(t) cos(2πf0t) − ns(t) sin(2πf0t)

where nc(t) is the in-phase component, ns(t) is the quadrature
component, and both are low-pass noise processes


Properties
 Let n(t) be a zero-mean, stationary and Gaussian noise
 Then nc(t) and ns(t) satisfy the following properties
 Property 1:
 nc(t) and ns(t) are zero-mean, jointly stationary and jointly Gaussian
processes


 Property 2:

Snc(f) = Sns(f) = Sn(f − f0) + Sn(f + f0) for |f| ≤ B/2, and 0 otherwise

 Proof sketch: multiply n(t) by 2cos(ω0t) to get z1(t) and by −2sin(ω0t)
to get z2(t); pass each through an ideal low-pass filter HL(f) of
bandwidth B/2 to obtain nc(t) and ns(t), respectively.


Envelope and Phase
 Angular representation of n(t):

n(t) = R(t) cos[ω0t + φ(t)]

where

R(t) = √(nc²(t) + ns²(t))   envelope

φ(t) = tan⁻¹[ns(t)/nc(t)],  0 ≤ φ(t) ≤ 2π   phase

[Figure: a band-pass noise waveform n(t) of bandwidth B around
carrier frequency f0, together with its slowly varying envelope R(t).]


 Let n(t) be a zero-mean, stationary Gaussian process;
find the statistics of the envelope and phase
 Result:
 The envelope follows a Rayleigh distribution while the phase
follows a uniform distribution:

f(R) = ∫_0^{2π} f(R, φ) dφ = (R/σ²) exp(−R²/(2σ²)),   R ≥ 0

f(φ) = ∫_0^∞ f(R, φ) dR = 1/(2π),   0 ≤ φ ≤ 2π

Proof?

 For the same t, the envelope variable R and the phase
variable φ are independent (but not the two processes)


Summary
 For WSS: SX(f) ↔ RX(τ)

 WSS transmission through a linear system h(t):

SY(f) = |H(f)|² SX(f)

 Gaussian, stationary, and zero-mean noise

 Non-Gaussian noise?
 Snapping shrimp noise in shallow water acoustic communication


Suggested Reading & Homework
 Chapter 2.1-2.6, Chapter 5.1-5.3 of Fundamentals
of Communications Systems, Pearson Prentice
Hall 2005, by Proakis & Salehi

 Homework #1:
 Textbook Ch2: 2.5, 2.9, 2.13
 Textbook Ch5: 5.5, 5.10 (Hint: Q-fun), 5.22, 5.28,
5.40, 5.58
