Comm ch02 Random en PDF
Meixia Tao
[Block diagram: Information Source → Transmitter → Channel → Receiver → Output Signal; the source output is uncertain, and the channel adds Noise]
2.1 Signals
P( AB ) = P( B ) P( A | B ) = P( A) P( B | A)
A r.v. may be
Discrete-valued: range is finite (e.g. {0,1}) or countably infinite (e.g. {1,2,3,…})
Continuous-valued: range is uncountably infinite (e.g. an interval of the real line)
Probability Density Function (PDF)
The PDF of a r.v. X is defined as
$$f_X(x) \triangleq \frac{d}{dx} F_X(x) \quad\text{or}\quad F_X(x) = \int_{-\infty}^{x} f_X(y)\,dy$$
[Figure: the CDF $F_X(x)$ (rising to 1) and the PDF $f_X(x)$; the shaded area $\int_{x_1}^{x_2} f_X(x)\,dx$ under the PDF equals $P(x_1 < X \le x_2)$]
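As a numerical sanity check of the PDF–CDF relation above, the sketch below integrates a PDF over $[x_1, x_2]$ and compares the area with $F(x_2) - F(x_1)$; the exponential r.v. and the rate $\lambda = 2$ are assumed examples, not from the text.

```python
import math

# Assumed example: exponential r.v. with rate lam = 2.
lam = 2.0
f = lambda x: lam * math.exp(-lam * x)   # PDF
F = lambda x: 1.0 - math.exp(-lam * x)   # CDF

def area(g, a, b, n=100000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

x1, x2 = 0.5, 1.5
print(area(f, x1, x2))    # area under the PDF between x1 and x2
print(F(x2) - F(x1))      # matches: P(x1 < X <= x2)
```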
Example
Suppose that we transmit a 31-bit long sequence with error correction capability of up to 3 bit errors.
If the probability of a bit error is p = 0.001, what is the probability that this sequence is received in error?
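The sequence is received in error only when more than 3 of the 31 bits are flipped, i.e. a binomial tail probability. A minimal sketch (function name is illustrative):

```python
from math import comb

# 31-bit codeword, corrects up to t = 3 bit errors, bit error prob p.
n, t, p = 31, 3, 0.001

def block_error_prob(n, t, p):
    """P(more than t errors) = 1 - sum_{k=0}^{t} C(n,k) p^k (1-p)^(n-k)."""
    return 1.0 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t + 1))

print(block_error_prob(n, t, p))   # ≈ 3.1e-8
```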
If X is continuous, then the mean is
$$m_X = E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$$
The n-th moment:
$$E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$$
The second moment (mean-square value):
$$E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx$$
The n-th central moment:
$$E[(X - m_X)^n] = \int_{-\infty}^{\infty} (x - m_X)^n f_X(x)\,dx$$
Variance:
$$\sigma_X^2 = E[(X - m_X)^2] = E[X^2 - 2 m_X X + m_X^2] = E[X^2] - m_X^2$$
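The identity $\sigma_X^2 = E[X^2] - m_X^2$ can be checked numerically by replacing expectations with sample averages; the Gaussian samples and parameters below are assumed for illustration.

```python
import random

# Assumed example: Gaussian samples with mean 2 and std 3.
random.seed(0)
xs = [random.gauss(2.0, 3.0) for _ in range(200000)]

m = sum(xs) / len(xs)                           # estimate of E[X]
m2 = sum(x * x for x in xs) / len(xs)           # estimate of E[X^2]
var = sum((x - m) ** 2 for x in xs) / len(xs)   # estimate of E[(X - m)^2]

print(var)         # ≈ 9 (= 3^2)
print(m2 - m * m)  # same value: sigma^2 = E[X^2] - m^2
```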
The joint PDF is
$$f_{XY}(x,y) = \frac{\partial^2 F_{XY}(x,y)}{\partial x\,\partial y}$$
$$F_Y(y) = \int_{-\infty}^{y}\int_{-\infty}^{\infty} f_{XY}(\alpha, \beta)\,d\alpha\,d\beta$$
Marginal density:
$$f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, \beta)\,d\beta$$
Correlation:
$$R_{XY} = E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x y\, f_{XY}(x,y)\,dx\,dy$$
Joint Gaussian Random Variables
$X_1, X_2, \ldots, X_n$ are jointly Gaussian iff their joint PDF is
$$f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2}\,|\mathbf{C}|^{1/2}} \exp\!\left(-\tfrac{1}{2}(\mathbf{x}-\mathbf{m})^T \mathbf{C}^{-1} (\mathbf{x}-\mathbf{m})\right)$$
where
x is a column vector of the random variables
m is the vector of the means
C is the covariance matrix
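The jointly Gaussian PDF can be evaluated directly from the mean vector and covariance matrix. The sketch below does this for $n = 2$ in plain Python; the particular m and C are assumed examples.

```python
import math

# Assumed example mean vector and (symmetric, positive-definite) covariance.
m = (1.0, -1.0)
C = ((2.0, 0.6), (0.6, 1.0))

def gauss2_pdf(x, m, C):
    """f(x) = exp(-0.5 (x-m)^T C^{-1} (x-m)) / (2*pi*sqrt(|C|)), n = 2."""
    det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
    inv = ((C[1][1] / det, -C[0][1] / det),
           (-C[1][0] / det, C[0][0] / det))
    d0, d1 = x[0] - m[0], x[1] - m[1]
    quad = (d0 * (inv[0][0] * d0 + inv[0][1] * d1)
            + d1 * (inv[1][0] * d0 + inv[1][1] * d1))
    return math.exp(-0.5 * quad) / (2 * math.pi * math.sqrt(det))

# The PDF peaks at the mean, where the exponent is zero.
print(gauss2_pdf(m, m, C))   # = 1 / (2*pi*sqrt(|C|))
```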
[Figure: an ensemble of sample functions; each outcome of the sample space S (the 1st, 2nd, …, nth experiment) produces a waveform $x_1(t), x_2(t), \ldots, x_n(t)$; sampling the ensemble at times $t_1$ and $t_2$ gives the random variables $X(t_1)$ and $X(t_2)$]
Joint pdf
Mean:
$$E[X(t_0)] = \int_{-\infty}^{\infty} x f_X(x; t_0)\,dx = \overline{X}(t_0)$$
Variance:
$$\sigma_X^2(t_0) = E\big[(X(t_0) - \overline{X}(t_0))^2\big]$$
X(t) is WSS if
(1) Mean: $E[X(t)] = m_X$ is constant, and
(2) Auto-correlation: $R_X(t_1, t_2)$ depends only on $\tau = t_1 - t_2$
Example 2: Y(t) = …, where … Is Y(t) WSS?
$$R_X(t_1, t_2) \triangleq E[X(t_1) X(t_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\, f(x_1, x_2; t_1, t_2)\,dx_1\,dx_2$$
Time averaging:
$$\langle X(t) \rangle \triangleq \lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} x(t)\,dt$$
$$\langle X(t) X(t-\tau) \rangle \triangleq \lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} x(t)\,x(t-\tau)\,dt$$
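For an ergodic process these time averages equal the ensemble averages. The sketch below checks this for a random-phase sinusoid $X(t) = A\cos(2\pi f_0 t + \Theta)$, $\Theta \sim \mathrm{Uniform}[0, 2\pi)$, a standard ergodic example; A, $f_0$, and $\tau$ are assumed values, and the integrals are approximated over a finite window.

```python
import math, random

# Assumed example: random-phase sinusoid, A = 1, f0 = 5 Hz.
random.seed(1)
A, f0 = 1.0, 5.0
theta = random.uniform(0, 2 * math.pi)   # one realization of Theta
x = lambda t: A * math.cos(2 * math.pi * f0 * t + theta)

def time_avg(g, T=200.0, n=400000):
    """Approximate (1/2T) * integral_{-T}^{T} g(t) dt by a midpoint sum."""
    h = 2 * T / n
    return sum(g(-T + (k + 0.5) * h) for k in range(n)) * h / (2 * T)

tau = 0.02
mean_t = time_avg(x)
acorr_t = time_avg(lambda t: x(t) * x(t - tau))
print(mean_t)    # ≈ 0, the ensemble mean
print(acorr_t)   # ≈ (A^2/2) * cos(2*pi*f0*tau), the ensemble autocorrelation
```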
[Diagram: ergodic processes form a subset of strictly stationary processes]
Applications
Signal: WSS
Noise: (strictly) stationary
Time-varying channel: ergodic => ergodic capacity
Frequency Domain Characteristics of
Random Process
[Figure: a sample waveform shown in the time domain and its frequency-domain representation]
$$R_X(\tau) = \int_{-\infty}^{\infty} S_X(f)\, e^{j 2\pi f \tau}\,df$$
$$S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j 2\pi f \tau}\,d\tau$$
i.e. $S_X(f) \leftrightarrow R_X(\tau)$ form a Fourier transform pair.
Property:
$$R_X(0) = \int_{-\infty}^{\infty} S_X(f)\,df = \text{total power}$$
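The discrete-time analogue of this property is Parseval's relation: the average power of a length-N record equals the sum of its periodogram bins. A self-contained sketch with a direct (O(N²)) DFT, using an assumed Gaussian record:

```python
import cmath, math, random

# Assumed example record: N = 256 samples of unit-variance Gaussian noise.
random.seed(2)
N = 256
x = [random.gauss(0, 1) for _ in range(N)]

# Direct DFT: X[k] = sum_n x[n] exp(-j 2 pi k n / N)
X = [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
     for k in range(N)]

power_time = sum(v * v for v in x) / N               # R_X(0) estimate
power_freq = sum(abs(Xk) ** 2 for Xk in X) / N ** 2  # sum of PSD bins

print(power_time, power_freq)   # the two agree (Parseval)
```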
[Figure: a sample function $x(t, e_{i+1})$ of a random binary waveform with bit duration $T_b$, shown over an interval T with lag $\tau$]
If X(t) is WSS
Autocorrelation of Y(t): $R_Y(\tau) = R_X(\tau) * h(\tau) * h(-\tau)$
PSD of Y(t):
$$S_Y(f) = |H(f)|^2\, S_X(f)$$
[Diagram: a deterministic signal and a WSS random signal each passed through a filter h(t); for the WSS input, the output PSD is $S_Y(f) = |H(f)|^2 S_X(f)$]
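One consequence of $S_Y(f) = |H(f)|^2 S_X(f)$: for white input with flat PSD (variance $\sigma^2$ per sample in discrete time) and an FIR filter h, the output power is $\sigma^2 \sum_k h[k]^2$ by Parseval. The sketch below verifies this with an assumed 3-tap filter.

```python
import random

# Assumed example: unit-variance white Gaussian input, 3-tap FIR filter.
random.seed(3)
sigma2 = 1.0
h = [0.25, 0.5, 0.25]
n_samples = 400000
x = [random.gauss(0, sigma2 ** 0.5) for _ in range(n_samples)]

# Convolution: y[n] = sum_k h[k] x[n-k]  (valid region only)
y = [sum(h[k] * x[n - k] for k in range(len(h)))
     for n in range(len(h) - 1, n_samples)]

power_pred = sigma2 * sum(v * v for v in h)   # sigma^2 * sum(h^2) = 0.375
power_meas = sum(v * v for v in y) / len(y)   # measured output power R_Y(0)
print(power_pred, power_meas)                 # agree within sampling error
```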
Differentiator
Properties:
If a Gaussian process is wide-sense stationary, it is also strictly stationary
If the input to a linear system is a Gaussian process, the output is also a Gaussian process
White noise
$$S_n(f) = \frac{N_0}{2} \qquad R_n(\tau) = \frac{N_0}{2}\,\delta(\tau)$$
White noise is completely uncorrelated!
At room temperature,
$$N_0 = kT = 4.14 \times 10^{-21} \text{ W/Hz} \approx -174 \text{ dBm/Hz}$$
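A quick check of these numbers: with Boltzmann's constant $k = 1.38\times10^{-23}$ J/K and T = 300 K (an assumed room temperature), $kT$ gives the two values quoted above.

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0          # assumed room temperature, K
N0 = k * T
N0_dBm_per_Hz = 10 * math.log10(N0 / 1e-3)   # W/Hz -> dBm/Hz

print(N0)              # ≈ 4.14e-21 W/Hz
print(N0_dBm_per_Hz)   # ≈ -173.8, commonly rounded to -174 dBm/Hz
```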
Bandlimited Noise
[Diagram: white noise passed through a filter of bandwidth B Hz yields bandlimited white noise n(t)]
Proof
[Diagram: n(t) is multiplied by $2\cos\omega_0 t$ and lowpass-filtered by $H_L(f)$ (gain 1 over $-B/2 \le f \le B/2$) to give $n_c(t)$; multiplying by $-2\sin\omega_0 t$ and lowpass filtering gives $n_s(t)$]
$$R(t) = \sqrt{n_c^2(t) + n_s^2(t)} \quad \text{(envelope)}$$
$$\phi(t) = \tan^{-1}\frac{n_s(t)}{n_c(t)}, \quad 0 \le \phi(t) \le 2\pi \quad \text{(phase)}$$
[Figure: a sample of narrowband noise n(t); the envelope R(t) varies slowly, on a time scale of about 1/B, while the carrier oscillates on a time scale of about $1/f_0$]
Proof?
$$f(\phi) = \int_0^{\infty} f(R, \phi)\,dR = \frac{1}{2\pi}, \qquad 0 \le \phi \le 2\pi$$
For the same t, the envelope variable R and the phase variable φ are independent (but the two processes are not).
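A quick numerical illustration of these distributions: when $n_c$ and $n_s$ are independent zero-mean Gaussians of equal variance $\sigma^2$, the envelope is Rayleigh with $E[R] = \sigma\sqrt{\pi/2}$ and the phase is uniform on $[0, 2\pi)$. The sample size and $\sigma$ below are assumed.

```python
import math, random

# Assumed example: 200000 pairs (n_c, n_s), each Gaussian(0, sigma^2).
random.seed(4)
sigma = 1.0
n = 200000
samples = [(random.gauss(0, sigma), random.gauss(0, sigma)) for _ in range(n)]

R = [math.hypot(nc, ns) for nc, ns in samples]             # envelope
phi = [math.atan2(ns, nc) % (2 * math.pi) for nc, ns in samples]  # phase in [0, 2*pi)

print(sum(R) / n)    # ≈ sigma*sqrt(pi/2) ≈ 1.2533 (Rayleigh mean)
print(sum(phi) / n)  # ≈ pi (mean of Uniform[0, 2*pi))
```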
What about non-Gaussian noise?
Example: snapping shrimp noise in shallow-water acoustic communication
Homework #1:
Textbook Ch2: 2.5, 2.9, 2.13
Textbook Ch5: 5.5, 5.10 (Hint: Q-fun), 5.22, 5.28,
5.40, 5.58