TELE9754 L1-ProbTheory
Wei Zhang
E-mail: w.zhang@unsw.edu.au
T3 2024
Wei Zhang UNSW TELE9754 Coding & Information Theory - Probability Theory 1 / 33
Outline
Probability Theory
Random Process
Random Signals
Probability and Random Variables
Relative-Frequency Approach
- The relative frequency is a nonnegative real number less than or equal to one:
  0 ≤ n_A/n ≤ 1
- The experiment exhibits statistical regularity if, for any sequence of n trials, the relative frequency converges to a limit as n becomes large:
  P(A) = lim_{n→∞} n_A/n
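The limit above can be illustrated by simulation. The sketch below (my example, not from the slides) estimates P(A) for the event A = "a fair coin shows heads" via the relative frequency n_A/n, which settles toward 0.5 as n grows:

```python
import random

random.seed(0)

def relative_frequency(n):
    """Return n_A / n for n simulated fair-coin tosses, where A = heads."""
    n_A = sum(random.random() < 0.5 for _ in range(n))
    return n_A / n

for n in (100, 10_000, 1_000_000):
    f = relative_frequency(n)
    assert 0.0 <= f <= 1.0   # the bound 0 <= n_A/n <= 1 always holds
    print(n, f)              # tends toward P(A) = 0.5 as n becomes large
```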
Sample Space
Random Variables
Distribution Function
F_X(x) = P[X ≤ x]
F_X(x_1) ≤ F_X(x_2), if x_1 ≤ x_2
Probability Density Function
The probability density function is defined as the derivative of the distribution function:
f_X(x) = (d/dx) F_X(x)
Three basic properties:
1. Since the distribution function is monotone nondecreasing, it follows that the density function is nonnegative for all values of x.
2. The distribution function may be recovered from the density function by integration, as shown by
   F_X(x) = ∫_{−∞}^{x} f_X(s) ds
3. Property 2 implies that the total area under the curve of the density function is unity.
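These properties can be checked numerically. The sketch below (my example, not from the slides) uses the standard Gaussian density and recovers F_X(x) by trapezoidal integration, confirming F_X(0) ≈ 0.5 and total area ≈ 1:

```python
import math

def f(x):
    """Standard Gaussian density f_X(x) = exp(-x^2/2) / sqrt(2*pi)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def F(x, lo=-8.0, n=16000):
    """Property 2: recover F_X(x) by integrating f from (effectively) -inf to x."""
    h = (x - lo) / n
    s = 0.5 * (f(lo) + f(x)) + sum(f(lo + k * h) for k in range(1, n))
    return s * h

print(F(0.0))   # ≈ 0.5, since the Gaussian is symmetric about 0
print(F(8.0))   # ≈ 1.0, property 3: total area under the density is unity
```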
Uniform Distribution
Joint Random Variables
Joint distribution function: F_{X,Y}(x, y)
The joint probability density function is
f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / (∂x ∂y)
Conditional Probability
P[Y|X] denotes the probability of Y given that X has occurred:
P[Y|X] = P[X, Y] / P[X]
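A toy example (mine, not from the slides) makes the formula concrete. With two fair dice, let X = "first die shows 6" and Y = "sum is at least 10"; counting outcomes gives P[Y|X] = P[X, Y]/P[X] exactly:

```python
from fractions import Fraction

# All 36 equally likely outcomes of two fair dice.
outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]

P_X = Fraction(sum(1 for a, b in outcomes if a == 6), 36)
P_XY = Fraction(sum(1 for a, b in outcomes if a == 6 and a + b >= 10), 36)

P_Y_given_X = P_XY / P_X    # P[Y|X] = P[X, Y] / P[X]
print(P_Y_given_X)          # 1/2: given a 6, the sum >= 10 iff b in {4, 5, 6}
```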
Expectation
Variance
Covariance
where
E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f_{X,Y}(x, y) dx dy
If X and Y are independent, then
E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f_X(x) f_Y(y) dx dy = E[X]E[Y]
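The factorization E[XY] = E[X]E[Y] for independent variables can be seen by Monte Carlo. In the sketch below (my example, with X, Y ~ U(0,1) assumed), both sides come out near 0.25:

```python
import random

random.seed(1)
n = 200_000
xs = [random.random() for _ in range(n)]   # X ~ U(0,1)
ys = [random.random() for _ in range(n)]   # Y ~ U(0,1), drawn independently

E_X = sum(xs) / n
E_Y = sum(ys) / n
E_XY = sum(x * y for x, y in zip(xs, ys)) / n

print(E_XY, E_X * E_Y)   # both ≈ 0.25 since E[X] = E[Y] = 1/2
```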
Transformation of Random Variables
For Y = aX + b with a > 0,
F_Y(y) = P[Y < y] = P[aX + b < y] = P[X < (y − b)/a] = F_X((y − b)/a)
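An empirical check (my sketch, with X ~ U(0,1) and a = 2, b = 3 assumed): the observed fraction of samples with Y < y matches F_X((y − b)/a):

```python
import random

random.seed(2)
a, b = 2.0, 3.0
xs = [random.random() for _ in range(100_000)]   # X ~ U(0,1), so F_X(x) = x on [0, 1]
ys = [a * x + b for x in xs]                     # Y = aX + b ranges over [3, 5]

y = 4.0                                          # probe point inside [3, 5]
F_Y_emp = sum(1 for v in ys if v < y) / len(ys)  # empirical P[Y < y]
F_X_at = (y - b) / a                             # F_X((y - b)/a) = 0.5 here
print(F_Y_emp, F_X_at)
```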
Gaussian Random Variables
f_X(x) = (1/√(2πσ²)) e^{−(x−µ)²/(2σ²)}
Gaussian Random Variables
For the special case of a Gaussian random variable with µ = 0 and σ = 1, called the normalized Gaussian RV, the pdf is
f_X(x) = (1/√(2π)) e^{−x²/2}
Its distribution function is
F_X(x) = ∫_{−∞}^{x} f_X(s) ds = (1/√(2π)) ∫_{−∞}^{x} e^{−s²/2} ds
The Q function is the complement of the normalized Gaussian distribution function, given by
Q(x) = 1 − F_X(x) = (1/√(2π)) ∫_{x}^{∞} e^{−s²/2} ds
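In practice Q(x) is evaluated through the complementary error function, since Q(x) = (1/2) erfc(x/√2) follows from the integral above by the substitution s = t√2. A minimal sketch using the standard library:

```python
import math

def Q(x):
    """Gaussian tail probability Q(x) = (1/2) * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

print(Q(0.0))   # 0.5: half the Gaussian mass lies above the mean
print(Q(3.0))   # ≈ 1.35e-3, the familiar "3-sigma" tail probability
```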
Gaussian Distribution
Q Function
The Central Limit Theorem
Suppose
1. The X_k with k = 1, 2, . . . , N are statistically independent.
2. The X_k all have the same probability density function.
3. Both the mean and the variance exist for each X_k.
Let
Y = Σ_{k=1}^{N} X_k
The Central Limit Theorem
Computer Experiment:
We consider the random variable
Z = Σ_{k=1}^{N} X_k
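Such an experiment can be run in a few lines. The sketch below (my choice of N = 12 independent U(0,1) variables, not the slides' exact setup) checks that Z has mean N/2 and variance N/12, matching the Gaussian limit predicted by the central limit theorem:

```python
import random
import statistics

random.seed(3)
N, trials = 12, 50_000
# Each sample of Z is a sum of N independent U(0,1) variables.
zs = [sum(random.random() for _ in range(N)) for _ in range(trials)]

print(statistics.mean(zs))       # ≈ N/2  = 6.0
print(statistics.variance(zs))   # ≈ N/12 = 1.0
```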
Random Process
Random processes have the following properties:
- Random processes are functions of time.
- Random processes are random in the sense that it is not possible to predict exactly what waveform will be observed in the future.
Suppose that we assign to each sample point s a function of time with the label
Random Process
Some Concepts
For a stationary process, the autocorrelation function depends only on the time difference:
R_X(t, s) = R_X(t − s)
Wide-sense Stationary Random Process
Properties of Autocorrelation Function
Property 2 (Symmetry): R_X(−τ) = R_X(τ).
The autocorrelation function is also bounded by its value at the origin:
R_X(τ) ≤ R_X(0)
for any τ.
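These properties show up directly in estimated autocorrelations. The sketch below (my example: zero-mean white Gaussian noise) estimates R_X(τ) from a sample path; the estimate peaks at τ = 0 and the lag estimator uses the symmetry R_X(−τ) = R_X(τ):

```python
import random

random.seed(4)
n = 100_000
x = [random.gauss(0.0, 1.0) for _ in range(n)]   # zero-mean white sequence

def R(tau):
    """Sample autocorrelation at lag tau (symmetry: R(-tau) = R(tau))."""
    tau = abs(tau)
    m = n - tau
    return sum(x[k] * x[k + tau] for k in range(m)) / m

print(R(0))   # ≈ 1.0, the mean-square value of the process
print(R(5))   # ≈ 0.0 for white noise, and |R(5)| <= R(0)
```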
Autocorrelation
Ergodic Process
Reference