LECT3 Probability Theory
Digital Communication
Random Variables:
A random variable X assigns a real number to each outcome of a random
experiment; its cumulative distribution function (CDF) is $F_X(x) = P(X \le x)$.
Properties of the CDF:
1. $0 \le F_X(x) \le 1$
2. $F_X(\infty) = 1$
3. $F_X(-\infty) = 0$
Properties of pdf:
1. $f_X(x) \ge 0$ for all $x$
2. $\int_{-\infty}^{\infty} f_X(x)\,dx = F_X(\infty) - F_X(-\infty) = 1 - 0 = 1$
3. $F_X(x) = \int_{-\infty}^{x} f_X(u)\,du$
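The two integral properties can be checked numerically. Below is a minimal sketch using an assumed exponential pdf $f(x) = \lambda e^{-\lambda x}$, $x \ge 0$ (not from the slides, just a convenient example); a simple Riemann sum stands in for the integrals.

```python
import math

# Assumed example pdf: exponential with rate lam
lam = 2.0
f = lambda x: lam * math.exp(-lam * x)

dx = 1e-4
# Property 2: the pdf integrates to 1 (integrate over [0, 50]; the tail
# beyond 50 is negligible for this pdf).
area = sum(f(i * dx) * dx for i in range(int(50 / dx)))

# Property 3: the CDF is the running integral of the pdf; the closed form
# here is F(x) = 1 - exp(-lam * x).
x = 0.7
F_num = sum(f(i * dx) * dx for i in range(int(x / dx)))

print(abs(area - 1.0) < 1e-3)                          # True
print(abs(F_num - (1 - math.exp(-lam * x))) < 1e-3)    # True
```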
Joint Distributions:
The joint CDF of two RVs X and Y is
$F_{XY}(x, y) = P(X \le x,\, Y \le y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f_{XY}(u, v)\,du\,dv$
$P(x_1 < X \le x_2,\, y_1 < Y \le y_2) = \int_{y_1}^{y_2} \int_{x_1}^{x_2} f_{XY}(x, y)\,dx\,dy$
$\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y)\,dx\,dy = 1$
$F_X(x) = P(X \le x,\, Y \le \infty) = \int_{-\infty}^{\infty} \int_{-\infty}^{x} f_{XY}(u, y)\,du\,dy$
$f_X(x) = \frac{d}{dx} F_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dy$
i.e. the pdf of a single random variable can be obtained from its
joint pdf with a second random variable
Similarly,
$f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dx$
i.e., the pdf fX(x) is obtained from the joint pdf fXY(x,y) by simply
integrating it over all possible values of the undesired random
variable
The pdfs fX(x) and fY(y) are called MARGINAL DENSITIES
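Integrating out the undesired variable is easy to verify numerically. The sketch below uses an assumed textbook joint pdf $f_{XY}(x, y) = x + y$ on the unit square (not from the slides), whose marginal is $f_X(x) = x + 1/2$.

```python
# Assumed joint pdf on [0,1] x [0,1]
f_xy = lambda x, y: x + y

dy = 1e-4
def marginal_x(x):
    # Integrate the joint pdf over all possible values of Y (here [0, 1])
    return sum(f_xy(x, j * dy) * dy for j in range(int(1 / dy)))

# Closed form of the marginal is f_X(x) = x + 1/2
print(abs(marginal_x(0.3) - 0.8) < 1e-3)  # True
```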
15
Conditional pdf:
The conditional pdf of Y given that X = x is defined by
$f_Y(y \mid x) = \frac{f_{XY}(x, y)}{f_X(x)}$
Properties:
1. $f_Y(y \mid x) \ge 0$
2. $\int_{-\infty}^{\infty} f_Y(y \mid x)\,dy = 1$
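Property 2 can be checked numerically. The sketch below assumes the joint pdf $f_{XY}(x, y) = x + y$ on the unit square (a standard textbook example, not from the slides), whose marginal is $f_X(x) = x + 1/2$, giving $f_Y(y \mid x) = (x + y)/(x + 1/2)$.

```python
# Conditional pdf for the assumed joint pdf f_XY(x, y) = x + y on the
# unit square: divide the joint pdf by the marginal f_X(x) = x + 1/2.
x = 0.3
f_cond = lambda y: (x + y) / (x + 0.5)

dy = 1e-4
total = sum(f_cond(j * dy) * dy for j in range(int(1 / dy)))
print(abs(total - 1.0) < 1e-3)  # True: the conditional pdf integrates to 1
```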
Statistical Independence:
If X and Y are statistically independent, the joint pdf factors as
$f_{XY}(x, y) = f_X(x) f_Y(y)$, and therefore
$P(x_1 < X \le x_2,\, y_1 < Y \le y_2) = \int_{x_1}^{x_2} f_X(x)\,dx \int_{y_1}^{y_2} f_Y(y)\,dy$
Statistical Average:
Let us consider the problem of determining the average height of
the entire population of a country.
If the data is recorded to within an inch, then the height X of every
person will be approximated to one of the n numbers x1, x2, ..., xn.
If there are Ni persons of height xi, then the average height is given by
$\bar{X} = \frac{N_1 x_1 + N_2 x_2 + \cdots + N_n x_n}{N}$
where N is the total number of persons, i.e.,
$\bar{X} = \frac{N_1}{N} x_1 + \frac{N_2}{N} x_2 + \cdots + \frac{N_n}{N} x_n$
As N becomes large, the ratio Ni/N approaches the probability P(xi), so
$\bar{X} = \sum_{i=1}^{n} x_i P(x_i)$
This is the expectation (mean) of the discrete RV X:
$\bar{X} \equiv E[X] = \sum_i x_i P(x_i) = m$
For a continuous RV, the probability $P(x_i < X \le x_i + \Delta x) \equiv P(x_i)$
is given approximately by
$P(x_i) = f(x_i)\,\Delta x$
So we have
$m = \sum_i x_i f(x_i)\,\Delta x$
In the limit $\Delta x \to 0$,
$m = \int_{-\infty}^{\infty} x f(x)\,dx$
Therefore
$E[X] = \int_{-\infty}^{\infty} x f(x)\,dx$
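The relative-frequency argument above can be sketched in code: the average of many draws approaches $\sum_i x_i P(x_i)$. The three-point distribution below is an assumed toy example, not from the slides.

```python
import random

# Assumed toy distribution: X in {1, 2, 3} with P = {0.2, 0.5, 0.3}
random.seed(0)
values, probs = [1, 2, 3], [0.2, 0.5, 0.3]
expected = sum(v * p for v, p in zip(values, probs))  # m = 2.1

# Draw many samples; the Ni/N weights in the sample mean converge to P(xi)
N = 200_000
samples = random.choices(values, weights=probs, k=N)
sample_mean = sum(samples) / N
print(abs(sample_mean - expected) < 0.01)  # True (with this seed)
```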
The mean of X is $\bar{x} = E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx$.
If Y = g(X), the mean of Y is given by
$E[Y] = \int_{-\infty}^{\infty} y f_Y(y)\,dy$
or, equivalently, directly in terms of the pdf of X,
$E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx$
For a function g(X, Y) of two RVs,
$E[g(X, Y)] = \sum_x \sum_y g(x, y) P(x, y)$ (discrete case)
$E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f(x, y)\,dx\,dy$ (continuous case)
Example: Let Z = XY, where X and Y are independent RVs. Then
$E[Z] = \bar{Z} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy\, f_{XY}(x, y)\,dx\,dy$
Since X and Y are independent, $f_{XY}(x, y) = f_X(x) f_Y(y)$, so
$\bar{Z} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy\, f_X(x) f_Y(y)\,dx\,dy = \int_{-\infty}^{\infty} x f_X(x)\,dx \int_{-\infty}^{\infty} y f_Y(y)\,dy = \bar{X}\,\bar{Y} = m_x m_y$
Properties of Expectation:
1. If c is any constant, then E(cX) = c E(X)
2. If X and Y are any RVs, then E(X+Y) = E(X) + E(Y)
3. If X and Y are independent RVs, then E(XY) = E(X) E(Y)
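A quick Monte-Carlo sketch of all three properties, using two independent uniform RVs (an assumed toy example, not from the slides):

```python
import random

random.seed(1)
N = 100_000
X = [random.uniform(0, 1) for _ in range(N)]   # independent samples of X
Y = [random.uniform(0, 2) for _ in range(N)]   # independent samples of Y
E = lambda zs: sum(zs) / len(zs)               # sample mean

c = 3.0
prop1 = abs(E([c * x for x in X]) - c * E(X))                     # E(cX) = c E(X)
prop2 = abs(E([x + y for x, y in zip(X, Y)]) - (E(X) + E(Y)))     # E(X+Y) = E(X)+E(Y)
prop3 = abs(E([x * y for x, y in zip(X, Y)]) - E(X) * E(Y))       # E(XY) = E(X)E(Y)
print(prop1 < 1e-6, prop2 < 1e-6, prop3 < 0.01)
```

Properties 1 and 2 hold exactly on the samples (up to floating-point rounding); property 3 holds only in the limit of many samples because it relies on independence.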
Moments:
If the function g(X) is X raised to a power, i.e., g(X) = X^n, the
average value E[X^n] is referred to as the nth moment of the
random variable:
$E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$
n = 1 gives the first moment of X (its mean $\bar{X}$); n = 2 gives the
second moment (the mean-square value):
$E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx$
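As a numerical sketch, the moment integral can be evaluated by a Riemann sum. The exponential pdf $f(x) = e^{-x}$, $x \ge 0$, is an assumed example; its nth moment is $n!$, so $E[X] = 1$ and $E[X^2] = 2$.

```python
import math

# Assumed example pdf: f(x) = exp(-x), x >= 0
f = lambda x: math.exp(-x)
dx = 1e-4

def moment(n):
    # Integrate x^n f(x) over [0, 40]; the tail beyond 40 is negligible
    return sum((i * dx) ** n * f(i * dx) * dx for i in range(int(40 / dx)))

print(abs(moment(1) - 1.0) < 1e-6, abs(moment(2) - 2.0) < 1e-6)
```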
Central Moments:
Moments of the difference between a random variable and its mean $\bar{X}$.
The nth central moment is
$E[(X - \bar{X})^n] = \int_{-\infty}^{\infty} (x - \bar{X})^n f_X(x)\,dx$
The second central moment (n = 2) is the variance $\sigma^2$.
Properties of Variance:
1. $\sigma^2 = E[(X - m)^2] = E(X^2) - m^2$
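The identity $\sigma^2 = E(X^2) - m^2$ is easy to confirm numerically; the three-point discrete RV below is an assumed toy example.

```python
# Assumed toy RV: X in {0, 1, 4} with P = {0.5, 0.3, 0.2}
values, probs = [0, 1, 4], [0.5, 0.3, 0.2]
m  = sum(v * p for v, p in zip(values, probs))       # E[X]   = 1.1
m2 = sum(v * v * p for v, p in zip(values, probs))   # E[X^2] = 3.5
# Variance computed directly from its definition E[(X - m)^2]
var_central = sum((v - m) ** 2 * p for v, p in zip(values, probs))
print(abs(var_central - (m2 - m * m)) < 1e-12)  # True: both give 2.29
```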
The Gaussian (normal) pdf is
$f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x - m)^2 / 2\sigma^2}$
Its mean is
$\bar{X} = \int_{-\infty}^{\infty} \frac{x}{\sqrt{2\pi\sigma^2}}\, e^{-(x - m)^2 / 2\sigma^2}\,dx = m$
its second central moment (variance) is
$E[(X - m)^2] = \int_{-\infty}^{\infty} \frac{(x - m)^2}{\sqrt{2\pi\sigma^2}}\, e^{-(x - m)^2 / 2\sigma^2}\,dx = \sigma^2$
and, as required of any pdf,
$\int_{-\infty}^{\infty} f(x)\,dx = 1$
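The three Gaussian integrals can be checked by a Riemann sum; m = 1 and sigma = 2 below are assumed example values.

```python
import math

# Assumed example parameters for the Gaussian pdf
m, sigma = 1.0, 2.0
f = lambda x: math.exp(-(x - m) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

# Integrate over m +/- 10 sigma, where the tails are negligible
dx = 1e-3
xs = [m - 10 * sigma + i * dx for i in range(int(20 * sigma / dx))]
area = sum(f(x) * dx for x in xs)                 # should be 1
mean = sum(x * f(x) * dx for x in xs)             # should be m
var  = sum((x - m) ** 2 * f(x) * dx for x in xs)  # should be sigma^2
print(abs(area - 1) < 1e-6, abs(mean - m) < 1e-6, abs(var - sigma ** 2) < 1e-6)
```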
Joint Moments:
The joint moment for a pair of RVs X and Y is defined by
$E[X^j Y^k] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^j y^k f_{XY}(x, y)\,dx\,dy$
or, in the discrete case,
$E[X^j Y^k] = \sum_x \sum_y x^j y^k P(x, y)$
Transformations of RVs
Question: How to determine the pdf of a RV Y related to another
RV X by the transformation Y = g(X)?
Monotone Transformations
Let X be a RV with pdf fX(x) and let Y = g(X) be a monotone
differentiable function of X.
Both the events $(y < Y \le y + dy)$ and $(x < X \le x + dx)$ contain the
same outcomes, so the probabilities of these two events must be equal:
$P(y < Y \le y + dy) = P(x < X \le x + dx)$
$f_Y(y)\,dy = f_X(x)\,dx$ if g(x) is a monotone increasing function
$f_Y(y)\,dy = -f_X(x)\,dx$ if g(x) is a monotone decreasing function
Combining the two cases,
$f_Y(y)\,|dy| = f_X(x)\,|dx|$, i.e., $f_Y(y) = \left.\frac{f_X(x)}{|g'(x)|}\right|_{x = g^{-1}(y)}$
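A Monte-Carlo sketch of the monotone rule, using the assumed map Y = 2X + 1 with X uniform on [0, 1] (not from the slides): here $f_X(x) = 1$ and $|dy/dx| = 2$, so Y should be uniform on [1, 3] with $f_Y(y) = 0.5$.

```python
import random

random.seed(2)
N = 200_000
# Samples of Y = g(X) = 2X + 1 with X ~ Uniform[0, 1]
ys = [2 * random.uniform(0, 1) + 1 for _ in range(N)]

# Estimate the density of Y on the bin [1.5, 1.6) from the samples
width = 0.1
density = sum(1 for y in ys if 1.5 <= y < 1.6) / (N * width)
print(abs(density - 0.5) < 0.03)  # True (with this seed)
```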
Many-to-one Transformations
Let the equation g(x) = y have three roots x1, x2 and x3. Then the event
$(y < Y \le y + dy)$ is the union of three disjoint events in x, one per root, so
$f_Y(y) = \frac{f_X(x_1)}{|g'(x_1)|} + \frac{f_X(x_2)}{|g'(x_2)|} + \frac{f_X(x_3)}{|g'(x_3)|} = \sum_i \frac{f_X(x_i)}{|g'(x_i)|}$
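The sum-over-roots rule $f_Y(y) = \sum_i f_X(x_i)/|g'(x_i)|$ can be sketched with a two-root example (assumed, not from the slides): Y = X² with X standard Gaussian. The roots of $x^2 = y$ are $\pm\sqrt{y}$ and $|g'(x)| = 2\sqrt{y}$ at both, so $f_Y(y) = 2\,\phi(\sqrt{y})/(2\sqrt{y}) = \phi(\sqrt{y})/\sqrt{y}$.

```python
import math, random

# Standard Gaussian pdf and the predicted pdf of Y = X^2
phi = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
f_Y = lambda y: phi(math.sqrt(y)) / math.sqrt(y)

# Monte-Carlo check: the fraction of samples of Y in [0.5, 0.6) should be
# close to f_Y at the bin center times the bin width.
random.seed(3)
N = 400_000
ys = [random.gauss(0, 1) ** 2 for _ in range(N)]
frac = sum(1 for y in ys if 0.5 <= y < 0.6) / N
predicted = f_Y(0.55) * 0.1
print(abs(frac - predicted) < 0.005)  # True (with this seed)
```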