Lecture Notes 1
36-705
Brief Review of Basic Probability
I assume you already know basic probability. Chapters 1-3 are a review, and I will assume you have read and understood them. If not, you should be in 36-700.
Random Variables
Let $\Omega$ be a sample space (a set of possible outcomes) with a probability distribution (also called a probability measure) $P$. A random variable is a map $X : \Omega \to \mathbb{R}$. We write
$$P(X \in A) = P(\{\omega \in \Omega : X(\omega) \in A\})$$
and we write $X \sim P$ to mean that $X$ has distribution $P$. The cumulative distribution function (cdf) of $X$ is
$$F_X(x) = F(x) = P(X \leq x).$$
A cdf has three properties:
1. $F$ is right-continuous: at each $x$, $F(x) = \lim_{n\to\infty} F(y_n)$ for any sequence $y_n \to x$ with $y_n > x$.
2. $F$ is non-decreasing: if $x < y$ then $F(x) \leq F(y)$.
3. $F$ is normalized: $\lim_{x\to-\infty} F(x) = 0$ and $\lim_{x\to\infty} F(x) = 1$.
Conversely, any F satisfying these three properties is a cdf for some random variable.
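These properties are easy to check numerically for a concrete cdf. The sketch below is an added illustration (not part of the notes); it uses NumPy and the Exponential(1) cdf $F(x) = 1 - e^{-x}$, chosen purely as an example:

```python
import numpy as np

# Exponential(1) cdf: F(x) = 1 - exp(-x) for x >= 0, and 0 for x < 0.
def F(x):
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    pos = x >= 0
    out[pos] = 1.0 - np.exp(-x[pos])
    return out

grid = np.linspace(-5, 50, 10_000)
vals = F(grid)

# Property 1: right-continuity at 0 (the only non-smooth point).
right_cont = abs(F([1e-12])[0] - F([0.0])[0]) < 1e-9

# Property 2: non-decreasing along the grid.
nondecreasing = bool(np.all(np.diff(vals) >= 0))

# Property 3: normalized in the limits.
left_limit = F([-1e6])[0]    # should be 0
right_limit = F([1e6])[0]    # should be 1
```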
If $X$ is discrete, its probability mass function (pmf) is
$$p_X(x) = p(x) = P(X = x).$$
If $X$ is continuous, its probability density function (pdf) satisfies
$$P(X \in A) = \int_A p_X(x)\,dx = \int_A p(x)\,dx.$$
We write $X \sim F$ or $X \sim p$.
Suppose that $X \sim P$ and $Y \sim Q$. We say that $X$ and $Y$ have the same distribution if $P(X \in A) = Q(Y \in A)$ for all $A$. In that case we say that $X$ and $Y$ are equal in distribution and we write $X \stackrel{d}{=} Y$.
Expected Values
The expected value of $g(X)$ is
$$E(g(X)) = \int g(x)\,dF(x) = \begin{cases} \int g(x)p(x)\,dx & \text{if } X \text{ is continuous} \\ \sum_x g(x)p(x) & \text{if } X \text{ is discrete.} \end{cases}$$
Recall that:
1. $E\left(\sum_{j=1}^k c_j g_j(X)\right) = \sum_{j=1}^k c_j E(g_j(X))$.
2. If $X_1, \ldots, X_n$ are independent then
$$E\left(\prod_{i=1}^n X_i\right) = \prod_{i=1}^n E(X_i).$$
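The product rule for independent random variables can be checked by Monte Carlo. The sketch below is an added illustration (not from the notes); the three distributions are chosen arbitrarily, with means $1$, $1$, and $3$, so $E(X_1 X_2 X_3) = 3$:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 200_000

# Three independent draws with known means:
# X1 ~ Uniform(0, 2) (mean 1), X2 ~ Exponential(1) (mean 1), X3 ~ N(3, 1) (mean 3).
x1 = rng.uniform(0, 2, n_samples)
x2 = rng.exponential(1.0, n_samples)
x3 = rng.normal(3.0, 1.0, n_samples)

mean_of_product = (x1 * x2 * x3).mean()               # estimates E(X1 X2 X3)
product_of_means = x1.mean() * x2.mean() * x3.mean()  # estimates E(X1) E(X2) E(X3)
```

Both quantities estimate the same value, $3$; for dependent variables the two would generally differ.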
6. If $X_1, \ldots, X_n$ are independent then
$$V\left(\sum_{i=1}^n a_i X_i\right) = \sum_{i=1}^n a_i^2 V(X_i).$$
7. The covariance is
$$\mathrm{Cov}(X, Y) = E((X - \mu_X)(Y - \mu_Y)) = E(XY) - \mu_X \mu_Y$$
and the correlation is $\rho(X, Y) = \mathrm{Cov}(X, Y)/(\sigma_X \sigma_Y)$. Recall that $-1 \leq \rho(X, Y) \leq 1$.
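As an added numerical illustration (not part of the notes), a pair with known covariance can be simulated and compared with the formulas above; here $Y = 2X + \varepsilon$ with $X, \varepsilon \sim N(0,1)$ independent, so $\mathrm{Cov}(X, Y) = 2$ and $\rho = 2/\sqrt{5}$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Y = 2X + noise, so Cov(X, Y) = 2 Var(X) = 2 and Var(Y) = 4 + 1 = 5.
x = rng.normal(0, 1, n)
y = 2 * x + rng.normal(0, 1, n)

cov_xy = np.cov(x, y)[0, 1]     # sample covariance, theory: 2
rho = np.corrcoef(x, y)[0, 1]   # sample correlation, theory: 2 / sqrt(5) ~ 0.894
```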
The conditional expectation of $Y$ given $X$ is the random variable $E(Y|X)$ whose value, when $X = x$, is
$$E(Y|X = x) = \int y\, p(y|x)\,dy$$
where $p(y|x) = p(x, y)/p(x)$.
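The conditional expectation can also be approximated from simulated draws by averaging $Y$ over a thin slice of $X$ values near $x$. The sketch below is an added illustration with a hypothetical joint distribution, $Y = X^2 + \varepsilon$, for which $E(Y|X = x) = x^2$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

# Hypothetical joint draw: X ~ Uniform(0, 1), Y = X^2 + N(0, 0.1) noise,
# so E(Y | X = x) = x^2.
x = rng.uniform(0, 1, n)
y = x**2 + rng.normal(0, 0.1, n)

# Estimate E(Y | X ~= 0.5) by averaging Y over a thin slice of X values.
slice_mask = np.abs(x - 0.5) < 0.01
cond_mean_estimate = y[slice_mask].mean()   # theory: 0.5**2 = 0.25
```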
Transformations
Let $Y = g(X)$. Then
$$F_Y(y) = P(Y \leq y) = P(g(X) \leq y) = \int_{A(y)} p_X(x)\,dx$$
where
$$A(y) = \{x : g(x) \leq y\}.$$
The density is $p_Y(y) = F_Y'(y)$. If $g$ is monotonic, then
$$p_Y(y) = p_X(h(y)) \left| \frac{dh(y)}{dy} \right|$$
where $h = g^{-1}$.
Example 2 Let $p_X(x) = e^{-x}$ for $x > 0$. Hence $F_X(x) = 1 - e^{-x}$. Let $Y = g(X) = \log X$. Then
$$F_Y(y) = P(Y \leq y) = P(\log(X) \leq y) = P(X \leq e^y) = F_X(e^y) = 1 - e^{-e^y}$$
for $y \in \mathbb{R}$.
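The derived cdf in Example 2 can be checked by simulation. This sketch is an added illustration (not from the notes): draw $X \sim \text{Exponential}(1)$, set $Y = \log X$, and compare the empirical cdf of $Y$ with $1 - e^{-e^y}$ at a few points:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000

# X ~ Exponential(1), i.e. p_X(x) = exp(-x) for x > 0; then Y = log X.
x = rng.exponential(1.0, n)
y_samples = np.log(x)

# Empirical cdf of Y vs. the derived F_Y(y) = 1 - exp(-e^y).
test_points = np.array([-1.0, 0.0, 1.0])
empirical = np.array([(y_samples <= t).mean() for t in test_points])
theoretical = 1.0 - np.exp(-np.exp(test_points))
max_error = float(np.max(np.abs(empirical - theoretical)))
```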
Independence
Random variables $X$ and $Y$ are independent if
$$P(X \in A, Y \in B) = P(X \in A)P(Y \in B)$$
for all $A$ and $B$. More generally, $X_1, \ldots, X_n$ are independent if
$$P(X_1 \in A_1, \ldots, X_n \in A_n) = \prod_{i=1}^n P(X_i \in A_i)$$
for all $A_1, \ldots, A_n$. If $X_1, \ldots, X_n$ are independent and each has the same distribution, we say they are iid (independent and identically distributed) and we write
$$X_1, \ldots, X_n \sim F \quad \text{or} \quad X_1, \ldots, X_n \sim p.$$
Important Distributions
Normal (Gaussian). $X \sim N(\mu, \sigma^2)$ if
$$p(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2/(2\sigma^2)}.$$
If $X \in \mathbb{R}^d$ then $X \sim N(\mu, \Sigma)$ if
$$p(x) = \frac{1}{(2\pi)^{d/2} |\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(x - \mu)^T \Sigma^{-1} (x - \mu)\right).$$
Chi-squared. $X \sim \chi^2_p$ if $X = \sum_{j=1}^p Z_j^2$ where $Z_1, \ldots, Z_p \sim N(0, 1)$ are independent.
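The definition can be checked directly by simulation (an added illustration, not from the notes), using the standard facts $E(X) = p$ and $V(X) = 2p$ for $X \sim \chi^2_p$:

```python
import numpy as np

rng = np.random.default_rng(4)
p, n = 5, 200_000

# X = sum of squares of p independent N(0, 1) draws, so X ~ chi^2_p.
z = rng.normal(0, 1, size=(n, p))
chi2_samples = (z**2).sum(axis=1)

sample_mean = chi2_samples.mean()   # theory: E(X) = p = 5
sample_var = chi2_samples.var()     # theory: V(X) = 2p = 10
```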
Bernoulli. $X \sim \mathrm{Bernoulli}(\theta)$ if $P(X = x) = \theta^x (1 - \theta)^{1-x}$ for $x = 0, 1$.
Binomial. $X \sim \mathrm{Binomial}(n, \theta)$ if
$$p(x) = P(X = x) = \binom{n}{x} \theta^x (1 - \theta)^{n-x}, \quad x \in \{0, \ldots, n\}.$$
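As an added sanity check (not from the notes), the Binomial pmf can be evaluated exactly over $\{0, \ldots, n\}$; it must sum to one, and its mean is $n\theta$ (a standard fact). The parameter values below are arbitrary:

```python
from math import comb

n, theta = 10, 0.3   # arbitrary example parameters

# Binomial(n, theta) pmf evaluated at each x in {0, ..., n}.
pmf = [comb(n, x) * theta**x * (1 - theta) ** (n - x) for x in range(n + 1)]

total = sum(pmf)                              # must equal 1
mean = sum(x * p for x, p in enumerate(pmf))  # theory: n * theta = 3.0
```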
Gamma. $X \sim \Gamma(\alpha, \beta)$ if
$$p(x) = \frac{1}{\beta^\alpha \Gamma(\alpha)}\, x^{\alpha-1} e^{-x/\beta}$$
for $x > 0$, where $\Gamma(\alpha) = \int_0^\infty y^{\alpha-1} e^{-y}\,dy$.
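As an added numerical check (not from the notes), the Gamma density can be integrated on a fine grid; it should integrate to one, and its mean is $\alpha\beta$ (a standard fact). The parameter values are arbitrary:

```python
import numpy as np
from math import gamma

alpha, beta = 2.5, 1.5   # arbitrary example parameters

# Gamma(alpha, beta) density on a fine grid; the tail beyond x = 60 is negligible.
x = np.linspace(1e-8, 60.0, 200_000)
dx = x[1] - x[0]
pdf = x ** (alpha - 1) * np.exp(-x / beta) / (beta ** alpha * gamma(alpha))

integral = float(pdf.sum() * dx)      # should be close to 1
mean = float((x * pdf).sum() * dx)    # theory: alpha * beta = 3.75
```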
Remark: In all of the above, make sure you understand the distinction between random
variables and parameters.
Theorem 9 Let $Y \sim N(\mu, \Sigma)$, where $\Sigma$ is $n \times n$. Then:
(a) $Y^T \Sigma^{-1} Y \sim \chi^2_n(\mu^T \Sigma^{-1} \mu)$.
(b) $(Y - \mu)^T \Sigma^{-1} (Y - \mu) \sim \chi^2_n(0)$.
Here $\chi^2_n(\delta)$ denotes the non-central chi-squared distribution with non-centrality parameter $\delta$, so (b) is an ordinary (central) chi-squared.
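Part (b) can be checked by simulation: since a central $\chi^2_n$ has mean $n$, the quadratic form should average to the dimension. The sketch below is an added illustration with $\mu$ and $\Sigma$ chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(5)
d, n_samples = 3, 100_000

# Arbitrary mean and positive definite covariance for illustration.
mu = np.array([1.0, -2.0, 0.5])
A = np.array([[2.0, 0.3, 0.0], [0.3, 1.0, 0.2], [0.0, 0.2, 1.5]])
Sigma = A @ A.T
Y = rng.multivariate_normal(mu, Sigma, size=n_samples)

# Quadratic form (Y - mu)^T Sigma^{-1} (Y - mu), one value per draw.
Sigma_inv = np.linalg.inv(Sigma)
diff = Y - mu
q = np.einsum("ij,jk,ik->i", diff, Sigma_inv, diff)

qf_mean = q.mean()   # theory: chi^2_d has mean d = 3
```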
Let
$$\bar{X}_n = \frac{1}{n} \sum_i X_i \quad \text{and} \quad S_n^2 = \frac{1}{n-1} \sum_i (X_i - \bar{X}_n)^2$$
denote the sample mean and sample variance.
Theorem 10 If $X_1, \ldots, X_n \sim N(\mu, \sigma^2)$ then
(a) $\bar{X}_n \sim N\left(\mu, \frac{\sigma^2}{n}\right)$.
(b) $\frac{(n-1)S_n^2}{\sigma^2} \sim \chi^2_{n-1}$.
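Both parts of Theorem 10 lend themselves to a simulation check. The sketch below is an added illustration with arbitrary parameters: repeat the experiment many times, then compare the variance of $\bar{X}_n$ with $\sigma^2/n$ and the mean of $(n-1)S_n^2/\sigma^2$ with $n-1$ (the mean of a $\chi^2_{n-1}$):

```python
import numpy as np

rng = np.random.default_rng(6)
mu, sigma, n, reps = 2.0, 3.0, 10, 100_000   # arbitrary example parameters

samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)
s2 = samples.var(axis=1, ddof=1)   # sample variance with n - 1 divisor

# (a) Xbar_n should have mean mu = 2 and variance sigma^2 / n = 0.9.
xbar_mean = xbar.mean()
xbar_var = xbar.var()

# (b) (n - 1) S_n^2 / sigma^2 should be chi^2_{n-1}, with mean n - 1 = 9.
chi2_stat_mean = ((n - 1) * s2 / sigma**2).mean()
```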