Lecture 11
Solution.
\begin{align*}
EX &= \sum_{k=0}^{n} k f_X(k) \\
&= \sum_{k=1}^{n} k f_X(k) \\
&= \sum_{k=1}^{n} \frac{n!}{(k-1)!\,(n-k)!}\, p^k (1-p)^{n-k} \\
&= np \sum_{k=1}^{n} \frac{(n-1)!}{(k-1)!\,(n-k)!}\, p^{k-1} (1-p)^{n-k} \\
&= np \sum_{\ell=0}^{n-1} \binom{n-1}{\ell} p^{\ell} (1-p)^{n-1-\ell} \\
&= np\,(p + (1-p))^{n-1} \\
&= np.
\end{align*}
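As a sanity check, the defining sum can be evaluated numerically and compared with the closed form np. A quick sketch in Python (the function names are mine, not from the notes):

```python
import math

def binomial_pmf(k, n, p):
    # f_X(k) = C(n, k) * p^k * (1 - p)^(n - k)
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def binomial_mean(n, p):
    # EX = sum_{k=0}^{n} k * f_X(k); the derivation says this equals n*p
    return sum(k * binomial_pmf(k, n, p) for k in range(n + 1))
```

For example, `binomial_mean(10, 0.3)` agrees with np = 3 up to floating-point error.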
Remark. Although we define the expectations separately for discrete and continuous random
variables, they can be unified by the following:
\[
EX = \int_{-\infty}^{\infty} x \, dF_X(x),
\]
where the above integral is a Riemann–Stieltjes integral. This makes sense even if X is
neither discrete nor continuous.
Example 2. Suppose that X ∼ Unif[a, b]. Then
\[
EX = \int_{-\infty}^{\infty} x f_X(x)\, dx = \int_a^b \frac{x}{b-a}\, dx = \frac{b^2 - a^2}{2(b-a)} = \frac{a+b}{2}.
\]
Example 3. Suppose that X ∼ Exp(λ). Find EX.
Solution.
\begin{align*}
EX &= \int_{-\infty}^{\infty} x f_X(x)\, dx \\
&= \int_0^{\infty} x \lambda e^{-\lambda x}\, dx \\
&= \left. -x e^{-\lambda x} \right|_0^{\infty} + \int_0^{\infty} e^{-\lambda x}\, dx \\
&= \left. -\frac{1}{\lambda} e^{-\lambda x} \right|_0^{\infty} = \frac{1}{\lambda}.
\end{align*}
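The value 1/λ can be confirmed by numerically integrating x·λe^{-λx} over a truncated range (a sketch of mine; the truncation point 50 is an arbitrary choice at which the tail is negligible):

```python
import math

def exp_mean(lam, upper=50.0, steps=200_000):
    # midpoint-rule approximation of EX = integral_0^inf x * lam * e^(-lam*x) dx,
    # truncated at `upper` (the remaining tail is negligible for moderate lam)
    dx = upper / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * dx
        total += x * lam * math.exp(-lam * x) * dx
    return total
```

For example, `exp_mean(2.0)` is close to 1/2, matching EX = 1/λ.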
Example 4. A fair coin is flipped until we see the first heads. If the first heads appears on flip n, you receive 2^n dollars. Let X denote your reward. Find EX.

Solution. We have P(X = 2^n) = 2^{-n} for all positive integers n, because X = 2^n means the first (n − 1) flips are tails and the last one is heads. So
\[
EX = \sum_{n=1}^{\infty} 2^n 2^{-n} = \sum_{n=1}^{\infty} 1 = \infty.
\]
Remark. Recall that this game will end eventually with probability 1. In particular, with
probability 1, X is finite. However, EX is infinite. This example is also known as the
St. Petersburg paradox.
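The divergence is easy to see computationally: every term of the defining series equals 1, so the partial sums grow without bound (a small sketch of mine):

```python
def truncated_expectation(N):
    # partial sum sum_{n=1}^{N} 2^n * 2^(-n): every term equals exactly 1,
    # so the partial sums grow like N and the full series diverges
    return sum(2**n * 2 ** (-n) for n in range(1, N + 1))
```

For example, `truncated_expectation(500)` returns exactly 500.0: truncating the game at any finite number of rounds gives a finite expected reward, but no finite value bounds them all.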
Example 5. Suppose that X is a random variable with density f_X(x) = x^{-2} for x ≥ 1 (and f_X(x) = 0 otherwise). Find EX.
Solution.
\[
EX = \int_{-\infty}^{\infty} x f_X(x)\, dx = \int_1^{\infty} x \cdot x^{-2}\, dx = \int_1^{\infty} x^{-1}\, dx = \left. \log x \right|_1^{\infty} = \infty.
\]
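The truncated expectation grows like log M, so it diverges, but only slowly. This can be checked numerically (a sketch of mine, not part of the notes):

```python
import math

def truncated_mean(M, steps=200_000):
    # midpoint-rule approximation of integral_1^M x * x^(-2) dx
    #   = integral_1^M dx / x = log M,
    # which grows without bound as M -> infinity
    dx = (M - 1.0) / steps
    return sum(dx / (1.0 + (i + 0.5) * dx) for i in range(steps))
```

For instance, `truncated_mean(1000.0)` is close to log 1000 ≈ 6.9.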
In the next examples we cannot give the expectation any value whatsoever, finite or
infinite.
Example 6. A fair coin is flipped until we see the first heads. Let n denote the number of
flips needed. If n is odd, you pay me 2n dollars. If n is even, I pay you 2n dollars. Let X
denote my net reward. Can I calculate my expected net reward?
Solution. We have
\[
P(X = 2^n) = 2^{-n} \ \text{for odd } n \ge 1 \qquad \text{and} \qquad P(X = -2^n) = 2^{-n} \ \text{for even } n \ge 1.
\]
Each term of the defining sum contributes (±2^n)·2^{-n} = ±1, so the positive and negative parts of the sum are both infinite, and EX cannot be assigned any value, finite or infinite.
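The trouble is visible in the partial sums of the defining series, which oscillate forever instead of settling (a small sketch of mine):

```python
def partial_sums(N):
    # each round n contributes (+2^n) * 2^(-n) = +1 for odd n and
    # (-2^n) * 2^(-n) = -1 for even n, so the partial sums never settle
    out, s = [], 0
    for n in range(1, N + 1):
        s += 1 if n % 2 == 1 else -1
        out.append(s)
    return out
```

For example, `partial_sums(6)` returns `[1, 0, 1, 0, 1, 0]`: the sum has no limit, which is why no expected value can be assigned.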
Example 7. A fair die is rolled. You lose 1 dollar if the outcome is 1, 2, or 3; you win 1 dollar if it is 4; and you win 2 dollars if it is 5 or 6. Let W denote your net winnings. What is EW?
If X denotes the outcome of the die roll, then W = g(X), where g(1) = g(2) = g(3) = −1,
g(4) = 1, g(5) = g(6) = 2.
Solution.
\[
EW = -1 \cdot P(W = -1) + 1 \cdot P(W = 1) + 2 \cdot P(W = 2) = (-1) \cdot \frac{1}{2} + 1 \cdot \frac{1}{6} + 2 \cdot \frac{1}{3} = \frac{1}{3}.
\]
There is an alternative way to think about this expectation. We can compute P(W = w) by finding all the values of x for which g(x) = w and adding up the probabilities:
\begin{align*}
EW &= g(1)P(X = 1) + g(2)P(X = 2) + g(3)P(X = 3) \\
&\quad + g(4)P(X = 4) + g(5)P(X = 5) + g(6)P(X = 6) \\
&= \frac{1}{3}.
\end{align*}
We see that
\[
Eg(X) = \sum_{x=1}^{6} g(x) P(X = x).
\]
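This formula is easy to verify for the die example with exact rational arithmetic (a sketch of mine using Python's standard `fractions` module):

```python
from fractions import Fraction

# payoff function from the example: g(1) = g(2) = g(3) = -1, g(4) = 1, g(5) = g(6) = 2
g = {1: -1, 2: -1, 3: -1, 4: 1, 5: 2, 6: 2}

# Eg(X) = sum_x g(x) * P(X = x), with P(X = x) = 1/6 for a fair die
EW = sum(g[x] * Fraction(1, 6) for x in range(1, 7))
```

Here `EW` equals exactly 1/3, matching the computation grouped by the values of W.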
For continuous random variables we cannot repeat the same computation (since f_X(x) does not represent a probability), but the same formula still holds.
Proposition 0.4. If X is a continuous random variable and g : R → R, then
\[
Eg(X) = \int_{-\infty}^{\infty} g(x) f_X(x)\, dx.
\]
We first show that if Y is a nonnegative random variable, then
\[
EY = \int_0^{\infty} P(Y > t)\, dt.
\]
Proof. We have
\begin{align*}
EY &= \int_{\mathbb{R}} y \, dF_Y(y) \\
&= \int_{\mathbb{R}} \int_0^{y} dt \, dF_Y(y) \\
&= \int_{\mathbb{R}} \int_0^{\infty} I_{\{t < y\}} \, dt \, dF_Y(y) \\
&= \int_0^{\infty} \int_{\mathbb{R}} I_{\{y > t\}} \, dF_Y(y) \, dt \\
&= \int_0^{\infty} \int_t^{\infty} dF_Y(y) \, dt \\
&= \int_0^{\infty} P(Y > t) \, dt,
\end{align*}
where the interchange of the order of integration is justified by Tonelli's theorem, since the integrand is nonnegative.
Using the same idea, one can show that for a general random variable Y such that EY is defined, one has
\[
EY = \int_0^{\infty} P(Y > t)\, dt - \int_0^{\infty} P(Y < -t)\, dt.
\]
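The tail formula can be checked numerically for a concrete nonnegative variable. For Y ∼ Exp(λ) we have P(Y > t) = e^{−λt}, and the formula predicts EY = 1/λ (a sketch of mine; the truncation at 50 is an arbitrary point where the tail is negligible):

```python
import math

def tail_integral(lam, upper=50.0, steps=200_000):
    # midpoint-rule approximation of integral_0^inf P(Y > t) dt
    # for Y ~ Exp(lam), where P(Y > t) = e^(-lam*t);
    # the tail formula says this should equal EY = 1/lam
    dt = upper / steps
    return sum(math.exp(-lam * (i + 0.5) * dt) * dt for i in range(steps))
```

For example, `tail_integral(2.0)` is close to 1/2, agreeing with EY = 1/λ computed directly from the density.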
We now prove the proposition.
Proof. First,
\[
Eg(X) = \int_0^{\infty} P(g(X) > t)\, dt - \int_0^{\infty} P(g(X) < -t)\, dt.
\]
Note that
\begin{align*}
\int_0^{\infty} P(g(X) > t)\, dt &= \int_0^{\infty} \int_{\{x : g(x) > t\}} f_X(x)\, dx\, dt \\
&= \int_{\{x : g(x) > 0\}} \int_0^{g(x)} f_X(x)\, dt\, dx \\
&= \int_{\{x : g(x) > 0\}} g(x) f_X(x)\, dx.
\end{align*}
Similarly,
\[
\int_0^{\infty} P(g(X) < -t)\, dt = -\int_{\{x : g(x) < 0\}} g(x) f_X(x)\, dx.
\]
Combining, we obtain the desired result.
Example 8. A stick of length L is broken at a uniformly chosen random location. What is
the expected length of the longer piece?
Solution. Let the interval [0, L] represent the stick and let X ∼ Unif[0, L] be the position where the stick is broken. Let g(x) denote the length of the longer piece:
\[
g(x) =
\begin{cases}
L - x & \text{if } 0 \le x \le L/2, \\
x & \text{if } L/2 \le x \le L.
\end{cases}
\]
Then
\[
Eg(X) = \int_{-\infty}^{\infty} g(x) f_X(x)\, dx = \int_0^{L/2} \frac{L-x}{L}\, dx + \int_{L/2}^{L} \frac{x}{L}\, dx = \frac{3L}{4}.
\]
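Since g(x) = max(x, L − x), the answer 3L/4 can be confirmed by numerically integrating g(x)·(1/L) over [0, L] (a sketch of mine, not part of the notes):

```python
def longer_piece_mean(L, steps=100_000):
    # midpoint-rule approximation of Eg(X) = integral_0^L max(x, L - x) * (1/L) dx,
    # where max(x, L - x) is the length of the longer piece when breaking at x
    dx = L / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * dx
        total += max(x, L - x) / L * dx
    return total
```

For example, `longer_piece_mean(1.0)` is close to 0.75, i.e. 3L/4 for L = 1.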
0.3 Variance
Definition 0.6. Let X be a random variable with mean µ. The variance of X is defined by