36-401 Modern Regression Homework #1 Solutions
DUE: September 8, 2017
Problem 1 (20 pts.)
Now assume
\[
E\!\left[\,\sum_{j=1}^{k} a_j X_j\right] = \sum_{j=1}^{k} a_j E[X_j]
\]
holds for some $k \in \mathbb{Z}^{+}$, and define
\[
Y := \sum_{j=1}^{k} a_j X_j .
\]
Then
\[
\begin{aligned}
E\!\left[\,\sum_{j=1}^{k+1} a_j X_j\right] &= E[\,Y + a_{k+1} X_{k+1}\,] \\
&= E[Y] + a_{k+1} E[X_{k+1}] \qquad (6) \\
&= \sum_{j=1}^{k+1} a_j E[X_j],
\end{aligned}
\]
where the step labeled (6) uses linearity of expectation for two random variables. This completes the induction.
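As a quick numerical illustration of the identity just proved (an addition, not part of the original solution), the following Python sketch estimates both sides by Monte Carlo; the particular distributions, coefficients, and sample size are arbitrary choices made only for demonstration.

    import numpy as np

    rng = np.random.default_rng(0)
    n_samples = 1_000_000  # Monte Carlo sample size (illustrative choice)

    # Linearity of expectation needs no independence; these happen to be independent.
    X1 = rng.exponential(scale=2.0, size=n_samples)       # E[X1] = 2
    X2 = rng.normal(loc=-1.0, scale=3.0, size=n_samples)  # E[X2] = -1
    X3 = rng.binomial(n=10, p=0.3, size=n_samples)        # E[X3] = 3
    a = np.array([0.5, -2.0, 4.0])

    lhs = np.mean(a[0] * X1 + a[1] * X2 + a[2] * X3)              # estimate of E[sum_j a_j X_j]
    rhs = a[0] * X1.mean() + a[1] * X2.mean() + a[2] * X3.mean()  # estimate of sum_j a_j E[X_j]
    print(lhs, rhs)  # the two estimates agree up to Monte Carlo error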
\[
\begin{aligned}
\sum_{j=1}^{\infty} P(X \ge j) &= \sum_{j=1}^{\infty} \sum_{k=j}^{\infty} P(X = k) \qquad (7) \\
&= \sum_{k=1}^{\infty} \sum_{j=1}^{k} P(X = k) \qquad (8) \\
&= \sum_{k=1}^{\infty} k \cdot P(X = k) \\
&= E[X].
\end{aligned}
\]
To understand what is happening with the interchange of summations in (8), it may help to write (7) as
\[
\begin{array}{rcccccc}
\displaystyle\sum_{j=1}^{\infty}\sum_{k=j}^{\infty} P(X = k) \;=\;
 & P(X{=}1) & +\,P(X{=}2) & +\,P(X{=}3) & +\,P(X{=}4) & +\,P(X{=}5) & +\,\cdots \\
 & & +\,P(X{=}2) & +\,P(X{=}3) & +\,P(X{=}4) & +\,P(X{=}5) & +\,\cdots \\
 & & & +\,P(X{=}3) & +\,P(X{=}4) & +\,P(X{=}5) & +\,\cdots \\
 & & & & +\,P(X{=}4) & +\,P(X{=}5) & +\,\cdots \\
 & & & & & \ddots &
\end{array}
\]
(7) sums over this “matrix” by row and (8) sums over it by column.
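To make the row/column picture concrete (an added illustration, not part of the original solution), consider a random variable supported on $\{1,2,3\}$. Summing the rows of the array gives the tail probabilities, and regrouping by columns counts each $P(X=k)$ exactly $k$ times:
\[
\begin{array}{rcl}
\displaystyle\sum_{j=1}^{3} P(X \ge j)
&=& \underbrace{\bigl[P(X{=}1)+P(X{=}2)+P(X{=}3)\bigr]}_{P(X\ge 1)}
  + \underbrace{\bigl[P(X{=}2)+P(X{=}3)\bigr]}_{P(X\ge 2)}
  + \underbrace{\bigl[P(X{=}3)\bigr]}_{P(X\ge 3)} \\[6pt]
&=& 1\cdot P(X{=}1) + 2\cdot P(X{=}2) + 3\cdot P(X{=}3) \;=\; E[X].
\end{array}
\]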
Alternate approach
One could also sum over all entries of the above matrix with zeros plugged into the lower
triangle, i.e.
\[
\begin{aligned}
\sum_{j=1}^{\infty} P(X \ge j) &= \sum_{j=1}^{\infty} \sum_{k=1}^{\infty} P(X = k) \cdot \mathbf{1}\{k \ge j\} \\
&= \sum_{k=1}^{\infty} \sum_{j=1}^{\infty} P(X = k) \cdot \mathbf{1}\{k \ge j\} \\
&= \sum_{k=1}^{\infty} k \cdot P(X = k) \\
&= E[X].
\end{aligned}
\]
In both approaches the interchange of the order of summation is justified because every term is non-negative (Tonelli's theorem).
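As an illustrative numerical check (an addition, not part of the original solution), the following Python sketch verifies the tail-sum identity for a Poisson distribution; the Poisson choice, its mean, and the truncation point are assumptions made purely for demonstration.

    import numpy as np
    from scipy.stats import poisson

    lam = 3.5               # arbitrary Poisson mean for illustration
    j = np.arange(1, 200)   # truncate the infinite sum; the tail beyond 200 is negligible here

    tail_sum = np.sum(poisson.sf(j - 1, lam))  # sum_j P(X >= j) = sum_j P(X > j - 1)
    print(tail_sum, lam)                       # both are (numerically) E[X] = lambda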
\[
\begin{aligned}
&= (a - E[X])^2 \cdot 1 \\
&= \left(a - \int_{\mathbb{R}} x \cdot p(x)\,dx\right)^{2} \\
&= (a - a \cdot p(a))^2 \\
&= (a - a)^2 \\
&= 0.
\end{aligned}
\]
Conversely, since the variance is a sum of non-negative terms, it can equal zero only if every term vanishes, i.e.,
\[
(x - E[X])^2 \cdot p(x) = 0 \qquad (10)
\]
for all $x$. For (10) to hold, any time $p(x) > 0$ we must have $(x - E[X])^2 = 0$.
Now assume there are two distinct values $x_1$ and $x_2$ for which $p(x_1) > 0$ and $p(x_2) > 0$.
But (10) implies
\[
x_1 = x_2 = E[X],
\]
a contradiction. Therefore,
P (X = a) = 1,
where a = E[X].
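As a sanity check on the contradiction argument (an added illustration, not part of the original solution): if the mass were split across two distinct points, say $P(X = x_1) = p$ and $P(X = x_2) = 1 - p$ with $0 < p < 1$ and $x_1 \ne x_2$, the variance would be strictly positive,
\[
\operatorname{Var}(X) = E[X^2] - (E[X])^2
= p\,x_1^2 + (1-p)\,x_2^2 - \bigl(p\,x_1 + (1-p)\,x_2\bigr)^2
= p(1-p)(x_1 - x_2)^2 > 0,
\]
consistent with the conclusion that zero variance forces all of the mass onto a single point.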
(b)
(c)
\[
\begin{aligned}
E[Y \mid X = x] &= E[5X + \epsilon \mid X = x] \\
&= E[5X \mid X = x] + E[\epsilon \mid X = x] && \text{(by linearity of conditional expectation)} \\
&= 5x + E[\epsilon \mid X = x] \\
&= 5x + E[\epsilon] && \text{(by independence of $X$ and $\epsilon$)} \\
&= 5x && \text{(since $E[\epsilon] = 0$ by assumption).}
\end{aligned}
\]
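The following Python sketch checks this conditional expectation by simulation (an addition, not part of the original solution); the normal distribution for $\epsilon$ and the conditioning value $x$ are assumptions made only for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    n_samples = 1_000_000
    x = 0.7                                     # an arbitrary conditioning value
    eps = rng.normal(0.0, 1.0, n_samples)       # assume eps ~ N(0, 1) purely for illustration

    # Conditional on X = x, Y = 5*x + eps because eps is independent of X.
    y_given_x = 5 * x + eps
    print(y_given_x.mean(), 5 * x)              # sample mean is close to 5x = 3.5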
(d)
(e)
Although $\operatorname{Cov}(\epsilon, \epsilon^2) = 0$, this does not imply that $\epsilon$ and $\epsilon^2$ are independent! For example, since the event $\{\epsilon^2 \le 1\}$ is contained in $\{\epsilon \le 1\}$,
\[
\begin{aligned}
P(\epsilon \le 1,\ \epsilon^2 \le 1) &= P(\epsilon^2 \le 1) \\
&\ne P(\epsilon \le 1) \cdot P(\epsilon^2 \le 1),
\end{aligned}
\]
where the inequality holds whenever $P(\epsilon^2 \le 1) > 0$ and $P(\epsilon \le 1) < 1$.
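A small numerical illustration of this point (an addition, not part of the original solution); $\epsilon \sim N(0,1)$ is assumed only for concreteness.

    import numpy as np

    rng = np.random.default_rng(2)
    eps = rng.normal(0.0, 1.0, 1_000_000)  # assume eps ~ N(0, 1) for illustration

    cov = np.cov(eps, eps**2)[0, 1]                      # approximately 0, since E[eps^3] = 0 by symmetry
    joint = np.mean((eps <= 1) & (eps**2 <= 1))          # P(eps <= 1, eps^2 <= 1) = P(eps^2 <= 1)
    product = np.mean(eps <= 1) * np.mean(eps**2 <= 1)   # P(eps <= 1) * P(eps^2 <= 1)
    print(cov, joint, product)  # cov ~ 0, yet joint (~0.68) differs from product (~0.57)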
Appendix
(A1) Let X and Y be independent random variables. Then, by definition, for any (x, y)
P (X ≤ x, Y ≤ y) = P (X ≤ x) ⋅ P (Y ≤ y). (11)