
Stat 581 Midterm Exam Solutions (Autumn 2011)


1. (5pts) Let δ be the UMVUE of g(P ) for a model P1 , where P0 ⊂ P1 ⊂ P2 . Explain


whether or not δ is necessarily unbiased or the UMVUE for P0 and/or P2 .
Solution: If P ∈ P0 then P ∈ P1 (since P0 ⊂ P1 ), so unbiasedness over P1 implies
EP [δ] = g(P ) for all P ∈ P0 ; hence δ is unbiased for P0 . Alternatively, let U1 be the
set of estimators that are unbiased for g(P ) for all P ∈ P1 , and let U0 be the set of
estimators unbiased under P0 . Then U1 ⊂ U0 , so any estimator that is unbiased for P1
(i.e. in U1 ) is unbiased for P0 (in U0 ). However, δ need not be the UMVUE for P0 :
the UMVUE for P0 may lie in U0 but not in U1 .
An estimator that is unbiased for P1 is not necessarily unbiased for P2 , since P2
contains distributions outside P1 on which EP [δ] = g(P ) need not hold; it follows
that δ is not necessarily unbiased, and hence not necessarily the UMVUE, for P2 .

2. (10pts) Let δ1 and δ2 have finite variances for each θ ∈ Θ and suppose they are the
UMVUEs for g1 (θ) and g2 (θ), respectively. Show that a1 δ1 + a2 δ2 is the UMVUE for
a1 g1 (θ) + a2 g2 (θ), where a1 and a2 are known.
Solution: Recall that an estimator with finite variance is UMVUE for its expectation
iff it is uncorrelated with every unbiased estimator of zero (LC Theorem 2.1.7). The
estimator δ = a1 δ1 + a2 δ2 has finite variance, and for an unbiased estimator U of zero
we have

Cov[δ, U ] = E[δU ]   (since E[U ] = 0)
= a1 E[δ1 U ] + a2 E[δ2 U ]
= 0

since both δ1 and δ2 are UMVUE for their expectations and so uncorrelated with U .
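As an illustrative numerical check (not part of the proof), consider the concrete model X1 , X2 i.i.d. N(θ, 1), where it is standard that X̄ is the UMVUE of θ and X̄² − 1/2 is the UMVUE of θ²; U = X1 − X2 is an unbiased estimator of zero. The covariance computed above should then vanish for any a1 , a2 :

```python
import numpy as np

# Illustrative check in a concrete model (assumed, not from the exam):
# X1, X2 i.i.d. N(theta, 1).  Then Xbar is UMVUE for theta, Xbar^2 - 1/2 is
# UMVUE for theta^2, and U = X1 - X2 is an unbiased estimator of zero.
rng = np.random.default_rng(0)
theta, m = 1.5, 200_000
x = rng.normal(theta, 1.0, size=(m, 2))
xbar = x.mean(axis=1)
delta1, delta2 = xbar, xbar**2 - 0.5
u = x[:, 0] - x[:, 1]

a1, a2 = 3.0, -2.0
delta = a1 * delta1 + a2 * delta2
print(np.cov(delta, u)[0, 1])  # close to 0, as the theorem predicts
```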
Several people tried to solve this problem by applying the variance inequality to δ and
to a competing estimator δ′ = a1 δ1′ + a2 δ2′ . With this approach you can compare the
resulting bounds, using only the fact that δ1 and δ2 are UMVUEs. However, it fails for
two reasons: an ordering on the bounds does not imply that the variances themselves
are ordered, and we must consider competing estimators δ′ that are not linear
combinations of estimators of g1 (θ) and g2 (θ).
A few people argued that since δ1 and δ2 are UMVUE, they must be functions of a
complete sufficient statistic. Then δ will be a function of the c.s.s., and is unbiased for
a1 g1 (θ) + a2 g2 (θ). This reasoning is flawed because there exist UMVUEs in situations

where there is not a c.s.s. (see L. Bondesson (1983), “On Uniformly Minimum Variance
Unbiased Estimation when no Complete Sufficient Statistics Exist”). If there is a c.s.s.,
then a UMVUE will be a function of it, but a UMVUE can exist without a c.s.s.

3. (15pts) Let X1 , . . . , Xn be i.i.d. with E[|Xi |] < ∞. Show formally that the sample
mean X̄ is a version of E[X1 |X(1) , . . . , X(n) ].
Solution: For this problem you need to show that ∫A X̄ dP = ∫A X1 dP for every
A ∈ σ(X(1) , . . . , X(n) ). Do this by starting with X̄:
∫A X̄ dP = (1/n) Σi ∫A Xi dP        (sum over i = 1, . . . , n)
         = (1/n) Σi E[Xi 1A ]
         = (1/n) Σi E[X1 1A ]       because the Xi s are i.i.d.
         = E[X1 1A ] = ∫A X1 dP.

You need to be careful about the second-to-last line: it holds only because
A ∈ σ(X(1) , . . . , X(n) ), so that 1A is a symmetric function of X1 , . . . , Xn and the pair
(Xi , 1A ) has the same joint distribution for every i. For sets A outside this σ-algebra,
E[X1 1A ] need not equal E[Xi 1A ] even though the Xi ’s are i.i.d.; for example, the
equality fails for A = {X2 < 3}.
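The symmetry argument can be illustrated with a small sketch: conditionally on the order statistics, X1 is (by exchangeability) equally likely to be any of them, so the conditional mean is the average of the order statistics, which is exactly X̄:

```python
import numpy as np

# Illustrative sketch of the symmetry argument: given the order statistics,
# exchangeability makes X1 equally likely to be any one of them, so the
# conditional mean is the average of the order statistics -- i.e. Xbar.
rng = np.random.default_rng(1)
x = rng.exponential(size=7)
order_stats = np.sort(x)
cond_mean = order_stats.mean()          # E[X1 | order statistics], by symmetry
assert np.isclose(cond_mean, x.mean())  # equals the sample mean
print(cond_mean, x.mean())
```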

4. (20pts) Let X1 , X2 be i.i.d. f (x|θ) where


f (x|θ) = 3x²/θ³ if 0 < x < θ, and f (x|θ) = 0 otherwise.

(a) For θ ∈ Θ = R+ , find a complete sufficient statistic for θ based on X1 and X2 .


Solution: The factorization criterion shows that T = X(2) = max{X1 , X2 } is
sufficient, and the Lehmann-Scheffé theorem shows it is minimal. To show that
it is complete, suppose Eθ [g(T )] = 0 for all θ > 0, i.e.

∫0^θ g(t)p(t|θ) dt = 0   ∀ θ > 0,

where p(t|θ) = 6t⁵/θ⁶ is the density of T . Letting g+ and g− be the positive and
negative parts of g = g+ − g− , we have
∫0^θ g+ (t)t⁵ dt = ∫0^θ g− (t)t⁵ dt   ∀ θ > 0.

It was sufficient for the exam to make reference to the result in Lehmann (or
my supplementary notes) that this implies g+ = g− a.e., so that g = 0 a.e.
Alternatively, you can take derivatives:
(d/dθ) ∫0^θ g(t)t⁵ dt = 0
g(θ)θ⁵ = 0
g(θ) = 0 for almost all θ > 0 ⇒ g = 0 a.e.

Thus Eθ [g(T )] = 0 for all θ implies g = 0 a.e., so T is complete.
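The density used above can be sanity-checked by simulation (an illustrative sketch with θ = 1): since F (x|θ) = x³/θ³, the maximum T has CDF FT (t) = (t³/θ³)² = t⁶/θ⁶, giving p(t|θ) = 6t⁵/θ⁶.

```python
import numpy as np

# Monte Carlo check of the density of T (illustrative, theta = 1):
# F(x) = x^3, so F_T(t) = t^6 and p(t|theta) = 6 t^5 / theta^6.
rng = np.random.default_rng(3)
m = 400_000
x = rng.random(size=(m, 2)) ** (1 / 3)  # inverse-CDF sampling: X = U^(1/3)
t = x.max(axis=1)
for q in (0.3, 0.6, 0.9):
    print(q, np.mean(t <= q), q**6)  # empirical CDF should track t^6
```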


(b) Find the UMVUE of θ.
Solution: The expectation of T is 6θ/7, so the UMVUE is 7T /6.
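A quick simulation (illustrative only) confirms E[T ] = 6θ/7 and hence the unbiasedness of 7T /6:

```python
import numpy as np

# Illustrative Monte Carlo check (theta = 2): sample X = theta * U^(1/3)
# via the inverse CDF, take T = max(X1, X2), and compare E[T] with 6*theta/7.
rng = np.random.default_rng(2)
theta, m = 2.0, 500_000
x = theta * rng.random(size=(m, 2)) ** (1 / 3)
t = x.max(axis=1)
print(t.mean(), 6 * theta / 7)  # the two should be close
print((7 * t / 6).mean())       # close to theta = 2.0, so 7T/6 is unbiased
```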
(c) Determine if the UMVUE is admissible for estimating θ with squared error loss
(Hint: Consider the class of scale multiples of the UMVUE).
Solution: For estimators of the form cT , the risk (variance plus squared bias) is
E[(cT − θ)²] = c²E[T ²] − 2cθE[T ] + θ². Since E[T ] = 6θ/7 and E[T ²] = 3θ²/4,
this is minimized at c = θE[T ]/E[T ²] = 8/7, so (8/7)T has strictly smaller risk
than 7T /6 for every θ, and the UMVUE is not admissible even in this specific
class.
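The arithmetic can be checked directly from the moments E[T ] = 6θ/7 and E[T ²] = 3θ²/4, which follow from the density p(t|θ) = 6t⁵/θ⁶ (an illustrative sketch; θ = 1 without loss of generality since risk scales with θ²):

```python
# Risk of cT under squared error, using E[T] = 6*theta/7 and
# E[T^2] = 3*theta^2/4; illustrative check with theta = 1.
def risk(c, theta=1.0):
    # R(c) = E[(cT - theta)^2] = c^2 E[T^2] - 2 c theta E[T] + theta^2
    return c**2 * (3 * theta**2 / 4) - 2 * c * theta * (6 * theta / 7) + theta**2

print(risk(7 / 6))  # 1/48 ~ 0.02083, the risk of the UMVUE
print(risk(8 / 7))  # 1/49 ~ 0.02041, strictly smaller
```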
