
Department of Statistical Sciences

STA2004F Tutorial 2 March 3, 2025

Transformations; Moments

1. (Lognormal distribution) Let X ∼ N(µ, σ²) and Z = e^X.

(a) Find the density of Z.


(b) Find the mean and variance of Z in terms of µ and σ².
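An optional numerical sanity check for part (b), assuming NumPy and SciPy are available (the values of µ and σ below are arbitrary examples; SciPy's lognorm uses shape s = σ and scale = e^µ):

import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(0)
mu, sigma = 0.5, 0.8                                 # arbitrary example parameters
z = np.exp(rng.normal(mu, sigma, size=1_000_000))    # Z = e^X with X ~ N(mu, sigma^2)

dist = lognorm(s=sigma, scale=np.exp(mu))            # lognormal in SciPy's parametrisation
print(z.mean(), dist.mean())                         # simulated vs. theoretical mean
print(z.var(), dist.var())                           # simulated vs. theoretical variance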

2. Let X ∼ Beta(r, s). Show that


E(X) = \frac{r}{r+s} \qquad \text{and} \qquad \mathrm{Var}(X) = \frac{rs}{(r+s)^2\,(r+s+1)}.
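An optional Monte Carlo cross-check of these two formulas (not a proof; the parameter values are arbitrary and NumPy is assumed):

import numpy as np

rng = np.random.default_rng(1)
r, s = 2.5, 4.0                                       # arbitrary example parameters
x = rng.beta(r, s, size=1_000_000)

print(x.mean(), r / (r + s))                          # empirical vs. claimed E(X)
print(x.var(), r * s / ((r + s)**2 * (r + s + 1)))    # empirical vs. claimed Var(X)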

3. Let X ∼ Gamma(α, λ).

(a) Find the density of Y = kX, where k > 0 is a constant.


(b) Find k such that Y has a χ² distribution.
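If your answer to (a) says that Y = kX is again Gamma-distributed with the same shape and a rescaled rate, you can check it empirically with a Kolmogorov–Smirnov test. The sketch below assumes the rate parametrisation (NumPy and SciPy work with shape and scale, where scale = 1/rate); all parameter values are arbitrary examples.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
alpha, lam, k = 3.0, 2.0, 5.0                              # arbitrary example values
x = rng.gamma(shape=alpha, scale=1 / lam, size=200_000)    # X ~ Gamma(alpha, rate lam)
y = k * x

# KS test of Y against Gamma(alpha, rate lam/k); a large p-value means "no evidence against"
print(stats.kstest(y, stats.gamma(a=alpha, scale=k / lam).cdf))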

4. The random variable X has the following density


f_X(x) = \frac{5}{x^6}\, I_{[1,\infty)}(x), \qquad x \in \mathbb{R}.
(a) Explain how you would simulate realisations from X when supplied with uniform random numbers.
(b) Show that Y = ln X has an exponential distribution.
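For (a), a generic inverse-transform sketch may help fix ideas: feed uniform random numbers through the inverse of a CDF. The inverse CDF used below is a stand-in (a unit-rate exponential), not the one for this particular density, which is left for you to derive; NumPy is assumed.

import numpy as np

def inverse_transform_sample(inv_cdf, n, rng):
    """Apply an inverse CDF to U(0, 1) draws."""
    u = rng.uniform(size=n)
    return inv_cdf(u)

rng = np.random.default_rng(3)
# Stand-in example: Exp(1), whose inverse CDF is -log(1 - u).
samples = inverse_transform_sample(lambda u: -np.log1p(-u), n=100_000, rng=rng)
print(samples.mean())   # close to 1, the mean of a unit-rate exponential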

5. Show that if X ∼ U(a, b), then Y = cX + d (c, d are constants) also has a uniform distribution,
and find its parameters.
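A quick empirical illustration (not a proof, and deliberately not identifying the parameters, which is the point of the exercise); the constants are arbitrary and NumPy is assumed:

import numpy as np

rng = np.random.default_rng(4)
a, b, c, d = 1.0, 3.0, 2.0, -5.0                 # arbitrary example constants
y = c * rng.uniform(a, b, size=500_000) + d

heights, edges = np.histogram(y, bins=20, density=True)
print(heights)                                    # roughly constant heights: Y looks uniform
print(y.min(), y.max())                           # empirical support of Y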

6. Show that if X is a non-negative discrete random variable taking values in N, then



E(X) = \sum_{n=0}^{\infty} P(X > n).
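An optional numerical illustration of this identity for one concrete N-valued distribution (a Poisson, truncating the infinite sum; SciPy assumed, parameter arbitrary):

import numpy as np
from scipy import stats

dist = stats.poisson(3.5)                 # arbitrary example distribution on N
n = np.arange(0, 200)                     # truncation; the neglected tail is negligible here
print(np.sum(dist.sf(n)))                 # sum of P(X > n), since sf(n) = P(X > n)
print(dist.mean())                        # E(X); the two numbers should agree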

7. In class we showed that

\int_{-\infty}^{\infty} e^{-x^2}\, dx = \sqrt{\pi}.

Use this to show that

\Gamma\!\left(\tfrac{1}{2}\right) = \sqrt{\pi}.
Also show that the normal distribution pdf does indeed integrate to 1.
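Both identities, and the normalisation of the normal pdf, are easy to confirm numerically before proving them (SciPy's quad and the standard library's math.gamma; the µ and σ used in the pdf check are arbitrary examples):

import math
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

val, _ = quad(lambda x: np.exp(-x**2), -np.inf, np.inf)
print(val, math.sqrt(math.pi))                      # integral of e^{-x^2} vs. sqrt(pi)

print(math.gamma(0.5), math.sqrt(math.pi))          # Gamma(1/2) vs. sqrt(pi)

total, _ = quad(norm(loc=1.0, scale=2.0).pdf, -np.inf, np.inf)
print(total)                                        # a normal pdf integrates to 1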

8. Let X ∼ N(0, 1) and Y := |X|. Calculate E(Y) in three ways:

(a) By finding the survival function of Y and using the Darth Vader rule (expectation as the integral of the survival function).


(b) By finding the density of Y, f_Y, and integrating y f_Y(y).
(c) By noting that Y = g(X), where g(x) = |x|, and integrating g(x) f_X(x) over R.

All methods should (of course!) give the same answer.
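To know what number the three methods should agree on, here is an optional numerical estimate, by Monte Carlo and by integrating |x| f_X(x) as in (c); the closed form is left to the exercise, and NumPy/SciPy are assumed.

import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

rng = np.random.default_rng(5)
mc = np.abs(rng.standard_normal(1_000_000)).mean()              # Monte Carlo estimate of E|X|
num, _ = quad(lambda x: abs(x) * norm.pdf(x), -np.inf, np.inf)  # method (c), done numerically
print(mc, num)                                                  # the two estimates agree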



9. In this question we show that


B(a, b) = \frac{\Gamma(a)\Gamma(b)}{\Gamma(a+b)}.

(a) First show that

\Gamma(a)\Gamma(b) = \int_0^\infty \int_0^\infty t^{a-1} s^{b-1} e^{-(s+t)}\, ds\, dt.

(b) Using the following change of variables,

t = xy, s = x(1 − y),

show that
Γ(a)Γ(b) = Γ(a + b)B(a, b).
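An optional numerical spot-check of the target identity, for one arbitrary pair (a, b), using scipy.special and the standard library:

import math
from scipy import special

a, b = 2.3, 4.7                                            # arbitrary example values
print(special.beta(a, b))                                  # B(a, b) computed directly
print(math.gamma(a) * math.gamma(b) / math.gamma(a + b))   # Gamma(a)Gamma(b)/Gamma(a + b)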

10. (Continuous mixture) Let f(x|λ) be the exponential density with parameter λ > 0. Further
assume that λ is also random and has a density function g(λ). Show that the function
h(x) := \int_0^\infty f(x \mid \lambda)\, g(\lambda)\, d\lambda

is also a density. Generalize this result to any density f.
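An optional numerical illustration with one concrete (and arbitrary) choice of mixing density, a Gamma(2, rate 3) for λ: the mixture h should still integrate to 1. SciPy's quad is used for both integrals.

import numpy as np
from scipy.integrate import quad
from scipy.stats import gamma

g = gamma(a=2.0, scale=1 / 3.0)         # example mixing density for lambda

def h(x):
    # h(x) = integral over lambda of f(x | lambda) g(lambda), with f(x | lambda) = lambda e^{-lambda x}
    val, _ = quad(lambda lam: lam * np.exp(-lam * x) * g.pdf(lam), 0, np.inf)
    return val

total, _ = quad(h, 0, np.inf)
print(total)                             # should be close to 1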

11. Let X ∈ L² and define the function h : R → R by

h(a) := E((X − a)²),    a ∈ R.

Show that this function has a global minimum at a = E(X) and the minimum value of h is
Var(X).
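An optional empirical illustration, with a sample standing in for the distribution (arbitrary data; NumPy assumed): evaluating the sample analogue of h on a grid, the minimiser lands at the sample mean and the minimum value at the sample variance.

import numpy as np

rng = np.random.default_rng(6)
x = rng.exponential(scale=2.0, size=100_000)          # arbitrary example data

grid = np.linspace(0.0, 5.0, 1001)
h = np.array([np.mean((x - a)**2) for a in grid])     # sample analogue of E((X - a)^2)

print(grid[h.argmin()], x.mean())                     # minimiser vs. sample mean
print(h.min(), x.var())                               # minimum value vs. sample variance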
12. Let X ∼ N(µ_X, σ_X²) and Y ∼ N(0, 1).

(a) Show that


Z := \frac{X - \mu_X}{\sigma_X} \sim N(0, 1).
(b) Show that

E(Y^r) = \begin{cases} \dfrac{r!}{2^{r/2}\,(r/2)!} & \text{for } r = 0, 2, 4, 6, 8, \dots \\ 0 & \text{for } r = 1, 3, 5, 7, 9, \dots \end{cases}

(c) Hence show that


E((X - \mu_X)^r) = \begin{cases} \dfrac{\sigma_X^r\, r!}{2^{r/2}\,(r/2)!} & \text{for } r = 0, 2, 4, 6, 8, \dots \\ 0 & \text{for } r = 1, 3, 5, 7, 9, \dots \end{cases}

(d) Hence find the skewness and kurtosis of X and show that both these quantities are
independent of µ_X and σ_X.
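The even-moment formula in (b) can be confirmed numerically before attempting the proof; the sketch below compares SciPy's norm.moment with r!/(2^{r/2} (r/2)!) for r = 0, …, 9 (SciPy assumed).

import math
from scipy.stats import norm

for r in range(10):
    claimed = math.factorial(r) // (2**(r // 2) * math.factorial(r // 2)) if r % 2 == 0 else 0
    print(r, norm.moment(r), claimed)     # E(Y^r) from SciPy vs. the formula in (b)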
