
AI 5030: PROBABILITY AND STOCHASTIC PROCESSES
INSTRUCTOR: DR. KARTHIK P. N.
Indian Institute of Technology Hyderabad

HOMEWORK 8
TOPICS: EXPECTATIONS OF DISCRETE AND CONTINUOUS RANDOM VARIABLES, VARIANCE, COVARIANCE


Fix a probability space (Ω, F , P). All random variables appearing below are assumed to be defined with respect to F .
1. For any x ∈ R, let ⌊x⌋ denote the largest integer less than or equal to x. Thus, for instance, ⌊3.5⌋ = 3, ⌊−8.9⌋ = −9,
⌊2⌋ = 2, and so on.
Suppose that X ∼ Exponential(1). Determine the expected value of Y = ⌊X⌋.
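If you want to sanity-check the value you derive, a short Monte Carlo estimate works well. A minimal sketch, assuming NumPy is available (the sample size is arbitrary):

import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)   # X ~ Exponential(1), i.e. unit rate
y = np.floor(x)                                  # Y = floor(X)
print(y.mean())                                  # should be close to your analytical E[Y]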
2. Let X be a non‑negative and continuous random variable with PDF fX and CDF FX . Show that
E[X] = ∫₀^∞ P({X > x}) dx = ∫₀^∞ (1 − FX(x)) dx,

where the above integrals are usual Riemann integrals.


Hint: Write down the formula for expectation in terms of the PDF, and apply change of order of integration.
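Before proving the identity in general, it can also be checked numerically for any particular non-negative distribution. A sketch assuming SciPy; the Exponential below is only an illustrative choice:

import numpy as np
from scipy import integrate, stats

dist = stats.expon()                                        # any non-negative continuous distribution
tail, _ = integrate.quad(lambda t: dist.sf(t), 0, np.inf)   # ∫ P(X > t) dt = ∫ (1 − FX(t)) dt
print(tail, dist.mean())                                    # the two numbers should agree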
3. Suppose that X and Y are jointly discrete random variables. The random variable X takes values in {−1, 0, 1} with
uniform probabilities. Suppose that for each x ∈ {−1, 0, 1},
pY|X=x(y) = (1/2) · 1{|y−x|=1},    y ∈ R.
Compute E[Y ].
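To check your answer, note that the conditional PMF says that, given X = x, Y equals x − 1 or x + 1 with probability 1/2 each. A simulation sketch, assuming NumPy:

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.choice([-1, 0, 1], size=n)        # X uniform on {-1, 0, 1}
y = x + rng.choice([-1, 1], size=n)       # given X = x, Y is x − 1 or x + 1 w.p. 1/2 each
print(y.mean())                           # compare with the E[Y] you compute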
4. Let X1, X2, . . . be i.i.d. Exponential(λ) random variables, and let N ∼ Geometric(p) be independent of
{X1, X2, . . .}. Here, λ > 0 and p ∈ (0, 1) are fixed constants. Compute E[∑_{i=1}^{N} Xi].
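A Monte Carlo check of the closed form you obtain (a sketch assuming NumPy, and assuming the convention that Geometric(p) is supported on {1, 2, . . .}; the values of λ and p below are arbitrary):

import numpy as np

rng = np.random.default_rng(0)
lam, p, trials = 2.0, 0.3, 100_000
counts = rng.geometric(p, size=trials)                               # N ~ Geometric(p) on {1, 2, ...}
totals = [rng.exponential(1 / lam, size=k).sum() for k in counts]    # sum of N i.i.d. Exponential(lam) draws
print(np.mean(totals))                                               # compare with your expression for the expectation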

5. (a) Let X and Y be jointly continuous with the joint PDF


fX,Y(x, y) = { cx² + xy/3,   0 ≤ x ≤ 1, 0 ≤ y ≤ 2,
             { 0,            otherwise.
i. Find the constant c.
ii. Are X and Y independent?
iii. Calculate Cov(X, Y ).
(b) Let X and Y be independent random variables distributed uniformly on [0, 1].
Let U = min{X, Y } and V = max{X, Y }. Calculate Cov(U, V ).
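Both parts can be sanity-checked mechanically: the constant in (a)i. follows from forcing the joint PDF to integrate to 1 over the rectangle, and Cov(U, V) in (b) can be estimated by simulation. A sketch, assuming SymPy and NumPy:

import numpy as np
import sympy as sp

# (a) i. -- solve for c from the normalization constraint over 0 <= x <= 1, 0 <= y <= 2
x, y, c = sp.symbols('x y c', positive=True)
total = sp.integrate(c * x**2 + x * y / 3, (y, 0, 2), (x, 0, 1))
print(sp.solve(sp.Eq(total, 1), c))

# (b) -- Monte Carlo estimate of Cov(min{X, Y}, max{X, Y}) for X, Y i.i.d. Uniform[0, 1]
rng = np.random.default_rng(0)
xs, ys = rng.uniform(size=(2, 1_000_000))
u, v = np.minimum(xs, ys), np.maximum(xs, ys)
print(np.cov(u, v)[0, 1])                 # compare with your hand-computed Cov(U, V)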
6. Let X ∼ N (0, 1). Let W be a discrete random variable independent of X and having the PMF
P({W = w}) = { 1/2,   w = ±1,
             { 0,     otherwise.

Define a new random variable Y as Y = W X.


(a) Show that Y ∼ N (0, 1).
(b) Show that X and Y are uncorrelated, but not independent.
(c) A friend of yours comes to you and claims that Z = X +Y is Gaussian distributed. Is your friend’s claim correct?
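A short simulation (assuming NumPy) is a useful companion to parts (b) and (c): it estimates Cov(X, Y) and lets you inspect the empirical distribution of Z before committing to an answer:

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.standard_normal(n)                # X ~ N(0, 1)
w = rng.choice([-1, 1], size=n)           # W = ±1 with probability 1/2 each, independent of X
y = w * x                                 # Y = WX
z = x + y
print(np.cov(x, y)[0, 1])                 # sample covariance of X and Y
print(np.mean(z == 0.0))                  # fraction of samples with Z exactly equal to 0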
7. Fix n ∈ N, n ≥ 2.
Let X1 , X2 , . . . , Xn be independent and identically distributed with finite mean µ and variance σ 2 .
Define the sample mean Mn and sample variance Vn as the random variables
Mn := (1/n) ∑_{i=1}^{n} Xi,      Vn := (1/(n−1)) ∑_{i=1}^{n} (Xi − Mn)².



(a) Show that E[Mn] = µ.
(b) Show that E[Vn] = σ² (the factor (n − 1) in the denominator of Vn is precisely to ensure that the mean of Vn
is equal to σ²).
(c) Show that Var(Mn) = σ²/n.
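All three claims can be verified empirically for any convenient distribution with finite variance. A sketch assuming NumPy; the Exponential choice and the values of n and the number of trials are arbitrary:

import numpy as np

rng = np.random.default_rng(0)
n, trials = 10, 200_000
samples = rng.exponential(scale=2.0, size=(trials, n))    # each row is one realization of X1, ..., Xn
m = samples.mean(axis=1)                                   # sample means M_n
v = samples.var(axis=1, ddof=1)                            # sample variances V_n (the 1/(n-1) convention)
print(m.mean(), v.mean(), m.var())        # compare with µ, σ², and σ²/n for the chosen distribution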

8. Suppose that X, Y , and Z are three random variables defined with respect to F .
Let the means of Y and Z be µY and µZ respectively. Show that
E[max{X, µY} − max{X, µZ}] ≤ |µY − µZ| · P({X ∈ [min{µY, µZ}, max{µY, µZ}]}).

Hint: Consider the cases µY < µZ and µY ≥ µZ separately.


For each case, break down the sample space into events of the form {X < µY }, {µY ≤ X ≤ µZ }, {X > µZ }.
On each of these events, upper bound the mean value of max{X, µY } − max{X, µZ }.

