
Assignment 3

(MA6.102) Probability and Random Processes, Monsoon 2024


Release date: 7 September 2024, Due date: 14 September 2024

INSTRUCTIONS
• Discussions with other students are not discouraged. However, all write-ups must be done individually
and contain your own solutions.
• Any plagiarism, when caught, will be heavily penalised.
• Be clear and precise in your writing.
• The coding portion can be done in Python or MATLAB. There will be a MOSS check; code copying will
result in a straight 0.
• Submit a zipped folder (rollnumber.zip) containing your handwritten solutions (PDF), your code, and a
PDF of your plots.

Problem 1. Let X1, X2, . . . , Xn be independent random variables and let X = X1 + X2 + · · · + Xn.
Suppose that each Xi is a Bernoulli random variable with parameter pi, and that p1, p2, . . . , pn are chosen
so that the mean of X is a given µ > 0. Show that the variance of X is maximized if the pi values are
chosen to be all equal to µ/n.
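
A quick numerical sanity check of this claim is sketched below in Python (illustrative only; the values of n and µ and the random search are arbitrary assumptions, not part of the required submission):

import numpy as np

rng = np.random.default_rng(0)
n, mu = 5, 2.0                          # illustrative values with 0 < mu/n < 1

def var_X(p):
    # For independent Bernoulli(p_i), var(X) = sum_i p_i * (1 - p_i).
    return float(np.sum(p * (1.0 - p)))

equal = var_X(np.full(n, mu / n))       # candidate maximizer: all p_i = mu/n
for _ in range(100000):
    p = rng.dirichlet(np.ones(n)) * mu  # random nonnegative p with sum(p) = mu
    if np.all(p <= 1.0):                # keep only valid Bernoulli parameters
        assert var_X(p) <= equal + 1e-12
print("variance at equal allocation:", equal)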

Problem 2. Prove the following.
If X is a positive-integer-valued random variable satisfying
P(X > m + n | X > m) = P(X > n)
for any two positive integers m and n, then X is a geometric random variable.
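
For intuition, the converse is easy to check numerically: a geometric random variable does satisfy this memoryless property. A minimal Python sketch (the parameter value is an arbitrary assumption):

# For X ~ Geometric(p) on {1, 2, ...}, P(X > k) = (1 - p)**k.
p = 0.3
tail = lambda k: (1.0 - p) ** k

for m in range(1, 6):
    for n in range(1, 6):
        lhs = tail(m + n) / tail(m)     # P(X > m + n | X > m)
        rhs = tail(n)                   # P(X > n)
        assert abs(lhs - rhs) < 1e-12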

Problem 3. (a) Give examples of two discrete random variables that are uncorrelated but not independent.
(b) Give examples of two discrete random variables for which uncorrelatedness guarantees independence.

Problem 4. For any two random variables X and Y, the Cauchy-Schwarz inequality states that
|E[XY]| ≤ √(E[X²] E[Y²]),
with equality if and only if X = αY for some constant α ∈ R. Prove this, and use it to show that
|ρ(X, Y)| ≤ 1, where ρ(X, Y) is the correlation coefficient of X and Y, given by
ρ(X, Y) = (E[XY] − E[X]E[Y]) / √(var(X) var(Y)).
[Hint: Observe that E[(X − αY)²] ≥ 0 for all α ∈ R.]
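
As a numerical illustration of the consequence |ρ(X, Y)| ≤ 1 (an optional sketch, not a substitute for the proof; the data-generating choices are arbitrary assumptions):

import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.integers(0, 10, size=500).astype(float)   # arbitrary discrete samples
    y = 2.0 * x + rng.normal(size=500)                # correlated with x
    rho = (np.mean(x * y) - x.mean() * y.mean()) / np.sqrt(x.var() * y.var())
    assert abs(rho) <= 1.0 + 1e-12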

Problem 5. Let ϕ(Y) = E[X|Y]. For any function g : R → R, show that
E[ϕ(Y) g(Y)] = E[X g(Y)].
Argue that the law of iterated expectations, i.e., E[E[X|Y]] = E[X], is a special case of this.
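
The identity can also be verified numerically on a small discrete example (an illustrative sketch; the joint distribution and the choice of g below are arbitrary assumptions):

import numpy as np

rng = np.random.default_rng(0)
N = 200000
y = rng.integers(0, 3, size=N)                    # Y takes values 0, 1, 2
x = y + rng.integers(0, 2, size=N)                # X depends on Y
g = lambda t: t ** 2                              # any function g

phi = np.array([x[y == v].mean() for v in range(3)])  # empirical E[X | Y = v]
lhs = np.mean(phi[y] * g(y))                      # E[phi(Y) g(Y)]
rhs = np.mean(x * g(y))                           # E[X g(Y)]
print(lhs, rhs)                                   # agree up to floating-point error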
Problem 6. (a) For any discrete random variable X and any event A such that P(A) > 0, show that
E[X|A] = E[1A X] / P(A),
where 1A is the indicator random variable of event A.
(b) Let X denote the sum of the outcomes obtained by rolling a die twice, and let Ai be the event that
the first roll shows i, for i ∈ [1 : 6]. Compute E[X|Ai] for i ∈ [1 : 6].
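
A brute-force enumeration over the 36 equally likely outcomes can be used to check your answers to part (b) (an optional sketch):

from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # all (first roll, second roll) pairs
for i in range(1, 7):
    cond = [r1 + r2 for (r1, r2) in outcomes if r1 == i]  # values of X on A_i
    print(i, sum(cond) / len(cond))               # E[X | A_i] by direct enumeration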

Problem 7. You toss a fair coin 100 times. After each toss, either there have been more heads, more
tails, or the same number of heads and tails so far. Let X be the number of times in the 100 tosses that
there were more heads than tails. Estimate the PMF of X via simulation, plot it, and show that the most
likely value of X is zero.
Once you have shown this, suppose again that you toss a fair coin 100 times.
(a) Let Y be the number of times in the 100 tosses that you have exactly the same number of heads
as tails. Estimate the expected value of Y.
(b) Let Z be the number of tosses for which you have more heads than tails. Estimate the expected
value of Z.
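
One possible simulation sketch in Python (the trial count and plotting details are illustrative assumptions, not a required structure):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
trials, tosses = 100000, 100

# Encode heads as +1 and tails as -1; a positive running sum means
# more heads than tails so far.
steps = rng.choice([1, -1], size=(trials, tosses))
lead = np.cumsum(steps, axis=1)

X = np.sum(lead > 0, axis=1)        # tosses after which heads lead
Y = np.sum(lead == 0, axis=1)       # tosses after which heads and tails are tied

values, counts = np.unique(X, return_counts=True)
plt.bar(values, counts / trials)    # estimated PMF of X
plt.xlabel("x")
plt.ylabel("estimated P(X = x)")
plt.show()

print("most likely value of X:", values[np.argmax(counts)])
print("estimated E[Y]:", Y.mean())
print("estimated E[Z]:", X.mean())  # Z as defined in part (b)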
