1 What Is A Random Variable (R.V.) ?
pX (x) = P ({X = x}) = P (all possible outcomes that result in the event {X = x}) (1)
• Everything that we learnt in Chap 1 for events applies. Let Ω be the sample space (the space of
all possible values of X in an experiment). Applying the axioms,
– pX(x) ≥ 0 for all x
– P({X ∈ S}) = ∑_{x∈S} pX(x) (follows from Additivity, since the events {X = x} for different x are disjoint)
– ∑_{x∈Ω} pX(x) = 1 (follows from Additivity and Normalization).
– Example: X = number of heads in 2 fair coin tosses (p = 1/2). P(X > 0) = ∑_{x=1}^{2} pX(x) = 0.75.
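The example above can be checked by brute-force enumeration. A minimal sketch (names like `pmf` and `outcomes` are illustrative, not from the text):

```python
from itertools import product

# Enumerate the 4 equally likely outcomes of 2 fair coin tosses and
# build the PMF of X = number of heads.
outcomes = list(product("HT", repeat=2))
pmf = {}
for outcome in outcomes:
    x = outcome.count("H")
    pmf[x] = pmf.get(x, 0) + 1 / len(outcomes)

# pX(0) = 0.25, pX(1) = 0.5, pX(2) = 0.25
prob_at_least_one_head = sum(p for x, p in pmf.items() if x > 0)
print(prob_at_least_one_head)  # 0.75
```

Summing pX(x) over the disjoint events {X = 1} and {X = 2} is exactly the Additivity step used in the text.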
• Can also define a binary r.v. for any event A as: X = 1 if A occurs and X = 0 otherwise.
Then X is a Bernoulli r.v. with p = P (A).
• Geometric r.v., X, with probability of heads p (X= number of coin tosses needed for a head
to come up for the first time or number of independent trials needed to achieve the first
“success”).
– Example: I keep taking a test until I pass it. Probability of passing the test in the xth
try is pX (x).
– Easy to see that pX(x) = (1 − p)^{x−1} p, x = 1, 2, . . . (the first x − 1 independent trials fail, then the xth succeeds).
• Poisson r.v. X with expected number of arrivals Λ (e.g. if X = number of arrivals in time τ
with arrival rate λ, then Λ = λτ)

Poisson(Λ): pX(x) = e^{−Λ} Λ^x / x!,  x = 0, 1, 2, . . .    (5)
• Uniform(a, b):

pX(x) = 1/(b − a + 1) if x = a, a + 1, . . . , b, and pX(x) = 0 otherwise.    (6)
• pmf of Y = g(X)

– pY(y) = P({Y = y}) = ∑_{x: g(x)=y} pX(x)

– Example: Y = |X|. Then pY(y) = pX(y) + pX(−y) if y > 0, and pY(0) = pX(0).
Exercise: X ∼ Uniform(−4, 4) and Y = |X|; find pY(y).
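One way to check the exercise numerically: build the Uniform(−4, 4) PMF and sum pX(x) over all x with |x| = y, exactly as in the formula for pY(y) above. A sketch (exact arithmetic via `fractions` is a convenience choice, not from the text):

```python
from fractions import Fraction

# X ~ discrete Uniform(-4, 4): pX(x) = 1/9 for x = -4, -3, ..., 4.
pX = {x: Fraction(1, 9) for x in range(-4, 5)}

# pY(y) = sum of pX(x) over all x with g(x) = |x| = y.
pY = {}
for x, px in pX.items():
    y = abs(x)
    pY[y] = pY.get(y, Fraction(0)) + px

# Result: pY(0) = 1/9 and pY(y) = 2/9 for y = 1, 2, 3, 4.
print(sorted(pY.items()))
```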
• Application: Decision making using expected values. Example 2.8 (Quiz game, compute
expected reward with two different strategies to decide which is a better strategy).
• Binomial(n, p) becomes Poisson(np) if the time interval between two coin tosses becomes very
small (so that n becomes very large and p becomes very small, but Λ = np stays finite).
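The Binomial-to-Poisson limit can be seen numerically: hold Λ = np fixed and let n grow. A sketch (the choice Λ = 3 and the grid of n values are illustrative):

```python
from math import comb, exp, factorial

def binomial_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def poisson_pmf(x, lam):
    return exp(-lam) * lam ** x / factorial(x)

lam = 3.0
errs = {}
for n in (10, 100, 10_000):
    p = lam / n  # keep Lambda = n * p fixed as n grows
    errs[n] = max(abs(binomial_pmf(x, n, p) - poisson_pmf(x, lam))
                  for x in range(21))
    print(n, errs[n])  # the maximum pointwise gap shrinks as n grows
```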
3 Multiple Discrete Random Variables: Topics
• Joint PMF, marginal PMF of 2 or more r.v.’s
• Bayes rule
• Independence
pX,Y(x, y) ≜ P(X = x, Y = y)
– Let A be the set of all values of (x, y) that satisfy a certain property; then
P((X, Y) ∈ A) = ∑_{(x,y)∈A} pX,Y(x, y)
– e.g. X = outcome of the first die toss, Y = outcome of the second die toss, A = {the sum
of the two tosses is even}.
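The dice example above can be computed by summing the joint PMF over the set A, as in the formula P((X, Y) ∈ A) = ∑_{(x,y)∈A} pX,Y(x, y). A minimal sketch:

```python
from fractions import Fraction
from itertools import product

# Joint PMF of two independent fair dice: pX,Y(x, y) = 1/36 for each pair.
pXY = {(x, y): Fraction(1, 36) for x, y in product(range(1, 7), repeat=2)}

# A = {(x, y) : x + y is even}; sum the joint PMF over A.
prob_A = sum(p for (x, y), p in pXY.items() if (x + y) % 2 == 0)
print(prob_A)  # 1/2
```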
• Marginal PMF is another term for the PMF of a single r.v. obtained by “marginalizing”
the joint PMF over the other r.v., i.e. the marginal PMF of X, pX (x) can be computed as
follows:
Apply Total Probability Theorem to pX,Y (x, y), i.e. sum over {Y = y} for different values y
(these are a set of disjoint events whose union is the sample space):
pX(x) = ∑_y pX,Y(x, y)
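Marginalization is just a sum over the other variable. A sketch on a small hypothetical joint PMF (the table values are made up for illustration; they sum to 1):

```python
from fractions import Fraction

# A hypothetical joint PMF on a 2x2 grid.
pXY = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(1, 8),
    (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 2),
}

# Marginal PMF of X: for each x, sum the joint PMF over y.
pX = {}
for (x, y), p in pXY.items():
    pX[x] = pX.get(x, Fraction(0)) + p

print(pX[0], pX[1])  # 1/4 3/4
```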
• pmf of Z = g(X, Y): pZ(z) = ∑_{(x,y): g(x,y)=z} pX,Y(x, y)

– Read the above as pZ(z) = P(Z = z) = P(all values of (X, Y) for which g(X, Y) = z)
• Expected value of functions of multiple r.v.’s. If Z = g(X, Y),

E[Z] = ∑_{(x,y)} g(x, y) pX,Y(x, y)
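The expectation formula is a single weighted sum over the joint PMF. A sketch, reusing a small hypothetical joint PMF and an illustrative g(x, y) = x + 2y (both made up for the example):

```python
from fractions import Fraction

# Hypothetical joint PMF (values sum to 1); Z = g(X, Y) = X + 2*Y.
pXY = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(1, 8),
    (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 2),
}

# E[Z] = sum over (x, y) of g(x, y) * pX,Y(x, y).
EZ = sum((x + 2 * y) * p for (x, y), p in pXY.items())
print(EZ)  # 2
```

Note that no intermediate PMF of Z is needed; the sum runs directly over (x, y) pairs.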
• Conditional PMF of X given an event A with P(A) > 0:

pX|A(x) ≜ P({X = x}|A) = P({X = x} ∩ A) / P(A)
– pX|A(x) is a legitimate PMF, i.e. ∑_x pX|A(x) = 1. Exercise: show this.
– Example 2.12, 2.13
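A quick numerical instance of the definition and of the normalization exercise (the die-and-even-event setup is an illustrative choice, not from the text):

```python
from fractions import Fraction

# X = outcome of a fair die; conditioning event A = {X is even}.
pX = {x: Fraction(1, 6) for x in range(1, 7)}
A = {2, 4, 6}
PA = sum(pX[x] for x in A)  # P(A) = 1/2

# pX|A(x) = P({X = x} ∩ A) / P(A); zero for x outside A.
pX_given_A = {x: (pX[x] / PA if x in A else Fraction(0)) for x in pX}

print(pX_given_A[2])             # 1/3
print(sum(pX_given_A.values()))  # 1, so pX|A is a legitimate PMF
```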
• Conditional PMF of X given Y: pX|Y(x|y) ≜ P(X = x|Y = y) = pX,Y(x, y)/pY(y).
The above holds for all y for which pY(y) > 0. The above is equivalent to pX,Y(x, y) = pX|Y(x|y) pY(y).
• Bayes rule. How to compute pX|Y(x|y) using pX(x) and pY|X(y|x):

pX|Y(x|y) = pX,Y(x, y) / pY(y)
          = pY|X(y|x) pX(x) / ∑_{x′} pY|X(y|x′) pX(x′)
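Bayes rule can be sketched with a hypothetical noisy-channel setup (the prior 0.6/0.4 and flip probability 0.1 are made-up illustration values):

```python
# Hypothetical setup: X in {0, 1} is a transmitted bit, Y the received bit;
# the channel flips the bit with probability 0.1.
pX = {0: 0.6, 1: 0.4}
pY_given_X = {(y, x): (0.9 if y == x else 0.1) for x in (0, 1) for y in (0, 1)}

def posterior(x, y):
    # Bayes rule: pX|Y(x|y) = pY|X(y|x) pX(x) / sum_x' pY|X(y|x') pX(x').
    num = pY_given_X[(y, x)] * pX[x]
    den = sum(pY_given_X[(y, xp)] * pX[xp] for xp in pX)
    return num / den

print(posterior(1, 1))  # P(X = 1 | Y = 1) = 0.36 / 0.42, about 0.857
```

The denominator is exactly the marginal pY(y) obtained by Total Probability, so the posterior values for a fixed y sum to 1.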
6 Independence
• Independence of a r.v. and an event A. R.v. X is independent of A, with P(A) > 0, iff
pX|A(x) = pX(x) for all x, equivalently P({X = x} ∩ A) = pX(x) P(A) for all x.
• Independence of 2 r.v.’s. R.v.’s X and Y are independent iff
pX,Y(x, y) = pX(x) pY(y), for all x and y, equivalently
pX|Y(x|y) = pX(x), for all x and for all y for which pY(y) > 0
pY|X(y|x) = pY(y), for all y and for all x for which pX(x) > 0
• If X1 , X2 , . . . Xn are independent,
pX1 ,X2 ,...Xn (x1 , x2 , . . . xn ) = pX1 (x1 )pX2 (x2 ) . . . pXn (xn )
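The factorization test for independence can be checked directly on small joint PMF tables. A sketch with two illustrative cases, independent fair coins and the perfectly correlated pair Y = X (both setups are made up for the example):

```python
from fractions import Fraction
from itertools import product

pX = {0: Fraction(1, 2), 1: Fraction(1, 2)}
pY = {0: Fraction(1, 2), 1: Fraction(1, 2)}

def factorizes(pXY):
    """True iff pX,Y(x, y) == pX(x) * pY(y) for every (x, y)."""
    return all(pXY[(x, y)] == pX[x] * pY[y] for (x, y) in pXY)

# Case 1: two independent fair coin flips -- the joint PMF factorizes.
pXY_indep = {(x, y): pX[x] * pY[y] for x, y in product(pX, pY)}
print(factorizes(pXY_indep))  # True

# Case 2: Y = X (perfectly correlated coins) -- it does not.
pXY_dep = {(x, y): (pX[x] if x == y else Fraction(0))
           for x, y in product(pX, pY)}
print(factorizes(pXY_dep))  # False
```

For n independent r.v.’s the same idea applies with the product pX1(x1) · · · pXn(xn) over n-tuples.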