
STAT 422: Applied Probability Lecture Notes

Fastel Chipepa

Botswana International University of Science and Technology

February 7, 2024



Outline

1 CHAPTER 1
1.1: Sample Space and Events
1.2: Probabilities Defined on Events
1.3: Conditional Probability
1.4: Independent Events
1.5: Bayes’ Formula

2 CHAPTER 2
2.1: Random Variables
2.2: Discrete Random Variables
2.3: Continuous Random Variables
2.4: Expectation of a Random Variable
2.5: Jointly Distributed Random Variables



CHAPTER 1 1.1: Sample Space and Events

Sample Space and Events

Sample Space (S) - the set of all possible outcomes of an experiment.

e.g. If you flip a coin, S = {H, T}.
If you roll a die, S = {1, 2, 3, 4, 5, 6}.

The set $(a, b)$ is defined to consist of all points $x$ such that $a < x < b$; $[a, b]$ consists of all points $x$ such that $a \le x \le b$; $(a, b]$ consists of all points $x$ such that $a < x \le b$; and $[a, b)$ consists of all points $x$ such that $a \le x < b$.

Event - any subset of the sample space.

For any two events E and F of a sample space S, we define the new event $E \cup F$ to consist of all outcomes that are either in E or in F or in both E and F.

$E \cap F$ consists of all outcomes that are in both E and F. If $E \cap F = \emptyset$, then E and F are mutually exclusive.


$\bigcup_{n=1}^{\infty} E_n$ is defined to be the event that consists of all outcomes that are in $E_n$ for at least one value of $n = 1, 2, 3, \ldots$.

$\bigcap_{n=1}^{\infty} E_n$ is defined to be the event consisting of those outcomes that are in all of the events $E_n$, $n = 1, 2, 3, \ldots$.

For an event E, $E^c$ (the complement of E) consists of all outcomes in the sample space S that are not in E.



CHAPTER 1 1.2: Probabilities Defined on Events

Probabilities Defined on Events

Given a sample space S, for each event E, we assume that a number P(E) is defined and satisfies the following 3 conditions:
i. $0 \le P(E) \le 1$
ii. $P(S) = 1$
iii. For any sequence of events $E_1, E_2, \ldots$ that are mutually exclusive, $P\left(\bigcup_{n=1}^{\infty} E_n\right) = \sum_{n=1}^{\infty} P(E_n)$, where P(E) is the probability of event E.

Since the events E and $E^c$ are always mutually exclusive and since $E \cup E^c = S$, we have by (ii) and (iii) that
$1 = P(S) = P(E \cup E^c) = P(E) + P(E^c)$, or $P(E^c) = 1 - P(E)$.

For events E and F, $P(E) + P(F) = P(E \cup F) + P(EF)$, or
$P(E \cup F) = P(E) + P(F) - P(EF)$, where $EF = E \cap F$.

$P(E \cup F) = P(E) + P(F)$ if E and F are mutually exclusive.
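As a quick sanity check of the identity $P(E \cup F) = P(E) + P(F) - P(EF)$, the minimal Python sketch below enumerates an equally likely sample space (a fair die and two particular events, chosen here purely for illustration) and compares both sides exactly:

```python
from fractions import Fraction

# Sample space for one roll of a fair die; each outcome has probability 1/6.
S = {1, 2, 3, 4, 5, 6}
E = {2, 4, 6}        # event: roll is even
F = {4, 5, 6}        # event: roll is at least four

def prob(event):
    return Fraction(len(event), len(S))

lhs = prob(E | F)                      # P(E U F) computed directly
rhs = prob(E) + prob(F) - prob(E & F)  # inclusion-exclusion
assert lhs == rhs == Fraction(2, 3)
print(lhs)  # 2/3
```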


Given 3 events E, F, and G:

$P(E \cup F \cup G) = P((E \cup F) \cup G) = P(E \cup F) + P(G) - P((E \cup F)G)$
$= P(E) + P(F) - P(EF) + P(G) - P(EG \cup FG)$
$= P(E) + P(F) - P(EF) + P(G) - P(EG) - P(FG) + P(EGFG)$
$= P(E) + P(F) + P(G) - P(EF) - P(EG) - P(FG) + P(EFG)$

It can be shown by induction that for n events $E_1, E_2, \ldots, E_n$,
$P(E_1 \cup E_2 \cup \cdots \cup E_n) = \sum_i P(E_i) - \sum_{i<j} P(E_i E_j) + \sum_{i<j<k} P(E_i E_j E_k) - \sum_{i<j<k<l} P(E_i E_j E_k E_l) + \cdots + (-1)^{n+1} P(E_1 E_2 \cdots E_n).$


Homework 1

A drug has the following information:


a) There is a 10% chance of experiencing headache (H).
b) There is a 15% chance of experiencing nausea (N).
c) There is a 5% chance of experiencing both side effects.
What is the probability of experiencing at least one side effect?



CHAPTER 1 1.3: Conditional Probability

Conditional Probability

Often we want to calculate the probability that an event E occurs given that an event F occurs. We use the notation $P(E|F)$. This is defined when $P(F) > 0$.

The rule for conditional probability:
$P(E|F) = \frac{P(E \cap F)}{P(F)}$

Example 1.1: Suppose cards numbered one through ten are placed in a hat, mixed up, and then one of the cards is drawn. If we are told that the number on the drawn card is at least five, then what is the conditional probability that it is ten?

Solution: Let E denote the event that the number of the drawn card is ten, and let F be the event that it is at least five. The desired probability is $P(E|F)$. Note that $P(E \cap F) = P(E)$, since the number of the card will be both ten and at least five if and only if it is the number 10. Hence,
$P(E|F) = \frac{1/10}{6/10} = \frac{1}{6}.$
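The answer to Example 1.1 can also be estimated by simulation. The sketch below is a rough Monte Carlo check (the seed and trial count are arbitrary choices): it estimates $P(E|F)$ as the fraction of draws satisfying F that also satisfy E.

```python
import random

random.seed(0)
trials, hits_F, hits_EF = 200_000, 0, 0
for _ in range(trials):
    card = random.randint(1, 10)   # draw one card, all ten equally likely
    if card >= 5:                  # condition on F: card is at least five
        hits_F += 1
        if card == 10:             # both E and F occur
            hits_EF += 1

print(hits_EF / hits_F)  # close to 1/6 ≈ 0.1667
```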


Homework 2

1. Suppose an urn contains seven black balls and five white balls. We draw
two balls from the urn without replacement. Assuming that each ball in
the urn is equally likely to be drawn, what is the probability that both
drawn balls are black?
2. Suppose that each of three men at a party throws his hat into the center
of the room. The hats are first mixed up and then each man randomly
selects a hat. What is the probability that none of the three men selects
his own hat?



CHAPTER 1 1.4: Independent Events

Independent Events

Two events E and F are said to be independent if
$P(E \cap F) = P(E)P(F).$

Implications:
$P(E|F) = P(E)$ and $P(F|E) = P(F)$.

Pairwise independence does not necessarily imply joint independence.

Example 1.2: Let a ball be drawn from an urn containing four balls, numbered 1, 2, 3, 4. Let $E = \{1, 2\}$, $F = \{1, 3\}$, $G = \{1, 4\}$. If all four outcomes are assumed equally likely, then
$P(E \cap F) = P(E)P(F) = \frac{1}{4}$
$P(E \cap G) = P(E)P(G) = \frac{1}{4}$
$P(F \cap G) = P(F)P(G) = \frac{1}{4}.$
However, $\frac{1}{4} = P(E \cap F \cap G) \neq P(E)P(F)P(G)$.

Hence, even though the events E, F, G are pairwise independent, they are not jointly independent.
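The pairwise-but-not-joint independence in Example 1.2 is easy to verify by direct enumeration; a small Python sketch (illustrative only, using the same urn):

```python
from fractions import Fraction
from itertools import combinations

S = {1, 2, 3, 4}                 # four equally likely outcomes
E, F, G = {1, 2}, {1, 3}, {1, 4}

def prob(event):
    return Fraction(len(event), len(S))

# Pairwise independence holds for every pair...
for A, B in combinations([E, F, G], 2):
    assert prob(A & B) == prob(A) * prob(B)

# ...but joint independence fails.
print(prob(E & F & G), prob(E) * prob(F) * prob(G))  # 1/4 vs 1/8
```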



CHAPTER 1 1.5: Bayes’ Formula

Bayes’ Formula

Let E and F be events.

We may express $E = (E \cap F) \cup (E \cap F^c)$, because in order for a point to be in E, it must either be in both E and F, or be in E and not in F.

Since $(E \cap F)$ and $(E \cap F^c)$ are mutually exclusive, we have that

$P(E) = P(E \cap F) + P(E \cap F^c)$
$= P(E|F)P(F) + P(E|F^c)P(F^c)$
$= P(E|F)P(F) + P(E|F^c)(1 - P(F))$

The result states that the probability of event E is a weighted average of the conditional probability of E given that F has occurred and the conditional probability of E given that F has not occurred, each conditional probability being given as much weight as the event on which it is conditioned has of occurring.


Example 1.3: Consider two urns. The first contains two white and seven black balls, and the second contains five white and six black balls. We flip a fair coin and then draw a ball from the first urn or the second urn depending on whether the outcome was heads or tails. What is the conditional probability that the outcome of the toss was heads given that a white ball was selected?

Solution: Let W be the event that a white ball is drawn, and let H be the event that the coin comes up heads. The desired probability $P(H|W)$ may be calculated as follows:

$P(H|W) = \frac{P(H \cap W)}{P(W)} = \frac{P(W|H)P(H)}{P(W)}$
$= \frac{P(W|H)P(H)}{P(W|H)P(H) + P(W|H^c)P(H^c)}$
$= \frac{\frac{2}{9}\cdot\frac{1}{2}}{\frac{2}{9}\cdot\frac{1}{2} + \frac{5}{11}\cdot\frac{1}{2}} = \frac{22}{67}$
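A simulation sketch of Example 1.3 (seed and sample size are arbitrary) that estimates $P(H|W)$ and compares it with 22/67:

```python
import random
from fractions import Fraction

random.seed(1)
urn1 = ["W"] * 2 + ["B"] * 7     # first urn: two white, seven black
urn2 = ["W"] * 5 + ["B"] * 6     # second urn: five white, six black

heads_and_white = white = 0
for _ in range(300_000):
    heads = random.random() < 0.5               # fair coin flip
    ball = random.choice(urn1 if heads else urn2)
    if ball == "W":
        white += 1
        heads_and_white += heads                # count joint event H and W

print(heads_and_white / white, float(Fraction(22, 67)))  # both ≈ 0.3284
```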


Suppose that $F_1, F_2, \ldots, F_n$ are mutually exclusive events such that $\bigcup_{i=1}^{n} F_i = S$. In other words, exactly one of the events $F_1, F_2, \ldots, F_n$ will occur. By writing $E = \bigcup_{i=1}^{n} EF_i$ and using the fact that the events $EF_i$, $i = 1, 2, \ldots, n$, are mutually exclusive, we obtain

$P(E) = \sum_{i=1}^{n} P(EF_i) = \sum_{i=1}^{n} P(E|F_i)P(F_i)$

Suppose now that E has occurred and we are interested in determining which one of the $F_j$ also occurred. We have

$P(F_j|E) = \frac{P(EF_j)}{P(E)} = \frac{P(E|F_j)P(F_j)}{\sum_{i=1}^{n} P(E|F_i)P(F_i)}$

This is called Bayes' Formula.
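Bayes' formula translates directly into code. The sketch below defines a hypothetical helper `bayes` (the function name and example inputs are illustrative, not from the notes) and re-derives the answer to Example 1.3, with $F_1$ = heads, $F_2$ = tails and E = a white ball is drawn:

```python
from fractions import Fraction

def bayes(priors, likelihoods, j):
    """Posterior P(F_j | E) from priors P(F_i) and likelihoods P(E | F_i)."""
    total = sum(p * l for p, l in zip(priors, likelihoods))  # P(E)
    return priors[j] * likelihoods[j] / total

# Example 1.3 recast: equal priors for heads/tails, urn-specific likelihoods.
print(bayes([Fraction(1, 2)] * 2,
            [Fraction(2, 9), Fraction(5, 11)], 0))  # 22/67
```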



Homework 3

1. A laboratory blood test is 95% effective in detecting a certain disease when it is, in fact, present. However, the test also yields a "false positive" result for 1% of the healthy persons tested. (That is, if a healthy person is tested, then, with probability 0.01, the test result will imply he has the disease.) If 0.5% of the population actually has the disease, what is the probability that a person has the disease given that his test result is positive?
2. If a person is lying, the probability that this is correctly detected by the polygraph is 0.88, whereas if the person is telling the truth, this is correctly detected with probability 0.86. Suppose we consider a question for which 99% of all subjects tell the truth. Our polygraph machine says a subject is lying on this question. What is the probability that the polygraph is incorrect?



CHAPTER 2 2.1: Random Variables

Random Variables

Random variables are real-valued functions defined on the sample space (they may be discrete, indicator, or continuous).



CHAPTER 2 2.2: Discrete Random Variables

Discrete Random Variables

A discrete random variable takes on a countable set of values.

For a discrete random variable X, we define the probability mass function $p(a)$ by
$p(a) = P\{X = a\}$

The probability mass function $p(a)$ is positive for at most a countable number of values of a. That is, if X assumes one of the values $x_1, x_2, \ldots$, then $p(x_i) > 0$, $i = 1, 2, \ldots$, and $p(x) = 0$ for all other values of x.

Since X must take on one of the values $x_i$, we have
$\sum_{i=1}^{\infty} p(x_i) = 1.$


The Bernoulli Random Variable

Suppose that a trial, or an experiment, whose outcome can be classified as either a "success" or a "failure" is performed. If we let X = 1 if the outcome is a success and X = 0 if it is a failure, then the probability mass function of X is given by
$p(0) = P\{X = 0\} = 1 - p$
$p(1) = P\{X = 1\} = p$
where p, $0 \le p \le 1$, is the probability that the trial is a success.


The Binomial Random Variable

Suppose that n independent trials, each of which results in a "success" with probability p and in a "failure" with probability $1 - p$, are to be performed. If X represents the number of successes that occur in the n trials, then X is said to be a binomial random variable with parameters $(n, p)$.

The probability mass function of a binomial random variable having parameters $(n, p)$ is given by
$p(i) = \binom{n}{i} p^i (1-p)^{n-i}, \quad i = 0, 1, 2, \ldots, n$
where $\binom{n}{i} = \frac{n!}{(n-i)!\,i!}$.
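A direct transcription of this pmf in Python (the particular values of n, p, and i below are arbitrary illustrations):

```python
from math import comb

def binomial_pmf(i, n, p):
    """P{X = i} for a binomial(n, p) random variable."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

# The pmf sums to 1 over i = 0, ..., n (here n = 10, p = 0.3):
print(sum(binomial_pmf(i, 10, 0.3) for i in range(11)))  # 1.0 up to rounding
print(binomial_pmf(3, 10, 0.3))                          # ≈ 0.2668
```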


The Geometric Random Variable

Suppose that independent trials, each having probability p of being a success, are performed until a success occurs.

If we let X be the number of trials required until the first success, then X is said to be a geometric random variable with parameter p.

Its probability mass function is given by
$p(n) = P\{X = n\} = p(1-p)^{n-1}, \quad n = 1, 2, \ldots$


The Poisson Random Variable

A random variable X, taking on one of the values 0, 1, 2, ..., is said to be a Poisson random variable with parameter $\lambda$ if, for some $\lambda > 0$,
$p(i) = P\{X = i\} = \frac{e^{-\lambda}\lambda^i}{i!}, \quad i = 0, 1, 2, \ldots$

An important property of the Poisson random variable is that it may be used to approximate a binomial random variable when the binomial parameter n is large and p is small (taking $\lambda = np$).
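The quality of the approximation can be seen numerically. The sketch below compares the binomial(n, p) pmf with the Poisson($\lambda = np$) pmf for an arbitrary large-n, small-p choice:

```python
from math import comb, exp, factorial

# Compare binomial(n, p) with its Poisson(np) approximation.
n, p = 1000, 0.003
lam = n * p

for i in range(5):
    binom = comb(n, i) * p**i * (1 - p)**(n - i)
    poisson = exp(-lam) * lam**i / factorial(i)
    print(i, round(binom, 5), round(poisson, 5))  # nearly identical columns
```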


Homework 4

1. Suppose that an airplane engine will fail, when in flight, with probability
1  p independently from engine to engine; suppose that the airplane will
make a successful flight if at least 50 percent of its engines remain
operative. For what values of p is a four-engine plane preferable to a
two-engine plane?
2. Consider an experiment that consists of counting the number of
α-particles given off in a one-second interval by one gram of radioactive
material. If we know from past experience that, on the average, 3.2 such
α-particles are given off, what is a good approximation to the probability
that no more than two α-particles will appear?



CHAPTER 2 2.3: Continuous Random Variables

Continuous Random Variables

X is a continuous random variable if there exists a nonnegative function $f(x)$, defined for all real $x \in (-\infty, \infty)$, having the property that for any set B of real numbers
$P\{X \in B\} = \int_B f(x)\,dx.$

The function $f(x)$ is called the probability density function of the random variable X.


Uniform Random Variable

A random variable is said to be uniformly distributed over the interval (0, 1) if its probability density function is given by
$f(x) = \begin{cases} 1, & 0 < x < 1 \\ 0, & \text{otherwise} \end{cases}$

In general, we say that X is a uniform random variable on the interval $(\alpha, \beta)$ if its pdf is given by
$f(x) = \begin{cases} \frac{1}{\beta - \alpha}, & \alpha < x < \beta \\ 0, & \text{otherwise} \end{cases}$

Example 1.4: Calculate the cumulative distribution function of a random variable uniformly distributed over $(\alpha, \beta)$.
$F(a) = \begin{cases} 0, & a \le \alpha \\ \frac{a - \alpha}{\beta - \alpha}, & \alpha < a < \beta \\ 1, & a \ge \beta \end{cases}$


Exponential Random Variables

A continuous r.v. whose pdf is given, for some $\lambda > 0$, by
$f(x) = \begin{cases} \lambda e^{-\lambda x}, & \text{if } x \ge 0 \\ 0, & \text{if } x < 0 \end{cases}$
is said to be an exponential r.v. with parameter $\lambda$.
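One standard way to sample from this density is the inverse-transform method: if U is uniform on (0, 1), then $-\ln(1 - U)/\lambda$ is exponential with parameter $\lambda$. A minimal sketch (seed, rate, and sample size are arbitrary choices); the sample mean should be near $1/\lambda$, the exponential mean derived later in these notes:

```python
import random
from math import log
from statistics import mean

random.seed(2)
lam = 0.5
# 1 - random.random() avoids log(0); each draw is exponential with rate lam.
samples = [-log(1.0 - random.random()) / lam for _ in range(200_000)]
print(mean(samples))  # ≈ 1/lam = 2.0
```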


Gamma Random Variables

A continuous random variable whose density is given by
$f(x) = \begin{cases} \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha - 1}}{\Gamma(\alpha)}, & \text{if } x \ge 0 \\ 0, & \text{if } x < 0 \end{cases}$
for some $\lambda > 0$, $\alpha > 0$, is said to be a gamma r.v. with parameters $\alpha, \lambda$.

The quantity $\Gamma(\alpha)$, called the gamma function, is defined by
$\Gamma(\alpha) = \int_0^{\infty} e^{-x} x^{\alpha - 1}\,dx.$


Normal Random Variable

We say that X is a normal random variable with parameters $\mu$ and $\sigma^2$ if the density of X is given by
$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x-\mu)^2}{2\sigma^2}}, \quad -\infty < x < \infty. \quad (1)$

The density function is a bell-shaped curve that is symmetric around $\mu$.

If X is normally distributed with parameters $\mu$ and $\sigma^2$, then $Y = \alpha X + \beta$ is normally distributed with parameters $\alpha\mu + \beta$ and $\alpha^2\sigma^2$.


Proof: Suppose that $\alpha > 0$ and note that $F_Y(\cdot)$, the cdf of the r.v. Y, is given by

$F_Y(a) = P\{Y \le a\}$
$= P\{\alpha X + \beta \le a\}$
$= P\left\{X \le \frac{a - \beta}{\alpha}\right\}$
$= F_X\left(\frac{a - \beta}{\alpha}\right)$
$= \int_{-\infty}^{(a-\beta)/\alpha} \frac{1}{\sqrt{2\pi}\,\sigma} e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx$
$= \int_{-\infty}^{a} \frac{1}{\sqrt{2\pi}\,\alpha\sigma} \exp\left\{-\frac{(v - (\alpha\mu + \beta))^2}{2\alpha^2\sigma^2}\right\}dv$

where the last equality is obtained by the change of variables $v = \alpha x + \beta$.

However, since $F_Y(a) = \int_{-\infty}^{a} f_Y(v)\,dv$, it follows that the pdf $f_Y(\cdot)$ is given by
$f_Y(v) = \frac{1}{\sqrt{2\pi}\,\alpha\sigma} \exp\left\{-\frac{(v - (\alpha\mu + \beta))^2}{2\alpha^2\sigma^2}\right\}, \quad -\infty < v < \infty.$

Hence, Y is normally distributed with parameters $\alpha\mu + \beta$ and $\alpha^2\sigma^2$.
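A quick empirical check of this result (the parameter values are arbitrary): the sample mean and standard deviation of $Y = \alpha X + \beta$ should be close to $\alpha\mu + \beta$ and $\alpha\sigma$.

```python
import random
from statistics import mean, stdev

random.seed(3)
mu, sigma, alpha, beta = 1.0, 2.0, 3.0, -4.0
# Draw X ~ N(mu, sigma^2) and transform to Y = alpha*X + beta.
ys = [alpha * random.gauss(mu, sigma) + beta for _ in range(100_000)]
print(mean(ys), stdev(ys))  # ≈ alpha*mu + beta = -1.0 and alpha*sigma = 6.0
```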
CHAPTER 2 2.4: Expectation of a Random Variable

2.4.1: Discrete Case

The expected value of X is defined by
$E[X] = \sum_{x: p(x) > 0} x\,p(x)$

Expectation of a Bernoulli Random Variable

Example: Calculate E[X] when X is a Bernoulli r.v. with parameter p.
Solution: Since $p(0) = 1 - p$ and $p(1) = p$, we have
$E[X] = 0(1 - p) + 1(p) = p.$
Thus, the expected number of successes in a single trial is just the probability that the trial will be a success.


Expectation of a Binomial Random Variable

$E[X] = \sum_{i=0}^{n} i\,p(i)$
$= \sum_{i=0}^{n} i \binom{n}{i} p^i (1-p)^{n-i}$
$= \sum_{i=1}^{n} \frac{i\,n!}{(n-i)!\,i!}\, p^i (1-p)^{n-i}$
$= np \sum_{i=1}^{n} \frac{(n-1)!}{(n-i)!\,(i-1)!}\, p^{i-1} (1-p)^{n-i}$
$= np \sum_{k=0}^{n-1} \binom{n-1}{k} p^k (1-p)^{n-1-k}$
$= np\,[p + (1-p)]^{n-1}$
$= np,$

where the second from the last equality follows by letting $k = i - 1$.

Expectation of a Geometric Random Variable

$E[X] = \sum_{n=1}^{\infty} n\,p(1-p)^{n-1}$
$= p \sum_{n=1}^{\infty} n\,q^{n-1} \quad (\text{where } q = 1 - p)$
$= p \sum_{n=1}^{\infty} \frac{d}{dq}(q^n)$
$= p\,\frac{d}{dq}\left(\sum_{n=1}^{\infty} q^n\right)$
$= p\,\frac{d}{dq}\left(\frac{q}{1-q}\right)$
$= \frac{p}{(1-q)^2}$
$= \frac{1}{p}.$
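A Monte Carlo check that the geometric mean number of trials is $1/p$ (the value of p, the seed, and the sample size are arbitrary choices):

```python
import random
from statistics import mean

random.seed(4)
p = 0.25

def geometric(p):
    """Number of Bernoulli(p) trials until the first success."""
    n = 1
    while random.random() >= p:  # failure: keep trying
        n += 1
    return n

print(mean(geometric(p) for _ in range(100_000)))  # ≈ 1/p = 4.0
```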

Expectation of a Poisson Random Variable

$E[X] = \sum_{i=0}^{\infty} \frac{i\,\lambda^i e^{-\lambda}}{i!}$
$= \sum_{i=1}^{\infty} \frac{\lambda^i e^{-\lambda}}{(i-1)!}$
$= \lambda e^{-\lambda} \sum_{i=1}^{\infty} \frac{\lambda^{i-1}}{(i-1)!}$
$= \lambda e^{-\lambda} \sum_{k=0}^{\infty} \frac{\lambda^k}{k!}$
$= \lambda e^{-\lambda} e^{\lambda}$
$= \lambda$

where we use the identity $\sum_{k=0}^{\infty} \frac{\lambda^k}{k!} = e^{\lambda}$.

2.4.2: The Continuous Case

For a continuous r.v. X,
$E[X] = \int_{-\infty}^{\infty} x f(x)\,dx$

Expectation of a Uniform Random Variable

$E[X] = \int_{\alpha}^{\beta} \frac{x}{\beta - \alpha}\,dx$
$= \frac{\beta^2 - \alpha^2}{2(\beta - \alpha)}$
$= \frac{\beta + \alpha}{2}.$


Expectation of an Exponential Random Variable

$E[X] = \int_0^{\infty} x \lambda e^{-\lambda x}\,dx$

Integrating by parts ($dv = \lambda e^{-\lambda x}dx$, $u = x$) yields

$E[X] = \left[-x e^{-\lambda x}\right]_0^{\infty} + \int_0^{\infty} e^{-\lambda x}\,dx$
$= 0 - \left[\frac{e^{-\lambda x}}{\lambda}\right]_0^{\infty}$
$= \frac{1}{\lambda}.$


Expectation of a Normal Random Variable

$E[X] = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} x e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx$

Writing x as $(x - \mu) + \mu$ yields

$E[X] = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} (x - \mu) e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx + \mu \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx$

Letting $y = x - \mu$ leads to

$E[X] = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} y e^{-\frac{y^2}{2\sigma^2}}\,dy + \mu \int_{-\infty}^{\infty} f(x)\,dx$

where $f(x)$ is the normal density. By symmetry, the first integral must be 0, and so $E[X] = \mu \int_{-\infty}^{\infty} f(x)\,dx = \mu$.

Expectation of a Function of a Random Variable

Proposition 2.1 (a) If X is a discrete random variable with probability mass function $p(x)$, then for any real-valued function g,
$E[g(X)] = \sum_{x: p(x) > 0} g(x)p(x)$

(b) If X is a continuous random variable with probability density function $f(x)$, then for any real-valued function g,
$E[g(X)] = \int_{-\infty}^{\infty} g(x)f(x)\,dx$

Corollary 2.2 If a and b are constants, then $E[aX + b] = aE[X] + b$.

Proof: For the discrete case,
$E[aX + b] = \sum_{x: p(x) > 0} (ax + b)p(x)$
$= a \sum_{x: p(x) > 0} x\,p(x) + b \sum_{x: p(x) > 0} p(x)$
$= aE[X] + b$

For the continuous case,
$E[aX + b] = \int_{-\infty}^{\infty} (ax + b)f(x)\,dx$
$= a \int_{-\infty}^{\infty} x f(x)\,dx + b \int_{-\infty}^{\infty} f(x)\,dx$
$= aE[X] + b$

$E[X]$ is referred to as the mean or the first moment of X. The quantity $E[X^n]$ is called the nth moment of X.

By Proposition 2.1, we note that
$E[X^n] = \begin{cases} \sum_{x: p(x) > 0} x^n p(x), & \text{if X is discrete} \\ \int_{-\infty}^{\infty} x^n f(x)\,dx, & \text{if X is continuous} \end{cases}$

$Var(X) = E[(X - E[X])^2].$

Variance of the Normal Random Variable

Recall that $E[X] = \mu$. We have
$Var(X) = E[(X - \mu)^2] = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} (x - \mu)^2 e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx$

Substituting $y = (x - \mu)/\sigma$ yields
$Var(X) = \frac{\sigma^2}{\sqrt{2\pi}} \int_{-\infty}^{\infty} y^2 e^{-\frac{y^2}{2}}\,dy$

Integrating by parts ($u = y$, $dv = y e^{-\frac{y^2}{2}}dy$):
$Var(X) = \frac{\sigma^2}{\sqrt{2\pi}} \left( \left[-y e^{-\frac{y^2}{2}}\right]_{-\infty}^{\infty} + \int_{-\infty}^{\infty} e^{-\frac{y^2}{2}}\,dy \right)$
$= \frac{\sigma^2}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{y^2}{2}}\,dy$
$= \sigma^2$

Homework 5

1. Show that $Var(X) = E[(X - \mu)^2]$ can be expressed as $Var(X) = E[X^2] - \mu^2$, or $Var(X) = E[X^2] - (E[X])^2$.
2. Find $Var(X)$ for a
i. Bernoulli random variable.
ii. binomial random variable.
iii. Poisson random variable.
iv. geometric random variable.
v. uniform random variable.
vi. exponential random variable.



CHAPTER 2 2.5: Jointly Distributed Random Variables

Joint Distribution Functions

Given two discrete random variables X and Y, the joint probability mass function of X and Y is given by
$p(x, y) = P\{X = x, Y = y\}.$

The pmf of X may be obtained from $p(x, y)$ by
$p_X(x) = \sum_{y: p(x,y) > 0} p(x, y).$
Similarly,
$p_Y(y) = \sum_{x: p(x,y) > 0} p(x, y).$

We say X and Y are jointly continuous if there exists a function $f(x, y)$, defined for all real x and y, having the property that for all sets A and B of real numbers
$P\{X \in A, Y \in B\} = \int_B \int_A f(x, y)\,dx\,dy$

The marginal pdfs of X and Y are given by
$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy$ and $f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx.$


A variation of Proposition 2.1 states that if X and Y are random variables and g is a function of two variables, then
$E[g(X, Y)] = \sum_y \sum_x g(x, y)p(x, y)$ in the discrete case
$= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)f(x, y)\,dx\,dy$ in the continuous case

For example, if $g(X, Y) = X + Y$, then in the continuous case,
$E[X + Y] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x + y)f(x, y)\,dx\,dy$
$= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x f(x, y)\,dx\,dy + \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y f(x, y)\,dx\,dy$
$= E[X] + E[Y]$

The same result holds in the discrete case.

For constants a and b we have $E[aX + bY] = aE[X] + bE[Y]$.

If $X_1, X_2, \ldots, X_n$ are n random variables, then for any n constants $a_1, a_2, \ldots, a_n$,
$E[a_1 X_1 + a_2 X_2 + \cdots + a_n X_n] = a_1 E[X_1] + a_2 E[X_2] + \cdots + a_n E[X_n]$

Independent Random Variables

The rvs X and Y are said to be independent if, for all a, b,
$P\{X \le a, Y \le b\} = P\{X \le a\}P\{Y \le b\}$
That is, $F(a, b) = F_X(a)F_Y(b)$.

When X and Y are discrete, the condition of independence reduces to $p(x, y) = p_X(x)p_Y(y)$, while if X and Y are jointly continuous, independence reduces to $f(x, y) = f_X(x)f_Y(y)$.

Proposition 2.3: If X and Y are independent, then for any functions h and g,
$E[g(X)h(Y)] = E[g(X)]E[h(Y)]$

Proof: Suppose that X and Y are jointly continuous. Then
$E[g(X)h(Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x)h(y)f(x, y)\,dx\,dy$
$= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x)h(y)f_X(x)f_Y(y)\,dx\,dy$
$= \int_{-\infty}^{\infty} g(x)f_X(x)\,dx \int_{-\infty}^{\infty} h(y)f_Y(y)\,dy$
$= E[g(X)]E[h(Y)]$

Covariance and Variance of Sums of Random Variables

The covariance of any two random variables X and Y, denoted by $Cov(X, Y)$, is defined by

$Cov(X, Y) = E[(X - E[X])(Y - E[Y])]$
$= E[XY - Y E[X] - X E[Y] + E[X]E[Y]]$
$= E[XY] - E[Y]E[X] - E[X]E[Y] + E[X]E[Y]$
$= E[XY] - E[X]E[Y]$

Note that if X and Y are independent, then by Proposition 2.3 it follows that $Cov(X, Y) = 0$.


Example: The joint density function of X, Y is
$f(x, y) = \frac{1}{y} e^{-(y + x/y)}, \quad 0 < x, y < \infty$

(a) Verify that the preceding is a joint density function.
(b) Find $Cov(X, Y)$.

Solution: To show that $f(x, y)$ is a joint density function we need to show it is nonnegative, which is immediate, and then that $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\,dy\,dx = 1$. We prove the latter as follows:

$\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\,dy\,dx = \int_0^{\infty} \int_0^{\infty} \frac{1}{y} e^{-(y + x/y)}\,dx\,dy$
$= \int_0^{\infty} e^{-y} \int_0^{\infty} \frac{1}{y} e^{-x/y}\,dx\,dy$
$= \int_0^{\infty} e^{-y}\,dy$
$= 1$


To obtain $Cov(X, Y)$, note that the marginal density function of Y is
$f_Y(y) = e^{-y} \int_0^{\infty} \frac{1}{y} e^{-x/y}\,dx = e^{-y}$

Thus, Y is an exponential random variable with parameter 1. Therefore,
$E[Y] = 1$

We compute $E[X]$ and $E[XY]$ as follows:
$E[X] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x f(x, y)\,dy\,dx = \int_0^{\infty} e^{-y} \int_0^{\infty} \frac{x}{y} e^{-x/y}\,dx\,dy$

Now, $\int_0^{\infty} \frac{x}{y} e^{-x/y}\,dx$ is the expected value of an exponential random variable with parameter $1/y$, and is thus equal to y. Consequently,
$E[X] = \int_0^{\infty} y e^{-y}\,dy = 1$

Also,
$E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy f(x, y)\,dy\,dx = \int_0^{\infty} y e^{-y} \int_0^{\infty} \frac{x}{y} e^{-x/y}\,dx\,dy$
$= \int_0^{\infty} y^2 e^{-y}\,dy$

Integration by parts ($dv = e^{-y}dy$, $u = y^2$) gives
$E[XY] = \int_0^{\infty} y^2 e^{-y}\,dy = \left[-y^2 e^{-y}\right]_0^{\infty} + \int_0^{\infty} 2y e^{-y}\,dy = 2E[Y] = 2$

Consequently, $Cov(X, Y) = E[XY] - E[X]E[Y] = 2 - (1)(1) = 1.$
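Because the density factors as $f(x, y) = e^{-y} \cdot \frac{1}{y}e^{-x/y}$, we can sample Y as exponential(1) and then X, given Y = y, as exponential with mean y. A simulation sketch along those lines (seed and sample size arbitrary) recovers $Cov(X, Y) \approx 1$:

```python
import random
from statistics import mean

random.seed(5)
pairs = []
for _ in range(200_000):
    y = random.expovariate(1.0)        # Y ~ exponential, rate 1
    x = random.expovariate(1.0 / y)    # X | Y = y ~ exponential, mean y
    pairs.append((x, y))

ex = mean(x for x, _ in pairs)
ey = mean(y for _, y in pairs)
exy = mean(x * y for x, y in pairs)
print(exy - ex * ey)  # ≈ 1
```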


Properties of Covariance

For any random variables X, Y, Z and constant c:
1. $Cov(X, X) = Var(X)$
2. $Cov(X, Y) = Cov(Y, X)$
3. $Cov(cX, Y) = c\,Cov(X, Y)$
4. $Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z)$

The fourth property generalizes to give the following result:
$Cov\left(\sum_{i=1}^{n} X_i, \sum_{j=1}^{m} Y_j\right) = \sum_{i=1}^{n} \sum_{j=1}^{m} Cov(X_i, Y_j)$

We can obtain a useful expression for the variance of the sum of random variables as follows:

$Var\left(\sum_{i=1}^{n} X_i\right) = Cov\left(\sum_{i=1}^{n} X_i, \sum_{j=1}^{n} X_j\right)$
$= \sum_{i=1}^{n} \sum_{j=1}^{n} Cov(X_i, X_j)$
$= \sum_{i=1}^{n} Cov(X_i, X_i) + \sum_{i=1}^{n} \sum_{j \neq i} Cov(X_i, X_j)$
$= \sum_{i=1}^{n} Var(X_i) + 2 \sum_{i=1}^{n} \sum_{j < i} Cov(X_i, X_j)$

If $X_i$, $i = 1, 2, \ldots, n$, are independent random variables, we get
$Var\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} Var(X_i)$


Definition 2.1: If $X_1, X_2, \ldots, X_n$ are independent and identically distributed, then the random variable $\bar{X} = \sum_{i=1}^{n} X_i / n$ is called the sample mean.

Proposition 2.4: Suppose that $X_1, X_2, \ldots, X_n$ are independent and identically distributed with expected value $\mu$ and variance $\sigma^2$. Then
(a) $E[\bar{X}] = \mu$
(b) $Var(\bar{X}) = \sigma^2/n$
(c) $Cov(\bar{X}, X_i - \bar{X}) = 0$, $i = 1, 2, \ldots, n$.

Proof: (a) $E[\bar{X}] = \frac{1}{n} \sum_{i=1}^{n} E[X_i] = \mu$
(b) $Var(\bar{X}) = \left(\frac{1}{n}\right)^2 Var\left(\sum_{i=1}^{n} X_i\right) = \left(\frac{1}{n}\right)^2 \sum_{i=1}^{n} Var(X_i) = \frac{\sigma^2}{n}$
(c)
$Cov(\bar{X}, X_i - \bar{X}) = Cov(\bar{X}, X_i) - Cov(\bar{X}, \bar{X})$
$= \frac{1}{n} Cov\left(X_i + \sum_{j \neq i} X_j, X_i\right) - Var(\bar{X})$
$= \frac{1}{n} Cov(X_i, X_i) + \frac{1}{n} Cov\left(\sum_{j \neq i} X_j, X_i\right) - Var(\bar{X})$
$= \frac{\sigma^2}{n} - \frac{\sigma^2}{n} = 0$

Joint Probability Distribution of Functions of Random Variables

Let $X_1$ and $X_2$ be jointly continuous random variables with joint probability density function $f(x_1, x_2)$.

It is sometimes necessary to obtain the joint distribution of the random variables $Y_1$ and $Y_2$ that arise as functions of $X_1$ and $X_2$.

Specifically, suppose that $Y_1 = g_1(X_1, X_2)$ and $Y_2 = g_2(X_1, X_2)$ for some functions $g_1$ and $g_2$.

Assume that the functions $g_1$ and $g_2$ satisfy the following conditions:
1. The equations $y_1 = g_1(x_1, x_2)$ and $y_2 = g_2(x_1, x_2)$ can be uniquely solved for $x_1$ and $x_2$ in terms of $y_1$ and $y_2$, with solutions given by, say, $x_1 = h_1(y_1, y_2)$, $x_2 = h_2(y_1, y_2)$.
2. The functions $g_1$ and $g_2$ have continuous partial derivatives at all points $(x_1, x_2)$ and are such that the following $2 \times 2$ determinant
$J(x_1, x_2) = \begin{vmatrix} \frac{\partial g_1}{\partial x_1} & \frac{\partial g_1}{\partial x_2} \\ \frac{\partial g_2}{\partial x_1} & \frac{\partial g_2}{\partial x_2} \end{vmatrix} = \frac{\partial g_1}{\partial x_1}\frac{\partial g_2}{\partial x_2} - \frac{\partial g_1}{\partial x_2}\frac{\partial g_2}{\partial x_1} \neq 0$ at all points $(x_1, x_2)$.

Under these two conditions it can be shown that the random variables $Y_1$ and $Y_2$ are jointly continuous with joint density function given by
$f_{Y_1, Y_2}(y_1, y_2) = f_{X_1, X_2}(x_1, x_2)\,|J(x_1, x_2)|^{-1} \quad (2)$
where $x_1 = h_1(y_1, y_2)$, $x_2 = h_2(y_1, y_2)$.

Example: If X and Y are independent gamma random variables with parameters $(\alpha, \lambda)$ and $(\beta, \lambda)$, respectively, compute the joint density of $U = X + Y$ and $V = X/(X + Y)$.


Solution: The joint density of X and Y is given by
$f_{X,Y}(x, y) = \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha - 1}}{\Gamma(\alpha)} \cdot \frac{\lambda e^{-\lambda y}(\lambda y)^{\beta - 1}}{\Gamma(\beta)} = \frac{\lambda^{\alpha + \beta}}{\Gamma(\alpha)\Gamma(\beta)}\, e^{-\lambda(x + y)}\, x^{\alpha - 1} y^{\beta - 1}$

Now, if $g_1(x, y) = x + y$ and $g_2(x, y) = x/(x + y)$, then
$\frac{\partial g_1}{\partial x} = \frac{\partial g_1}{\partial y} = 1, \quad \frac{\partial g_2}{\partial x} = \frac{y}{(x + y)^2}, \quad \frac{\partial g_2}{\partial y} = -\frac{x}{(x + y)^2},$
and so
$J(x, y) = \begin{vmatrix} 1 & 1 \\ \frac{y}{(x+y)^2} & -\frac{x}{(x+y)^2} \end{vmatrix} = -\frac{1}{x + y}$

Finally, because the equations $u = x + y$, $v = x/(x + y)$ have as their solutions $x = uv$, $y = u(1 - v)$, we see that

$f_{U,V}(u, v) = f_{X,Y}[uv, u(1 - v)]\,u$
$= \frac{\lambda e^{-\lambda u}(\lambda u)^{\alpha + \beta - 1}}{\Gamma(\alpha + \beta)} \cdot \frac{v^{\alpha - 1}(1 - v)^{\beta - 1}\,\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}$

Hence, $X + Y$ and $X/(X + Y)$ are independent, with $X + Y$ having a gamma distribution with parameters $(\alpha + \beta, \lambda)$ and $X/(X + Y)$ having density function
$f_V(v) = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}\, v^{\alpha - 1}(1 - v)^{\beta - 1}, \quad 0 < v < 1. \quad (3)$

This is called the beta density with parameters $(\alpha, \beta)$.
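A simulation sketch of this result (parameter choices arbitrary): sampling independent gamma variables and checking that $U = X + Y$ and $V = X/(X + Y)$ have the means predicted by the gamma$(\alpha + \beta, \lambda)$ and beta$(\alpha, \beta)$ distributions, namely $(\alpha + \beta)/\lambda$ and $\alpha/(\alpha + \beta)$.

```python
import random
from statistics import mean

random.seed(7)
alpha, beta, lam = 2.0, 3.0, 1.5
us, vs = [], []
for _ in range(100_000):
    x = random.gammavariate(alpha, 1.0 / lam)  # shape alpha, scale 1/lam
    y = random.gammavariate(beta, 1.0 / lam)   # shape beta, scale 1/lam
    us.append(x + y)
    vs.append(x / (x + y))

print(mean(us), (alpha + beta) / lam)    # both ≈ 3.333
print(mean(vs), alpha / (alpha + beta))  # both ≈ 0.4
```

Within sampling error the two columns agree, consistent with the stated gamma and beta distributions.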
