
Chapter 3
Random Variables and Probability Distributions
Chapter 3.1 Concept of a Random Variable

Concept of a Random Variable


 A random variable is a function that associates a real number with each element in the sample space.
 In other words, a random variable is a numerical description of the outcome of an experiment, where each outcome is assigned a numerical value.
 A capital letter, say X, is used to denote a random variable, and the corresponding small letter, x in this case, denotes one of its values.
Chapter 3.1 Concept of a Random Variable

Concept of a Random Variable


The sample space giving a detailed description of each possible outcome when three electronic components are tested (D = defective, N = nondefective) may be written as

S = {DDD, DDN, DND, DNN, NDD, NDN, NND, NNN}
One is concerned with the number of defectives that occurs. Thus
each point in the sample space will be assigned a numerical value of
0, 1, 2, or 3.
Then the random variable X assumes the value 2 for all elements in the subset

E = {DDN, DND, NDD}
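A small Python sketch (added for illustration, not part of the original slides) that enumerates this sample space and the value of X for each outcome; the variable names are illustrative:

    # Enumerate the sample space for three tested components (D = defective, N = nondefective)
    # and map each outcome to X, the number of defectives observed.
    from itertools import product

    sample_space = ["".join(outcome) for outcome in product("DN", repeat=3)]
    X = {outcome: outcome.count("D") for outcome in sample_space}

    print(X)                                     # each outcome paired with its x value
    print([s for s, x in X.items() if x == 2])   # the subset E: ['DDN', 'DND', 'NDD']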

 Two balls are drawn in succession without replacement from an urn containing 4 red balls and 3 black balls. The possible outcomes and the values y of the random variable Y, where Y is the number of red balls, are

Sample Space    y
RR              2
RB              1
BR              1
BB              0
Chapter 3.1 Concept of a Random Variable

Sample Space and Random Variable


 If a sample space contains a finite number of possibilities or an
unending sequence with as many elements as there are whole
numbers, it is called a discrete sample space.
 If a sample space contains an infinite number of possibilities equal
to the number of points on a line segment, it is called a
continuous sample space.

 A random variable is called a discrete random variable if its set of possible outcomes is countable.
 A random variable is called a continuous random variable if it can take on values on a continuous scale.

If X is the random variable assigned to the waiting time, in minutes, for a bus at a bus stop, then X may take on all values of waiting time x, x ≥ 0. In this case, X is a continuous random variable.
Chapter 3.2 Discrete Probability Distributions

Discrete Probability Distributions


 Frequently, it is convenient to represent all the probabilities of a
random variable X by a formula.
 Such a formula would necessarily be a function of the numerical
values x, denoted by f(x), g(x), r(x), and so forth.
 For example,
f ( x ) P ( X  x )

 The set of ordered pairs (x, f(x)) is a probability function, probability mass function, or probability distribution of the discrete random variable X if, for each possible outcome x,

1. f(x) ≥ 0
2. Σ_x f(x) = 1
3. P(X = x) = f(x)
Chapter 3.2 Discrete Probability Distributions

Discrete Probability Distributions


In the experiment of tossing a fair coin twice, the random variable X represents the number of times a head turns up. The possible values x of X and their probabilities can be summarized as

x       0     1     2
f(x)   1/4   1/2   1/4
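A minimal Python sketch (added here, not from the slides) that derives this distribution by enumerating the four equally likely outcomes and checks the two defining conditions of a probability mass function:

    from itertools import product
    from fractions import Fraction

    outcomes = list(product("HT", repeat=2))            # sample space of two tosses
    f = {}                                              # f(x) = P(X = x), X = number of heads
    for outcome in outcomes:
        x = outcome.count("H")
        f[x] = f.get(x, Fraction(0)) + Fraction(1, len(outcomes))

    print(f)                                 # probabilities 1/4, 1/2, 1/4 for x = 2, 1, 0
    assert all(p >= 0 for p in f.values())   # condition 1: f(x) >= 0
    assert sum(f.values()) == 1              # condition 2: probabilities sum to 1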
Chapter 3.2 Discrete Probability Distributions

Discrete Probability Distributions


A shipment of 20 similar laptop computers to a retail outlet contains
3 that are defective. If a school makes a random purchase of 2 of
these computers, find the probability distribution for the number of
defectives.

Let X be a random variable whose values x are the possible numbers of defective computers purchased by the school. Then x can be 0, 1, or 2, and

f(0) = P(X = 0) = (3C0 · 17C2) / 20C2 = 136/190
f(1) = P(X = 1) = (3C1 · 17C1) / 20C2 = 51/190
f(2) = P(X = 2) = (3C2 · 17C0) / 20C2 = 3/190
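A short Python sketch (not part of the original slides) that reproduces these hypergeometric probabilities with exact arithmetic; the helper name f is illustrative:

    from math import comb
    from fractions import Fraction

    # P(X = x): x defectives among the 2 laptops drawn from 3 defective + 17 good.
    def f(x):
        return Fraction(comb(3, x) * comb(17, 2 - x), comb(20, 2))

    print([f(x) for x in range(3)])           # 68/95 (= 136/190), 51/190, 3/190
    assert sum(f(x) for x in range(3)) == 1   # the probabilities sum to 1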
Chapter 3.2 Discrete Probability Distributions

Discrete Probability Distributions


 There are many problems where we may wish to compute the
probability that the observed value of a random variable X will be
less than or equal to some real number x.
 The cumulative distribution F(x) of a discrete random variable X with probability distribution f(x) is

F(x) = P(X ≤ x) = Σ_{t ≤ x} f(t),  for −∞ < x < ∞

 Example of a probability distribution (figure)
 Example of a cumulative distribution (figure)
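As an illustration (added, not from the slides), the cumulative distribution of the coin-toss example can be computed directly from its probability distribution:

    from fractions import Fraction

    f = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}   # pmf of the coin example

    def F(x):
        # F(x) = P(X <= x) = sum of f(t) over all t <= x
        return sum(p for t, p in f.items() if t <= x)

    print(F(0), F(1), F(2))   # 1/4 3/4 1
    print(F(1.5))             # 3/4 -- F is a step function defined for every real x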
Chapter 3.3 Continuous Probability Distributions

Continuous Probability Distributions


 When the sample space is continuous, there is an unlimited number of possible values for the random variable.
 Thus, it is more meaningful to deal with an interval rather than a point value of a random variable.
 For example, it does not make sense to ask for the probability of selecting a person at random who is exactly 164 cm tall. It is more useful to talk about the probability of selecting a person who is at least 163 cm but not more than 165 cm tall.

 We shall concern ourselves now with computing probabilities for various intervals of continuous random variables, such as P(a < X < b), P(W ≥ c), P(U ≤ d), and so forth.
 Note that when X is continuous,

P(X = a) = 0        (the probability of a single point value is zero)
P(a < X ≤ b) = P(a < X < b) + P(X = b) = P(a < X < b)
Chapter 3.3 Continuous Probability Distributions

Continuous Probability Distributions


 In dealing with continuous variables, the notation commonly used
is f(x) and it is usually called the probability density function,
or the density function of X.
 For most practical applications, the density functions are continuous and differentiable.
 Their graphs may take any form, but since the density function is used to represent probabilities, it must lie entirely on or above the x axis.

[Figure: three example shapes of a density function f(x) plotted against x]
Chapter 3.3 Continuous Probability Distributions

Continuous Probability Distributions


 A probability density function is constructed so that the area under
its curve bounded by the x axis is equal to 1 when computed over
the range of X for which f(x) is defined.
 In the figure below, the probability that X assumes a value
between a and b is equal to the shaded area under the density
function between the ordinates at x = a and x = b.

P(a < X < b) = ∫_a^b f(x) dx
Chapter 3.3 Continuous Probability Distributions

Continuous Probability Distributions


 The function f(x) is a probability density function for the
continuous random variable X, defined over the set of real
numbers R if
1. f ( x) 0, for all x  R

2.  f ( x)dx 1
 b
3. P (a  X  b) f ( x )dx
a
Chapter 3.3 Continuous Probability Distributions

Continuous Probability Distributions


Suppose that the error in the reaction temperature, in °C, for a controlled laboratory experiment is a continuous random variable X having the probability density function

f(x) = x²/3 for −1 < x < 2, and f(x) = 0 elsewhere

(a) Verify that ∫_{−∞}^{∞} f(x) dx = 1.
(b) Find P(0 < X ≤ 1).

(a) ∫_{−∞}^{∞} f(x) dx = ∫_{−1}^{2} (x²/3) dx = [x³/9]_{−1}^{2} = 8/9 + 1/9 = 1
(b) P(0 < X ≤ 1) = ∫_{0}^{1} (x²/3) dx = [x³/9]_{0}^{1} = 1/9
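A quick symbolic check of both integrals (added as a sketch, not part of the slides), using sympy:

    import sympy as sp

    x = sp.symbols("x")
    f = x**2 / 3                            # density on (-1, 2); zero elsewhere

    total = sp.integrate(f, (x, -1, 2))     # part (a): should equal 1
    prob = sp.integrate(f, (x, 0, 1))       # part (b): P(0 < X <= 1)
    print(total, prob)                      # 1 1/9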
Chapter 3.3 Continuous Probability Distributions

Continuous Probability Distributions


 The cumulative distribution F(x) of a continuous random variable X with density function f(x) is

F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt,  for −∞ < x < ∞

For the density function in the last example, find F(x) and use it to
evaluate P(0 < X ≤ 1).

F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−1}^{x} (t²/3) dt = [t³/9]_{−1}^{x} = (x³ + 1)/9,  for −1 ≤ x < 2

Therefore,
F(x) = 0 for x < −1
F(x) = (x³ + 1)/9 for −1 ≤ x < 2
F(x) = 1 for x ≥ 2

P(0 < X ≤ 1) = F(1) − F(0) = 2/9 − 1/9 = 1/9
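A brief sympy sketch (added for illustration) of this piecewise F(x) and the same probability computed from it:

    import sympy as sp

    x = sp.symbols("x")
    F = sp.Piecewise((0, x < -1),
                     ((x**3 + 1) / 9, x < 2),
                     (1, True))             # the cumulative distribution found above

    print(F.subs(x, 1) - F.subs(x, 0))      # P(0 < X <= 1) = 2/9 - 1/9 = 1/9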
Chapter 3.4 Joint Probability Distributions

Joint Probability Distributions


 If X and Y are two discrete random variables, the probability
distribution for their simultaneous occurrence can be represented
by a function with values f(x, y) for any pair of values (x, y) within
the range of the random variables X and Y.
 Such a function is referred to as the joint probability distribution of X and Y.

 The function f(x, y) is a joint probability distribution or joint probability mass function of the discrete random variables X and Y if

1. f(x, y) ≥ 0, for all (x, y)
2. Σ_x Σ_y f(x, y) = 1
3. P(X = x, Y = y) = f(x, y)

 For any region A in the xy plane, P[(X, Y) ∈ A] = Σ Σ_{(x, y) ∈ A} f(x, y).
Chapter 3.4 Joint Probability Distributions

Joint Probability Distributions


Two ballpoint pens are selected at random from a box that contains 3
blue pens, 2 red pens, and 3 green pens. If X is the number of blue
pens selected and Y is the number of red pens selected, find
(a) the joint probability function f(x, y)
(b) P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}

3 C x 2 C y 3 C2 x  y
(a) f ( x, y )  ,
C2 8
for x 0,1, 2; y 0,1, 2; 0  x  y 2

(b) P  ( X , Y )  A P ( X  Y 1)
 f (0, 0)  f (0,1)  f (1, 0)
3 3 9
  
28 14 28
9

14
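A small Python sketch (added, not from the slides) of this joint probability mass function, again with exact fractions:

    from math import comb
    from fractions import Fraction

    # Joint pmf for 2 pens drawn from 3 blue, 2 red, 3 green:
    # X = number of blue pens, Y = number of red pens.
    def f(x, y):
        if 0 <= x + y <= 2:
            return Fraction(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))
        return Fraction(0)

    # (b) P(X + Y <= 1)
    print(sum(f(x, y) for x in range(3) for y in range(3) if x + y <= 1))   # 9/14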
Chapter 3.4 Joint Probability Distributions

Joint Probability Distributions


 The function f(x, y) is a joint probability density function of the
continuous random variables X and Y if
1. f ( x, y ) 0, for all ( x, y )  R
 
2.  f ( x, y) dxdy 1


3. P  ( X , Y )  A  f ( x, y) dxdy
A
For any region A in the xy plane.
Chapter 3.4 Joint Probability Distributions

Joint Probability Distributions


A privately owned business operates both a drive-in facility and a
walk-in facility. On a randomly selected day, let X and Y, respectively,
be the proportions of the time that the drive-in and the walk-in
facilities are in use, and suppose that the joint density function of
these random variables is

f(x, y) = (2/5)(2x + 3y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and f(x, y) = 0 elsewhere

(a) Verify that f(x, y) is a joint density function.


(b) Find P[(X, Y) ∈ A], where A is {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}.

(a) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = ∫_0^1 ∫_0^1 (2/5)(2x + 3y) dx dy
 = ∫_0^1 [ 2x²/5 + 6xy/5 ]_{x=0}^{x=1} dy
 = ∫_0^1 (2/5 + 6y/5) dy
 = [ 2y/5 + 6y²/10 ]_{y=0}^{y=1}
 = 2/5 + 6/10 = 1
Chapter 3.4 Joint Probability Distributions

Joint Probability Distributions


(b) Find P[(X, Y) ∈ A], where A is {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}.

P[(X, Y) ∈ A] = P(0 < X < 1/2, 1/4 < Y < 1/2)
 = ∫_{1/4}^{1/2} ∫_0^{1/2} (2/5)(2x + 3y) dx dy
 = ∫_{1/4}^{1/2} [ 2x²/5 + 6xy/5 ]_{x=0}^{x=1/2} dy
 = ∫_{1/4}^{1/2} (1/10 + 3y/5) dy
 = [ y/10 + 3y²/10 ]_{y=1/4}^{y=1/2}
 = (1/10)(1/2 + 3/4) − (1/10)(1/4 + 3/16)
 = 13/160
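Both parts can be verified with a short sympy double integral (added as a sketch, not from the slides):

    import sympy as sp

    x, y = sp.symbols("x y")
    f = sp.Rational(2, 5) * (2*x + 3*y)                    # joint density on the unit square

    total = sp.integrate(f, (x, 0, 1), (y, 0, 1))          # part (a): should be 1
    prob = sp.integrate(f, (x, 0, sp.Rational(1, 2)),
                           (y, sp.Rational(1, 4), sp.Rational(1, 2)))   # part (b)
    print(total, prob)                                     # 1 13/160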
Chapter 3.4 Joint Probability Distributions

Marginal Probability Distributions


 The marginal probability distribution functions of X alone and of Y alone are

g(x) = Σ_y f(x, y)   and   h(y) = Σ_x f(x, y)

for the discrete case, and

g(x) = ∫_{−∞}^{∞} f(x, y) dy   and   h(y) = ∫_{−∞}^{∞} f(x, y) dx

for the continuous case.

 The term marginal is used here because, in the discrete case, the values of g(x) and h(y) are just the marginal totals of the respective columns and rows when the values of f(x, y) are displayed in a rectangular table.
Chapter 3.4 Joint Probability Distributions

Marginal Probability Distributions


Show that the column and row totals of the joint probability table from the “ballpoint pens” example give the marginal distributions of X alone and of Y alone.

g(0) = Σ_{y=0}^{2} f(0, y) = f(0, 0) + f(0, 1) + f(0, 2) = 3/28 + 3/14 + 1/28 = 5/14
g(1) = Σ_{y=0}^{2} f(1, y) = f(1, 0) + f(1, 1) + f(1, 2) = 9/28 + 3/14 + 0 = 15/28
g(2) = Σ_{y=0}^{2} f(2, y) = f(2, 0) + f(2, 1) + f(2, 2) = 3/28 + 0 + 0 = 3/28

 The values of g(x) are just the column totals of the joint probability table.
 In a similar manner, the values of h(y) are given by the row totals.
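A sketch (added here) that computes both marginals from the joint pmf of the pens example; f is redefined so the snippet is self-contained:

    from math import comb
    from fractions import Fraction

    def f(x, y):                       # joint pmf of the pens example
        if 0 <= x + y <= 2:
            return Fraction(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))
        return Fraction(0)

    g = {x: sum(f(x, y) for y in range(3)) for x in range(3)}   # column totals
    h = {y: sum(f(x, y) for x in range(3)) for y in range(3)}   # row totals
    print(g)   # g(0) = 5/14, g(1) = 15/28, g(2) = 3/28
    print(h)   # h(0) = 15/28, h(1) = 3/7,  h(2) = 1/28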
Chapter 3.4 Joint Probability Distributions

Marginal Probability Distributions


Find g(x) and h(y) for the joint density function of the “drive-in walk-in facility” example.

g(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^1 (2/5)(2x + 3y) dy = [ 4xy/5 + 6y²/10 ]_{y=0}^{y=1} = (4x + 3)/5,  for 0 ≤ x ≤ 1

h(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^1 (2/5)(2x + 3y) dx = [ 2x²/5 + 6xy/5 ]_{x=0}^{x=1} = (2 + 6y)/5,  for 0 ≤ y ≤ 1
Chapter 3.4 Joint Probability Distributions

Conditional Probability Distributions


 Let X and Y be two random variables, discrete or continuous. The conditional probability distribution of the random variable Y, given that X = x, is

f(y | x) = f(x, y) / g(x),  provided g(x) > 0

Similarly, the conditional distribution of the random variable X, given that Y = y, is

f(x | y) = f(x, y) / h(y),  provided h(y) > 0
Chapter 3.4 Joint Probability Distributions

Conditional Probability Distributions


 If one wishes to find the probability that the discrete random variable X falls between a and b when it is known that the discrete variable Y = y, we evaluate

P(a < X < b | Y = y) = Σ_x f(x | y)

where the summation extends over all values of X between a and b.

 When X and Y are continuous, we find the probability that X lies between a and b by evaluating

P(a < X < b | Y = y) = ∫_a^b f(x | y) dx
Chapter 3.4 Joint Probability Distributions

Conditional Probability Distributions


Referring back to the “ballpoint pens” example, find the conditional
distribution of X, given that Y = 1, and use it to determine
P(X = 0 | Y = 1).

f(x | y) = f(x, y) / h(y)
f(x | 1) = f(x, 1) / h(1),  x = 0, 1, 2

With h(1) = 3/7:

f(0 | 1) = f(0, 1) / h(1) = (3/14) / (3/7) = 1/2
f(1 | 1) = f(1, 1) / h(1) = (3/14) / (3/7) = 1/2
f(2 | 1) = f(2, 1) / h(1) = 0 / (3/7) = 0

Therefore P(X = 0 | Y = 1) = f(0 | 1) = 1/2
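As a check (added, not from the slides), the same conditional distribution computed from the joint pmf:

    from math import comb
    from fractions import Fraction

    def f(x, y):                       # joint pmf of the pens example
        if 0 <= x + y <= 2:
            return Fraction(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))
        return Fraction(0)

    h1 = sum(f(x, 1) for x in range(3))            # h(1) = 3/7
    cond = {x: f(x, 1) / h1 for x in range(3)}     # f(x | Y = 1)
    print(cond)                                    # f(0|1) = 1/2, f(1|1) = 1/2, f(2|1) = 0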
Chapter 3.4 Joint Probability Distributions

Conditional Probability Distributions


Given the joint density function

f(x, y) = x(1 + 3y²)/4 for 0 < x < 2, 0 < y < 1, and f(x, y) = 0 elsewhere,

find g(x), h(y), f(x | y), and evaluate P(1/4 < X < 1/2 | Y = 1/3).

g(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^1 x(1 + 3y²)/4 dy = [ x(y + y³)/4 ]_{y=0}^{y=1} = x/2,  for 0 < x < 2

h(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^2 x(1 + 3y²)/4 dx = [ x²(1 + 3y²)/8 ]_{x=0}^{x=2} = (1 + 3y²)/2,  for 0 < y < 1

f(x | y) = f(x, y) / h(y) = [ x(1 + 3y²)/4 ] / [ (1 + 3y²)/2 ] = x/2,  for 0 < x < 2

P(1/4 < X < 1/2 | Y = 1/3) = ∫_{1/4}^{1/2} (x/2) dx = [ x²/4 ]_{1/4}^{1/2} = 3/64
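The same results via sympy (added as a sketch, not part of the slides):

    import sympy as sp

    x, y = sp.symbols("x y")
    f = x * (1 + 3*y**2) / 4                       # joint density on 0 < x < 2, 0 < y < 1

    g = sp.integrate(f, (y, 0, 1))                 # g(x) = x/2
    h = sp.integrate(f, (x, 0, 2))                 # h(y) = (1 + 3y**2)/2
    f_cond = sp.simplify(f / h)                    # f(x | y) = x/2, free of y
    prob = sp.integrate(f_cond, (x, sp.Rational(1, 4), sp.Rational(1, 2)))
    print(g, h, f_cond, prob)                      # g, h, f(x|y), and probability 3/64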
Chapter 3.4 Joint Probability Distributions

Statistical Independence
 Let X and Y be two random variables, discrete or continuous, with
joint probability distribution f(x, y) and marginal distributions g(x)
and h(y), respectively. The random variables X and Y are said to be
statistically independent if and only if
f(x, y) = g(x) h(y)
for all (x, y) within their range.
Chapter 3.4 Joint Probability Distributions

Statistical Independence
Consider the following joint probability density function of random
variables X and Y.

f(x, y) = (3x − y)/18 for 1 < x < 4, 1 < y < 2, and f(x, y) = 0 elsewhere
(a) Find the marginal density functions of X and Y
(b) Are X and Y statistically independent?
(c) Find P(X > 2|Y = 2)
(a) g(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_1^2 (3x − y)/18 dy = [ (6xy − y²)/36 ]_{y=1}^{y=2} = (6x − 3)/36 = (2x − 1)/12,  for 1 < x < 4

h(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_1^4 (3x − y)/18 dx = [ (3x² − 2xy)/36 ]_{x=1}^{x=4} = (45 − 6y)/36 = (15 − 2y)/12,  for 1 < y < 2
Chapter 3.4 Joint Probability Distributions

Statistical Independence
(b) Are X and Y statistically independent?

g(x) h(y) = [(2x − 1)/12] [(15 − 2y)/12] ≠ (3x − y)/18 = f(x, y)
 X and Y are not statistically independent.

(c) Find P(X > 2 | Y = 2)

P(X > 2 | Y = 2) = ∫_2^4 f(x | y = 2) dx = ∫_2^4 f(x, 2) / h(2) dx
 = ∫_2^4 [ (3x − 2)/18 ] / [ (15 − 4)/12 ] dx
 = (2/33) ∫_2^4 (3x − 2) dx
 = (2/33) [ 3x²/2 − 2x ]_{x=2}^{x=4}
 = 28/33
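A sympy sketch (added here; it assumes the joint density f(x, y) = (3x − y)/18 as reconstructed above) confirming parts (b) and (c):

    import sympy as sp

    x, y = sp.symbols("x y")
    f = (3*x - y) / 18                             # joint density on 1 < x < 4, 1 < y < 2

    g = sp.integrate(f, (y, 1, 2))                 # marginal of X
    h = sp.integrate(f, (x, 1, 4))                 # marginal of Y
    print(sp.simplify(g*h - f) == 0)               # False -> X and Y are not independent

    prob = sp.integrate(f.subs(y, 2) / h.subs(y, 2), (x, 2, 4))
    print(prob)                                    # P(X > 2 | Y = 2) = 28/33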
Chapter 3.4 Joint Probability Distributions

Statistical Independence
 Let X1, X2, ..., Xn be n random variables, discrete or continuous, with joint probability distribution f(x1, x2, ..., xn) and marginal distributions f1(x1), f2(x2), ..., fn(xn), respectively. The random variables X1, X2, ..., Xn are said to be mutually statistically independent if and only if

f(x1, x2, ..., xn) = f1(x1) f2(x2) ··· fn(xn)

for all (x1, x2, ..., xn) within their range.


Chapter 3.4 Joint Probability Distributions

Statistical Independence
Suppose that the shelf life, in years, of a certain perishable food
product packaged in cardboard containers is a random variable
whose probability density function is given by

f(x) = e^(−x) for x > 0, and f(x) = 0 elsewhere
Let X1, X2, and X3 represent the shelf lives for three of these
containers selected independently and find P(X1<2, 1<X2<3, X3>2)

f(x1, x2, x3) = f(x1) f(x2) f(x3) = e^(−x1) e^(−x2) e^(−x3) = e^(−x1 − x2 − x3)

P(X1 < 2, 1 < X2 < 3, X3 > 2) = ∫_2^∞ ∫_1^3 ∫_0^2 e^(−x1 − x2 − x3) dx1 dx2 dx3
 = (1 − e^(−2)) (e^(−1) − e^(−3)) (e^(−2))
 ≈ 0.0372
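A short sympy sketch (added as an illustration) of the same triple integral:

    import sympy as sp

    x1, x2, x3 = sp.symbols("x1 x2 x3")
    joint = sp.exp(-x1 - x2 - x3)                  # product of the three exponential densities

    prob = sp.integrate(joint, (x1, 0, 2), (x2, 1, 3), (x3, 2, sp.oo))
    print(prob, prob.evalf())                      # exact product of exponentials, about 0.0372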
Exercises
1. A game is played with the rule that a counter will move forward one, two,
or four places according to whether the scores on the two dice rolled
differ by three or more, by one or two, or are equal.
Here we define a random variable, M, the number of places moved, which
can take the value 1, 2, or 4. Determine the probability distribution of M.

2. Let the random variable X denote the time until a computer server
connects to your notebook (in milliseconds), and let Y denote the time
until the server authorizes you as a valid user (in milliseconds). Each of
these random variables measures the wait from a common starting time.
Assume that the joint probability density function for X and Y is
f(x, y) = 2 × 10⁻⁶ e^(−0.001x − 0.002y) for x ≥ 0, y ≥ 0, and f(x, y) = 0 elsewhere
(a) Show that X and Y are independent.

(b) Determine P(X > 1000, Y < 1000).
