02-Random Variables
A random variable (RV) X is a function that assigns a real number X(ω) to each outcome ω in the sample space Ω of a random experiment.
Its domain D is the sample space Ω and its range R_X is the set of all values taken on by X, which is a subset of the real numbers.
A random variable is denoted by a capital letter (such as X, Y or W) and any particular value it takes by a lowercase letter such as x, y or w.
It is important because it provides a compact way of referring to
events via their numerical attributes. For example, if X models
the number of visits to a website, it is much easier to write
P(X > 1000) than to write P(number of visits > 1000).
Conditions for a function to be an RV
For a function X to be an RV:
1. It must not be multi-valued, i.e., every point in Ω must correspond to only one value of the RV (the mapping may be one-to-one or many-to-one).
2. The set {X ≤ x} must be an event for any real number x. This set corresponds to those points ω in Ω for which the RV X(ω) does not exceed the number x.
The probability of this event, P{X ≤ x}, equals the sum of the probabilities of all the elementary events corresponding to {X ≤ x}. It is called the cumulative distribution function (CDF).
3. P{X = +∞} = 0 and P{X = −∞} = 0, i.e., the outcomes have zero probability of being infinite.
Example:
Consider a random experiment of tossing a fair coin 3 times. The
sequence of heads and tails is noted and the sample space Ω is
given by: {HHH , HHT , HTH , THH , THT , HTT , TTH , TTT}
Let X be the number of heads in three coin tosses.
X assigns each possible outcome ω in the sample space Ω a
number from the set RX={0, 1, 2, 3}.
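As a concrete illustration (a short Python sketch added here, not part of the original slides), the mapping ω ↦ X(ω) for the three-toss experiment can be enumerated directly:

```python
from itertools import product

# Sample space of three coin tosses and the RV X(w) = number of heads
omega = ["".join(t) for t in product("HT", repeat=3)]
X = {w: w.count("H") for w in omega}

print(X)                        # e.g. {'HHH': 3, 'HHT': 2, ...}
print(sorted(set(X.values())))  # range R_X = [0, 1, 2, 3]
```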
The Cumulative Distribution Function
The cdf of a random variable X is defined as F_X(x) = P(X ≤ x) and has the following properties:
i. 0 ≤ F_X(x) ≤ 1
ii. lim_{x→∞} F_X(x) = 1
iii. lim_{x→−∞} F_X(x) = 0
The Cumulative Distribution Function Cont’d…..
iv. F_X(x) is a non-decreasing function of x, i.e., if x_1 < x_2, then F_X(x_1) ≤ F_X(x_2)
v. P(x_1 < X ≤ x_2) = F_X(x_2) − F_X(x_1)
vi. P(X > x) = 1 − F_X(x)
Example: Find the cdf of the random variable X which is
defined as the number of heads in three tosses of a fair coin.
Solution:
We know that X takes on only the values 0, 1, 2 and 3 with
probabilities 1/8, 3/8, 3/8 and 1/8 respectively.
Thus, FX(x) is simply the sum of the probabilities of the outcomes
from the set {0, 1, 2, 3} that are less than or equal to x.
F_X(x) = { 0,   x < 0
           1/8, 0 ≤ x < 1
           1/2, 1 ≤ x < 2
           7/8, 2 ≤ x < 3
           1,   x ≥ 3 }
Types of Random Variables
There are two basic types of random variables.
i. Continuous Random Variable
A random variable whose cdf, FX(x), is continuous everywhere and can be written as an integral of some non-negative function f(x), i.e.,
F_X(x) = ∫_{−∞}^{x} f(u) du
ii. Discrete Random Variable
A random variable that takes on only a countable set of values; its cdf changes only in jumps at those values and is constant between them.
The Probability Mass Function
The probability mass function (pmf) of a discrete random variable X is defined as:
P_X(x_i) = P(X = x_i) = F_X(x_i) − F_X(x_{i−1})
It satisfies:
i. 0 ≤ P_X(x_k) ≤ 1, k = 1, 2, ...
ii. P_X(x) = 0, if x ≠ x_k, k = 1, 2, ...
iii. Σ_k P_X(x_k) = 1
Calculating the Cumulative Distribution Function
The cdf of a continuous random variable X can be obtained by integrating the pdf, i.e.,
F_X(x) = ∫_{−∞}^{x} f_X(u) du
For a discrete random variable, the cdf is obtained by summing the pmf weighted by unit step functions:
F_X(x) = Σ_k P_X(x_k) u(x − x_k)
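A small sketch of the discrete-case formula (assuming the fair-coin example above, so P_X(k) = C(3, k)/8); this is an added illustration, not from the slides:

```python
from fractions import Fraction
from math import comb

# pmf of X = number of heads in 3 fair-coin tosses
pmf = {k: Fraction(comb(3, k), 8) for k in range(4)}

def u(t):
    """Unit step function: u(t) = 1 for t >= 0, else 0."""
    return 1 if t >= 0 else 0

def cdf(x):
    """F_X(x) = sum_k P_X(x_k) * u(x - x_k)."""
    return sum(p * u(x - k) for k, p in pmf.items())

for x in (-1, 0, 0.5, 1, 2, 3, 4):
    print(x, cdf(x))   # reproduces the staircase 0, 1/8, 1/8, 1/2, 7/8, 1, 1
```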
Expected Value, Variance and Moments
I. Expected Value (Mean)
The expected value (mean) of a continuous random variable X,
denoted by μX or E(X), is defined as:
μ_X = E(X) = ∫_{−∞}^{∞} x f_X(x) dx
II. Variance
The variance σ_X² of a discrete random variable X is given by:
σ_X² = Var(X) = Σ_k (x_k − μ_X)² P_X(x_k)
Expected Value, Variance and Moments Cont’d…..
The standard deviation of a random variable X, denoted by σX, is
simply the square root of the variance, i.e.,
σ_X = √(E[(X − μ_X)²]) = √(Var(X))
III. Moments
The nth moment of a continuous random variable X is defined as:
E(X^n) = ∫_{−∞}^{∞} x^n f_X(x) dx,  n = 1, 2, ...
Example 1: A continuous random variable X has the pdf
f_X(x) = { kx, 0 < x < 1
           0,  otherwise }
where k is a constant.
a. Determine the value of k.
b. Find the corresponding cdf of X.
c. Find P(1/4 < X ≤ 1).
d. Evaluate the mean and variance of X.
Random Variable Examples Cont’d……
Solution:
a. ∫_0^1 f_X(x) dx = 1
∫_0^1 kx dx = 1
k [x²/2]_0^1 = 1
k/2 = 1
k = 2
Thus,
f_X(x) = { 2x, 0 < x < 1
           0,  otherwise }
Random Variable Examples Cont’d……
Solution:
b. The cdf of X is given by:
F_X(x) = ∫_{−∞}^{x} f_X(u) du
Case 1: for x < 0,
F_X(x) = 0, since f_X(x) = 0 for x < 0
Case 2: for 0 ≤ x < 1,
F_X(x) = ∫_0^x f_X(u) du = ∫_0^x 2u du = [u²]_0^x = x²
Random Variable Examples Cont’d……
Solution:
Case 3: for x ≥ 1,
F_X(x) = ∫_0^1 f_X(u) du = ∫_0^1 2u du = [u²]_0^1 = 1
The cdf is given by
F_X(x) = { 0,  x < 0
           x², 0 ≤ x < 1
           1,  x ≥ 1 }
Random Variable Examples Cont’d……
Solution:
c. P(1/4 < X ≤ 1)
i. Using the pdf:
P(1/4 < X ≤ 1) = ∫_{1/4}^{1} f_X(x) dx = ∫_{1/4}^{1} 2x dx = [x²]_{1/4}^{1}
P(1/4 < X ≤ 1) = 1 − 1/16 = 15/16
ii. Using the cdf:
P(1/4 < X ≤ 1) = F_X(1) − F_X(1/4) = 1 − (1/4)² = 15/16
Random Variable Examples Cont’d……
Solution:
d. Mean and Variance
i. Mean:
μ_X = E(X) = ∫_0^1 x f_X(x) dx = ∫_0^1 2x² dx = [2x³/3]_0^1 = 2/3
ii. Variance:
σ_X² = Var(X) = E(X²) − [E(X)]²
E(X²) = ∫_0^1 x² f_X(x) dx = ∫_0^1 2x³ dx = 1/2
σ_X² = Var(X) = 1/2 − (2/3)² = 1/18
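The whole of Example 1 can be checked symbolically; the following SymPy sketch is an added verification, not part of the original slides:

```python
import sympy as sp

x, u, k = sp.symbols('x u k', positive=True)

# a. Normalization: the pdf k*x must integrate to 1 over (0, 1)  ->  k = 2
k_val = sp.solve(sp.Eq(sp.integrate(k * x, (x, 0, 1)), 1), k)[0]
f = k_val * x                                              # pdf 2x on (0, 1)

# b. cdf for 0 <= x < 1
F = sp.integrate(f.subs(x, u), (u, 0, x))                  # x**2

# c. P(1/4 < X <= 1), via the pdf and via the cdf
p_pdf = sp.integrate(f, (x, sp.Rational(1, 4), 1))         # 15/16
p_cdf = 1 - F.subs(x, sp.Rational(1, 4))                   # 15/16

# d. Mean and variance
mean = sp.integrate(x * f, (x, 0, 1))                      # 2/3
var = sp.integrate(x**2 * f, (x, 0, 1)) - mean**2          # 1/18

print(k_val, F, p_pdf, p_cdf, mean, var)
```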
Example 2: Let the random variable X have cdf
Find the density and sketch both the cdf and pdf.
Solution
Random Variable Examples Cont’d……..
Random Variable Examples Cont’d……
Solution:
i. Mean:
μ_X = E(X) = Σ_{k=−1}^{1} x_k P_X(x_k) = (1/3)(−1 + 0 + 1) = 0
ii. Variance:
σ_X² = Var(X) = E(X²) − [E(X)]²
E(X²) = Σ_{k=−1}^{1} x_k² P_X(x_k) = (1/3)[(−1)² + (0)² + (1)²] = 2/3
σ_X² = Var(X) = 2/3 − (0)² = 2/3
Example 3 (mixed r.v.): Consider the generalized density
Exercises
1. A random variable X has generalized density
where u is the unit step function, and δ is the Dirac delta function.
(a) Sketch f (t).
(b) Compute P(X = 0) and P(X = 1).
(c) Compute P(0 < X < 1) and P(X > 1).
(d) Use your above results to compute P(0 ≤ X ≤ 1) & P(X ≥ 1).
(e) Compute E[X].
3. The continuous random variable X has the pdf given by:
5. A r.v. X is defined by the cdf
2. Binomial Distribution
A r.v. X is called a binomial r.v. with parameters (n, p) if its pmf is
P(X = k) = C(n, k) p^k q^(n−k),  k = 0, 1, 2, ..., n,  where q = 1 − p.
It is associated with some experiments in which n
independent Bernoulli trials are performed and X represents
the number of successes that occur in the n trials.
A Bernoulli r.v. is just a binomial r.v. with parameters (1, p).
Its mean and variance are E(X) = np and Var(X) = npq.
Example (Binomial):
A homeowner has just installed 20 light bulbs in a new home.
Suppose that each has a probability 0.2 of functioning more than
three months. (a)What is the probability that at least five of these
function more than three months? (b)What is the average number
of bulbs the homeowner has to replace in three months?
Solution: It is reasonable to assume that the light bulbs perform
independently. If X is the number of bulbs functioning more than
three months (success), it has a binomial distribution with n=20
and p=0.2.
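Both answers can be checked numerically; a sketch using only the Python standard library (the values follow from the Bin(20, 0.2) model):

```python
from math import comb

n, p = 20, 0.2   # 20 bulbs, each functions > 3 months with probability 0.2

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Bin(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# (a) P(at least 5 bulbs function more than three months)
p_at_least_5 = 1 - sum(binom_pmf(k, n, p) for k in range(5))
print(round(p_at_least_5, 4))   # about 0.37

# (b) Average number of bulbs to replace = expected number of failures
print(n * (1 - p))              # 16.0
```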
EXAMPLE: A communications system consists of n components,
each of which will, independently, function with probability p. The
total system will be able to operate effectively if at least one-half of
its components function.
(a) For what values of p is a 5-component system more likely to
operate effectively than a 3-component system?
(b) In general, when is a 2k + 1 component system better than a 2k −
1 component system?
SOLUTION: (a) Because the number of functioning
components is a binomial random variable with parameters
(n, p), it follows that the probability that a 5-component system
will be effective is
Σ_{k=3}^{5} C(5, k) p^k (1 − p)^(5−k).
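The comparison can be carried out numerically; a sketch, assuming "operate effectively" means at least 3 of 5 (respectively at least 2 of 3) components function:

```python
from math import comb

def reliability(n, p):
    """P(at least (n//2 + 1) of n independent components function), n odd."""
    need = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(need, n + 1))

for p in (0.3, 0.5, 0.7):
    print(p, reliability(5, p), reliability(3, p))
# Comparing the columns shows the 5-component system is better exactly
# when p > 1/2 (the two systems tie at p = 1/2).
```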
3. Poisson Distribution
A r.v. X is called a Poisson r.v. with parameter λ(>0) if its pmf is
given by
P(X = k) = e^(−λ) λ^k / k!,  k = 0, 1, 2, ...
It may be used as an approximation for a binomial r.v. with
parameters (n, p) when n is large and p is small enough so
that np is of moderate size.
Some examples of Poisson r.v.'s include
I. number of telephone calls arriving at a switching center during various time
intervals
II. The number of misprints on a page of a book
III. The number of customers entering a bank during various intervals of time
IV. photoelectric effect and radioactive decay
V. computer message traffic arriving at a queue for transmission.
The mean and variance of the Poisson r.v. X are both equal to λ.
Example (Poisson):
Suppose that the probability of a transistor manufactured by a certain firm being defective is 0.015. What is the probability that there is no defective transistor in a batch of 100?
Solution: Let X be the number of defective transistors in 100. The desired probability (binomial) is
P(X = 0) = (1 − 0.015)^100.
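A quick check comparing the exact binomial value with the Poisson approximation (λ = np = 1.5); an illustrative sketch:

```python
from math import exp

n, p = 100, 0.015
lam = n * p                       # 1.5

p_binomial = (1 - p) ** n         # exact P(no defective)
p_poisson = exp(-lam)             # Poisson approximation, k = 0 term

print(round(p_binomial, 4), round(p_poisson, 4))   # about 0.2206 vs 0.2231
```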
Example (Geometric):
A driver is eagerly eyeing a precious parking space some
distance down the street. There are five cars in front of the
driver, each of which has a probability 0.2 of taking the
space. What is the probability that the car immediately
ahead will enter the parking space?
Solution:
For this problem, we have a geometric distribution and need
to evaluate it with k = 5 and p = 0.2. Thus, P(X = 5) = (0.8)^4 (0.2) = 0.08192.
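A one-line numerical check, using the geometric pmf P(X = k) = (1 − p)^(k−1) p (a sketch added here):

```python
p, k = 0.2, 5

# P(first success on trial k): the four cars ahead pass, the fifth one parks
prob = (1 - p) ** (k - 1) * p
print(prob)    # 0.08192
```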
5. Hypergeometric Distribution
The hypergeometric random variable arises in the following
situation. We have a collection of N items, d of which are
defective. Rather than test all N items, we select at random a
small number of items, say n < N. Let X denote the number of
defectives out of the n items tested. We show that
P(X = k) = C(d, k) C(N − d, n − k) / C(N, n),  k = 0, 1, ..., n.
Exercise: The components of a 6-component system are to be
randomly chosen from a bin of 20 used components. The
resulting system will be functional if at least 4 of its
6 components are in working condition. If 15 of the 20
components in the bin are in working condition, what is the
probability that the resulting system will be functional?
Ans: 0.8687
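The stated answer can be reproduced with the hypergeometric pmf (N = 20 components, 15 of them working, n = 6 drawn); a sketch:

```python
from math import comb

N, good, n = 20, 15, 6   # 20 components in the bin, 15 working, choose 6

# P(at least 4 of the 6 chosen components are working)
prob = sum(comb(good, k) * comb(N - good, n - k) for k in range(4, n + 1)) / comb(N, n)
print(round(prob, 4))    # 0.8687
```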
6. Negative Binomial Distribution
A natural generalization of the geometric distribution is the
distribution of random variable X representing the number of
Bernoulli trials necessary for the rth success to occur, where r is a
given positive integer.
In order to determine p_X(k) for this case, let A be the event that the first k − 1 trials yield exactly r − 1 successes, regardless of their order, and B the event that a success turns up at the kth trial. Then, owing to independence,
p_X(k) = P(A)P(B) = C(k − 1, r − 1) p^(r−1) (1 − p)^(k−r) · p = C(k − 1, r − 1) p^r (1 − p)^(k−r),  k = r, r + 1, ...
Example (Negative Binomial):
A curbside parking facility has a capacity for three cars.
Determine the probability that it will be full within 10
minutes. It is estimated that 6 cars will pass this parking
space within the time span and, on average, 80% of all cars
will want to park there.
Solution: The desired probability is simply the probability
that the number of trials to the third success (taking the
parking space) is less than or equal to 6. If X is this number,
it has a negative binomial distribution with r =3 and p =0.8.
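A numerical sketch of this computation, using the negative binomial pmf P(X = k) = C(k−1, r−1) p^r (1−p)^(k−r) given above:

```python
from math import comb

r, p, trials = 3, 0.8, 6   # third parking "success" within 6 passing cars

# P(X <= 6): the third success occurs on trial k = 3, 4, 5 or 6
prob = sum(comb(k - 1, r - 1) * p**r * (1 - p)**(k - r) for k in range(r, trials + 1))
print(round(prob, 5))      # 0.98304
```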
Some Special Distributions and Their Applications
II. Continuous Probability Distributions
1. Uniform Distribution
When an experiment results in outcomes that are “equally likely” or “totally random” over a finite interval, we model it with a uniform random variable.
The pdf of X, which is constant over the interval (a, b), and the corresponding cdf have the form
f_X(x) = { 1/(b − a), a < x < b
           0,         otherwise }
F_X(x) = { 0,               x < a
           (x − a)/(b − a), a ≤ x < b
           1,               x ≥ b }
Example 1 (Uniform Distribution)
Owing to unpredictable traffic situations, the time required by a
certain student to travel from her home to her morning class
is uniformly distributed between 22 and 30 minutes.
If she leaves home at precisely 7:35 a.m., what is the
probability that she will not be late for class, which begins
promptly at 8:00 a.m.?
Solution: Let X be the class arrival time of the student in minutes after 8:00 a.m. It then has a uniform distribution given by f_X(x) = 1/8 for −3 < x < 5 and 0 otherwise, since the 22–30 minute trip starts at 7:35 a.m.
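A short check (sketch): "not late" means X ≤ 0, so the answer is the uniform cdf evaluated at 0.

```python
a, b = -3.0, 5.0   # arrival time (minutes after 8:00 a.m.) is uniform on (a, b)

def uniform_cdf(x, a, b):
    """cdf of a Uniform(a, b) random variable."""
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

print(uniform_cdf(0.0, a, b))   # P(not late) = P(X <= 0) = 3/8 = 0.375
```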
Example 2 (Uniform Distribution)
Solution:
2. Exponential Distribution
RV X is called exponential written
f ∼ exp(λ ) with parameter λ >
0 if
Example 2(Exponential)
All manufactured devices and machines fail to work sooner or
later. Suppose that the failure rate is constant and the time to
failure (in hours) is an exponential r.v. X with parameter λ.
Measurements show that the probability that the time to failure for computer memory chips in a given class exceeds 10^4 hours is 0.368.
(a) Calculate the value of the parameter λ.
(b) Using the value of the parameter λ determined in part (a), calculate the time x0 such that the probability that the time to failure is less than x0 is 0.05.
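Both parts can be computed numerically from P(X > x) = e^(−λx); a sketch:

```python
from math import log

# (a) P(X > 1e4) = exp(-lam * 1e4) = 0.368  ->  solve for lam
lam = -log(0.368) / 1e4
print(lam)            # about 1e-4 per hour

# (b) P(X < x0) = 1 - exp(-lam * x0) = 0.05  ->  x0 = -ln(0.95) / lam
x0 = -log(0.95) / lam
print(round(x0))      # about 513 hours
```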
3. Laplace / double-sided exponential
For λ > 0, we write f ∼ Laplace(λ) if its pdf is
f(x) = (λ/2) e^(−λ|x|),  −∞ < x < ∞
Solution. The desired probability can be written as
P({−3 ≤ X ≤−2}∪{0 ≤ X ≤ 3}).
Since these are disjoint events, the probability of the union
is the sum of the individual probabilities.
We therefore need to compute P(−3 ≤ X ≤−2) and P(0 ≤ X ≤ 3).
4. Cauchy Distribution
The pdf of a Cauchy random variable X∼ Cauchy(λ ) with
parameter λ > 0 is given by
5. Gaussian or normal distribution
The most important density is the Gaussian or normal. For σ² > 0, we write X ∼ N(m, σ²) if its pdf is given by
f(x) = (1/√(2πσ²)) exp(−(x − m)²/(2σ²)),  −∞ < x < ∞
Due to the central limit theorem, the Gaussian density is a good
approximation for computing probabilities involving a sum of
many independent random variables. For example, let
X = X1+· · ·+Xn,
where the Xi are i.i.d. with common mean m and common
variance σ². For large n, if the Xi are continuous random variables, then X is approximately N(nm, nσ²); in particular, P(X ≤ y) ≈ Φ((y − nm)/(σ√n)), where Φ is the standard normal cdf.
Similarly, for any a < b, P(a < X ≤ b) ≈ Φ((b − nm)/(σ√n)) − Φ((a − nm)/(σ√n)).
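As an illustration of this approximation (a simulation sketch added here, not from the slides; the Uniform(0, 1) terms, n = 30 and the interval (14, 17] are arbitrary choices):

```python
import random
from math import erf, sqrt

def phi(z):
    """Standard normal cdf."""
    return 0.5 * (1 + erf(z / sqrt(2)))

n = 30                       # number of i.i.d. terms
m, var = 0.5, 1 / 12         # mean and variance of Uniform(0, 1)
a, b = 14, 17                # interval for the sum X = X1 + ... + Xn

# Gaussian approximation: X is approximately N(n*m, n*var)
mu, sigma = n * m, sqrt(n * var)
approx = phi((b - mu) / sigma) - phi((a - mu) / sigma)

# Monte Carlo estimate of the same probability
random.seed(0)
trials = 100_000
hits = sum(a < sum(random.random() for _ in range(n)) <= b for _ in range(trials))

print(round(approx, 3), round(hits / trials, 3))   # the two values should be close
```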
Example 1 (normal): If X is a normal random variable
with mean m = 3 and variance σ2 = 16, find
(a) P{X < 11}; (b) P{X > −1}; (c) P{2 < X < 7}.
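These probabilities reduce to values of the standard normal cdf Φ after standardizing with m = 3 and σ = 4; a sketch of the computation:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal cdf."""
    return 0.5 * (1 + erf(z / sqrt(2)))

m, sigma = 3, 4   # mean 3, variance 16

print(round(phi((11 - m) / sigma), 4))                        # (a) P(X < 11)  = Phi(2)  ~ 0.9772
print(round(1 - phi((-1 - m) / sigma), 4))                    # (b) P(X > -1)  = Phi(1)  ~ 0.8413
print(round(phi((7 - m) / sigma) - phi((2 - m) / sigma), 4))  # (c) P(2 < X < 7)         ~ 0.44
```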
Example 2 (normal): A production line manufactures 1000-ohm
(R) resistors that have 10 percent tolerance. Let X denote the
resistance of a resistor. Assuming that X is a normal r.v. with
mean 1000 and variance 2500, find the probability that a resistor
picked at random will be rejected.
Solution: Let A be the event that a resistor is rejected. Then A = {X < 900} ∪ {X > 1100}. Since {X < 900} ∩ {X > 1100} = ∅, we have
P(A) = P(X < 900) + P(X > 1100).
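The rejection probability can then be evaluated with the standard normal cdf (here σ = 50, so 900 Ω and 1100 Ω lie two standard deviations from the mean); a sketch:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal cdf."""
    return 0.5 * (1 + erf(z / sqrt(2)))

m, sigma = 1000, 50   # mean 1000 ohms, variance 2500

# P(A) = P(X < 900) + P(X > 1100), the two events being disjoint
p_reject = phi((900 - m) / sigma) + (1 - phi((1100 - m) / sigma))
print(round(p_reject, 4))   # about 0.0455
```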
Location & scale parameters and
the gamma densities
6. Gamma Distribution
An important application of the scale parameter arises with
the basic gamma density with parameter p > 0. This density is
given by
g_p(x) = x^(p−1) e^(−x) / Γ(p),  x > 0,
where Γ(p) = ∫_0^∞ x^(p−1) e^(−x) dx is the gamma function.
in a seven-digit sequence?
(b) What is the probability that at least three 1s will occur in a
seven-digit sequence?
2. A noisy transmission channel has a per-digit error
probability p = 0.03.
(a) Calculate the probability of more than one error in 10
received digits.
(b) Repeat (a), using the Poisson approximation
3. It is known that the floppy disks produced by company A
will be defective with probability 0.01. The company sells the
disks in packages of 10 and offers a guarantee of replacement
if more than 1 of the 10 disks is defective. What proportion of
packages is returned? If someone buys three packages, what is
the probability that exactly one of them will be returned?
4. The radial miss distance [in meters (m)] of the landing point of