
Probability Lectures


Uniform distribution. Probability density functions. Random variables. Independence

Dr. Stephen Edward Mwaijande
FTC (MUST), B.Sc.Ed (Hons) (UDSM), M.Sc (NM-AIST), Ph.D (UDSM)
Department of Mathematics and Statistics, The University of Dodoma


After developing a background in probabilistic models with discrete outcomes, we can now progress further and work through exercises where uncountably many outcomes are explicitly involved.
Here, the events are associated with subsets of a continuous space (the real line R, an interval (a, b), the plane R², a square, etc.). The simplest case is where the outcome space Ω is represented by a 'nice' bounded set and the probability distribution corresponds to a unit mass spread uniformly over it. Then an event (i.e. a subset) A ⊆ Ω acquires the probability
$$ P(A) = \frac{\nu(A)}{\nu(\Omega)}, \qquad (1) $$
where ν(A) is the standard Euclidean volume (or area, or length) of A.
More generally, the probability of an event may be expressed as an integral:
$$ P(A) = \int_A f(x)\,dx. \qquad (2) $$
Here f is a given PDF, f ≥ 0, with
$$ \int f(x)\,dx = 1. \qquad (3) $$
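To make equation (1) concrete, here is a minimal numerical sketch, not part of the original slides (it assumes numpy is available): the fraction of uniformly scattered points that land in A estimates ν(A)/ν(Ω).

```python
# Monte Carlo estimate of P(A) = nu(A)/nu(Omega) from equation (1),
# with Omega the unit square and A the inscribed disc of radius 1/2.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
points = rng.uniform(0.0, 1.0, size=(n, 2))        # unit mass spread over Omega
in_A = ((points - 0.5) ** 2).sum(axis=1) <= 0.25   # membership in the disc A
print(in_A.mean())   # ~ nu(A)/nu(Omega) = pi/4 ≈ 0.7854
```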


Uniform Distribution

A continuous random variable X is said to have a Uniform distribution over the interval [a, b], written X ∼ Uniform(a, b), if its PDF is given by
$$ f_X(x) = \begin{cases} \dfrac{1}{b-a}, & a < x < b, \\[4pt] 0, & x \le a \ \text{or} \ x \ge b. \end{cases} \qquad (4) $$


The corresponding CDF is
$$ F_X(y) = \begin{cases} 0, & y \le a, \\[2pt] \dfrac{y-a}{b-a}, & a < y < b, \\[4pt] 1, & y \ge b. \end{cases} \qquad (5) $$

Its mean and variance are given respectively by
$$ E(X) = \frac{a+b}{2}, \qquad (6) $$
$$ Var(X) = \frac{(b-a)^2}{12}. \qquad (7) $$
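A small sketch, added here for illustration (it assumes scipy is available), checking (5)–(7) for the concrete choice a = 2, b = 5; scipy parameterises Uniform(a, b) via loc = a and scale = b − a.

```python
from scipy import stats

a, b = 2.0, 5.0
X = stats.uniform(loc=a, scale=b - a)
print(X.mean(), (a + b) / 2)            # (6): both 3.5
print(X.var(), (b - a) ** 2 / 12)       # (7): both 0.75
print(X.cdf(4.0), (4.0 - a) / (b - a))  # (5): both 2/3
```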


Normal (Gaussian) Distribution

The normal distribution is by far the most important probability distribution. One of the main reasons for this is the Central Limit Theorem (CLT), which we will discuss later in the book. To give you an idea, the CLT states that if you add a large number of random variables, the distribution of the sum will be approximately normal under certain conditions. The importance of this result comes from the fact that many random variables in real life can be expressed as the sum of a large number of random variables and, by the CLT, we can argue that the distribution of the sum should be normal. The CLT is one of the most important results in probability, and we will return to it later.

We first define the general normal random variable. The PDF of an N(µ, σ²) RV X is
$$ f_X(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), \quad x \in \mathbb{R}, \qquad (8) $$
with mean and variance E(X) = µ, Var(X) = σ², and MGF and CHF
$$ E e^{\theta X} = e^{\theta\mu + \frac{1}{2}\theta^2\sigma^2}, \qquad E e^{itX} = e^{it\mu - \frac{1}{2}t^2\sigma^2}, \quad \theta, t \in \mathbb{R}. $$
If X ∼ N(µ, σ²), then (X − µ)/σ ∼ N(0, 1), and ∀ b, c ∈ ℝ: cX + b ∼ N(cµ + b, c²σ²).


The Gaussian distribution also has a CDF:
$$ F_X(y) = \int_{-\infty}^{y} \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) dx = \Phi\!\left(\frac{y-\mu}{\sigma}\right), \quad y \in \mathbb{R}. \qquad (9) $$
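A hedged numerical sketch of equation (9), added for illustration (it assumes scipy is available): the CDF of N(µ, σ²) equals Φ((y − µ)/σ), where Φ is the standard normal CDF.

```python
from scipy import stats

mu, sigma, y = 1.0, 2.0, 3.0
lhs = stats.norm(loc=mu, scale=sigma).cdf(y)   # F_X(y) directly
rhs = stats.norm.cdf((y - mu) / sigma)         # Phi((y - mu)/sigma)
print(lhs, rhs)                                # both ≈ 0.8413
```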


We will then see that we can obtain other normal random variables by scaling and shifting a standard normal random variable.
A continuous random variable Z is said to be a standard normal (standard Gaussian) random variable, written Z ∼ N(0, 1), if its PDF is given by
$$ f_Z(z) = \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{1}{2}z^2\right), \quad z \in \mathbb{R}. \qquad (10) $$
The factor 1/√(2π) is there to make sure that the area under the PDF is equal to one.

Note that, in all calculations involving PDFs, sets C with
$$ \int_C dx = 0 \qquad (11) $$
(sets of measure 0) can be disregarded. Therefore, the probabilities P(a ≤ X ≤ b) and P(a < X < b) coincide. (This is, of course, not true for discrete RVs.)
The median m(X) of an RV X gives the value that 'divides' the range of X into two pieces of equal mass. In terms of the CDF:
$$ m(X) = \min\left\{ y : F_X(y) \ge \tfrac{1}{2} \right\}. \qquad (12) $$
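As a quick worked example, added here for illustration: for X ∼ Uniform(a, b), the defining condition F_X(m) = 1/2 in (12) gives
$$ \frac{m-a}{b-a} = \frac{1}{2} \quad\Longrightarrow\quad m(X) = \frac{a+b}{2}, $$
so for the uniform distribution the median coincides with the mean.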


Exponential Distribution

The exponential distribution is one of the most widely used continuous distributions. It is often used to model the time elapsed between events. We will now mathematically define the exponential distribution and derive its mean and variance. Then we will develop the intuition for the distribution and discuss several interesting properties that it has.


A continuous random variable X is said to have an exponential distribution with parameter λ > 0, written X ∼ Exponential(λ), if its PDF is given by
$$ f_X(x) = \begin{cases} \lambda e^{-\lambda x}, & x > 0, \\ 0, & \text{otherwise,} \end{cases} \qquad (13) $$
and its CDF is given by
$$ F_X(x) = \begin{cases} 0, & x \le 0, \\ 1 - e^{-\lambda x}, & x > 0. \end{cases} \qquad (14) $$


We can find its expected value as follows, using integration by parts:
$$ E(X) = \int_0^\infty x f(x)\,dx = \int_0^\infty x\,\lambda e^{-\lambda x}\,dx = \frac{1}{\lambda}, $$
and its variance is
$$ Var(X) = \frac{1}{\lambda^2}. $$
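For completeness, the integration-by-parts step, spelled out here (with u = x and dv = λe^{−λx} dx, so v = −e^{−λx}), reads
$$ \int_0^\infty x\,\lambda e^{-\lambda x}\,dx = \Big[-x e^{-\lambda x}\Big]_0^\infty + \int_0^\infty e^{-\lambda x}\,dx = 0 + \frac{1}{\lambda} = \frac{1}{\lambda}. $$
The variance then follows from a second integration by parts, which gives E(X²) = 2/λ², so that Var(X) = E(X²) − [E(X)]² = 2/λ² − 1/λ² = 1/λ².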


Gamma Distribution

The gamma distribution is another widely used distribution. Its importance is largely due to its relation to the exponential and normal distributions. Here, we will provide an introduction to the gamma distribution; interested readers may explore further properties of gamma random variables. Before introducing the gamma random variable, we need to introduce the gamma function.


Gamma function: The gamma function, denoted Γ(x), is an extension of the factorial function to real (and complex) numbers. Specifically, if n ∈ {1, 2, 3, ...}, then
$$ \Gamma(n) = (n-1)! \qquad (15) $$
More generally, for any positive real number α, Γ(α) is defined as
$$ \Gamma(\alpha) = \int_0^\infty x^{\alpha-1} e^{-x}\,dx, \quad \alpha > 0. \qquad (16) $$

A Gamma(α, λ) RV X has PDF
$$ f_X(x) = \frac{\lambda^\alpha}{\Gamma(\alpha)}\, x^{\alpha-1} e^{-\lambda x}, \quad x > 0, \qquad (17) $$
and CDF
$$ F_X(y) = \frac{\lambda^\alpha}{\Gamma(\alpha)} \int_0^y x^{\alpha-1} e^{-\lambda x}\,dx, \quad y > 0. \qquad (18) $$


For integrals over (0, ∞) with an integrand consisting of a power term and an exponential term, one should try transforming the integral into a Gamma function, defined as
$$ \Gamma(p) = \int_0^\infty x^{p-1} e^{-x}\,dx. \qquad (19) $$
Given p, Γ(p) can be found from tables or by means of a computer program. Some useful properties of the Gamma function are:
$$ \Gamma(p+1) = p\,\Gamma(p), \qquad \Gamma(n+1) = n!, \qquad \Gamma\!\left(\tfrac{1}{2}\right) = \sqrt{\pi}, \qquad (20) $$
for non-negative integer n.
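As a worked illustration of this transformation, added here (it also explains the normalising constant in (17)): substituting t = λx gives
$$ \int_0^\infty x^{p-1} e^{-\lambda x}\,dx = \int_0^\infty \left(\frac{t}{\lambda}\right)^{p-1} e^{-t}\,\frac{dt}{\lambda} = \frac{\Gamma(p)}{\lambda^{p}}, $$
so the Gamma PDF in (17) indeed integrates to one.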

Check for the Gamma distribution: its mean is
$$ E(X) = \frac{\alpha}{\lambda}, $$
and its variance is
$$ Var(X) = \frac{\alpha}{\lambda^2}. $$
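A quick numerical check of these formulas, added for illustration (it assumes scipy is available); scipy's gamma takes shape a = α and scale = 1/λ.

```python
from scipy import stats

alpha, lam = 3.0, 2.0
X = stats.gamma(a=alpha, scale=1.0 / lam)
print(X.mean(), alpha / lam)        # both 1.5
print(X.var(), alpha / lam ** 2)    # both 0.75
```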


Cauchy Distribution

A Cauchy RV has PDF
$$ f_X(y) = \frac{\tau}{\pi\left(\tau^2 + (y-\alpha)^2\right)}, \quad y \in \mathbb{R}, \qquad (21) $$
(the factor 1/π is needed so that the density integrates to one) and CDF
$$ F_X(y) = \frac{1}{\pi}\left[\tan^{-1}\!\left(\frac{y-\alpha}{\tau}\right) + \frac{\pi}{2}\right], \quad y \in \mathbb{R}. \qquad (22) $$
For brevity we write X ∼ Ca(α, τ).
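As a consistency check, added here: differentiating the CDF (22) recovers the PDF (21),
$$ \frac{d}{dy} F_X(y) = \frac{1}{\pi} \cdot \frac{1/\tau}{1 + \left(\frac{y-\alpha}{\tau}\right)^2} = \frac{\tau}{\pi\left(\tau^2 + (y-\alpha)^2\right)}, $$
which confirms the factor 1/π in (21).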


In general, we say that X has a PDF f (and write X ∼ f) if, ∀ y ∈ ℝ,
$$ F_X(y) = \int_{-\infty}^{y} f(t)\,dt. \qquad (23) $$
Then, ∀ a, b ∈ ℝ with a < b:
$$ P(a < X < b) = \int_a^b f(x)\,dx, \qquad (24) $$
and, in general, for every measurable set A ⊆ ℝ:
$$ P(X \in A) = \int_A f(x)\,dx. \qquad (25) $$


Expectation
The expected value or expectation of a continuous r.v. X with p.d.f. f(x) is denoted by E(X) and is defined as
$$ E(X) = \int_{-\infty}^{\infty} x f(x)\,dx, \qquad (26) $$
provided that the integral is absolutely convergent (i.e. ∫ |x| f(x) dx is finite); for instance, a Cauchy RV has no expectation, since this integral diverges. As in the discrete case, E(X) is often termed the expected value or mean of X. More generally,
$$ E(g(X)) = \int_{-\infty}^{\infty} g(x) f(x)\,dx. \qquad (27) $$
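A short sketch evaluating (26) and (27) numerically, added here for illustration (it assumes numpy and scipy are available), for X ∼ Exponential(λ = 2), where E(X) = 1/λ = 0.5 and E(X²) = 2/λ² = 0.5.

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0
f = lambda x: lam * np.exp(-lam * x)                 # exponential PDF
EX, _ = quad(lambda x: x * f(x), 0, np.inf)          # equation (26)
EX2, _ = quad(lambda x: x ** 2 * f(x), 0, np.inf)    # equation (27), g(x) = x^2
print(EX, EX2)                                       # ≈ 0.5 and ≈ 0.5
```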

Variance

An immediate application is to the variance of X, denoted by Var(X) and defined as
$$ Var(X) = E\left([X - E(X)]^2\right). \qquad (28) $$


Writing µ = E(X), we have
$$ \begin{aligned} Var(X) &= \int_{-\infty}^{\infty} (x-\mu)^2 f(x)\,dx \\ &= \int_{-\infty}^{\infty} \left(x^2 - 2\mu x + \mu^2\right) f(x)\,dx \\ &= \int_{-\infty}^{\infty} x^2 f(x)\,dx - 2\mu \int_{-\infty}^{\infty} x f(x)\,dx + \mu^2 \int_{-\infty}^{\infty} f(x)\,dx \\ &= \int_{-\infty}^{\infty} x^2 f(x)\,dx - 2\mu\cdot\mu + \mu^2\cdot 1 = \int_{-\infty}^{\infty} x^2 f(x)\,dx - \mu^2 \\ &= E(X^2) - [E(X)]^2, \end{aligned} \qquad (29) $$
just as in the discrete case.
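A hedged numerical sketch of identity (29), added for illustration (it assumes numpy is available), using samples from Uniform(0, 1), whose variance is 1/12 by (7).

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, size=200_000)
lhs = np.mean((x - x.mean()) ** 2)      # definition (28)
rhs = np.mean(x ** 2) - x.mean() ** 2   # shortcut (29)
print(lhs, rhs, 1 / 12)                 # all ≈ 0.0833
```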



Normal distributions. Convergence of random variables and distributions. The Central Limit Theorem

• We have already learned a number of properties of a normal distribution. Its importance was realised at an early stage by, among others, Laplace, Poisson and, of course, Gauss.
• Understanding the special nature of normal distributions required facts and methods from other fields of mathematics, including analysis and mathematical physics (notably, complex analysis and partial differential equations).

Recall the properties of Gaussian distributions which we have established so far:
• The PDF of an N(µ, σ²) RV X is
$$ \frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), \quad x \in \mathbb{R}, \qquad (30) $$
with mean and variance E(X) = µ, Var(X) = σ², and MGF and CHF
$$ E e^{\theta X} = e^{\theta\mu + \frac{1}{2}\theta^2\sigma^2}, \qquad E e^{itX} = e^{it\mu - \frac{1}{2}t^2\sigma^2}, \quad \theta, t \in \mathbb{R}. $$
If X ∼ N(µ, σ²), then (X − µ)/σ ∼ N(0, 1), and ∀ b, c ∈ ℝ: cX + b ∼ N(cµ + b, c²σ²).
• Two jointly normal RVs X and Y are independent iff Cov(X, Y) = Corr(X, Y) = 0.

The sum X + Y of two jointly normal RVs X ∼ N(µ₁, σ₁²), Y ∼ N(µ₂, σ₂²), with Corr(X, Y) = r, is normal, with mean µ₁ + µ₂ and variance σ₁² + 2rσ₁σ₂ + σ₂². See equation (2.41). In particular, if X, Y are independent, X + Y ∼ N(µ₁ + µ₂, σ₁² + σ₂²). In general, for independent RVs X₁, X₂, ..., where Xᵢ ∼ N(µᵢ, σᵢ²), the linear combination Σᵢ cᵢXᵢ ∼ N(Σᵢ cᵢµᵢ, Σᵢ cᵢ²σᵢ²).
To sum up, for the random variables X we have discussed so far, we write X ∼ U(a, b), X ∼ N(µ, σ²), X ∼ Exp(λ), X ∼ Gam(α, λ), and X ∼ Ca(α, τ).
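A small simulation sketch of the independent case, added for illustration (it assumes numpy is available): for independent X ∼ N(1, 2²) and Y ∼ N(−1, 3²), the sum should be N(0, 2² + 3²) = N(0, 13).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
s = rng.normal(1.0, 2.0, n) + rng.normal(-1.0, 3.0, n)  # X + Y, independent
print(s.mean(), s.var())   # ≈ 0 and ≈ 13
```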


Example-Exercise
Let X ∼ Ca(2, 4). Find the probability that
a. X is less than 3,
b. X is greater than 4,
c. X is between 1 and 3.5.

Solution
The CDF of the Cauchy distribution is given by
$$ F_X(y) = \frac{1}{\pi}\left[\tan^{-1}\!\left(\frac{y-\alpha}{\tau}\right) + \frac{\pi}{2}\right], \quad y \in \mathbb{R}. \qquad (31) $$


Here α = 2, τ = 4, and for part a we take y = 3:
$$ P(X < 3) = F(3) = \frac{1}{\pi}\left[\tan^{-1}\!\left(\frac{3-2}{4}\right) + \frac{\pi}{2}\right] = \frac{1}{\pi}\left[\tan^{-1}(0.25) + \frac{\pi}{2}\right] \approx 0.578. $$
Verify parts b) and c) in the same way.
ANS: b) P(X > 4) = 1 − F(4) ≈ 0.3524; c) P(1 < X < 3.5) = F(3.5) − F(1) ≈ 0.1922.
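A verification sketch for this example, added here (it assumes scipy is available); scipy's cauchy takes loc = α and scale = τ.

```python
from scipy import stats

X = stats.cauchy(loc=2.0, scale=4.0)
print(X.cdf(3.0))               # a) P(X < 3)       ≈ 0.578
print(1.0 - X.cdf(4.0))         # b) P(X > 4)       ≈ 0.3524
print(X.cdf(3.5) - X.cdf(1.0))  # c) P(1 < X < 3.5) ≈ 0.1922
```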