

Multivariate Distributions

Aria Nosratinia Probability and Statistics 6-1


Why Random Vectors?
Going from two to multiple variables is a logical step.
Often systems have multiple random parameters.
Another application of random vectors is modeling random signals
in discrete time.
Consider the signal X(t) sampled at times t = 1, 2, . . . , n:

    X_1, X_2, . . . , X_n

This can be considered a random vector.


Multivariate analysis is the right tool for these situations.
Multivariate CDF, PMF, PDF
Definition: The joint CDF of X_1, X_2, . . . , X_n is

    F_{X_1,...,X_n}(x_1, . . . , x_n) = P(X_1 ≤ x_1, . . . , X_n ≤ x_n)

Definition: The joint PMF of discrete variables X_1, X_2, . . . , X_n is

    P_{X_1,...,X_n}(x_1, . . . , x_n) = P(X_1 = x_1, . . . , X_n = x_n)

Definition: The joint PDF of continuous variables X_1, X_2, . . . , X_n is

    f_{X_1,...,X_n}(x_1, . . . , x_n) = ∂ⁿ F_{X_1,...,X_n}(x_1, . . . , x_n) / (∂x_1 ⋯ ∂x_n)
Properties of Discrete Distributions
    P_{X_1,...,X_n}(x_1, . . . , x_n) ≥ 0

    Σ_{x_1} ⋯ Σ_{x_n} P_{X_1,...,X_n}(x_1, . . . , x_n) = 1

Event Probability:

    P(A) = Σ_{(x_1,...,x_n) ∈ A} P_{X_1,...,X_n}(x_1, . . . , x_n)
Properties of Continuous Distributions
    f_{X_1,...,X_n}(x_1, . . . , x_n) ≥ 0

    F_{X_1,...,X_n}(x_1, . . . , x_n) = ∫_{−∞}^{x_1} ⋯ ∫_{−∞}^{x_n} f_{X_1,...,X_n}(u_1, . . . , u_n) du_n ⋯ du_1

    ∫_{−∞}^{∞} ⋯ ∫_{−∞}^{∞} f_{X_1,...,X_n}(u_1, . . . , u_n) du_n ⋯ du_1 = 1

Event Probabilities:

    P(A) = ∫ ⋯ ∫_A f_{X_1,...,X_n}(x_1, . . . , x_n) dx_n ⋯ dx_1
Example
Consider the random vector (X, Y, Z) uniformly distributed over the unit
cube whose x, y, z coordinates are between 0 and 1. Find the
probability of the event X + Y + Z ≤ 1.
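As a numerical sanity check (not part of the slides), the probability above can be estimated by Monte Carlo; the exact answer is the volume of the simplex x + y + z ≤ 1, namely 1/6. A minimal sketch, assuming NumPy is available:

```python
import numpy as np

# Monte Carlo estimate of P(X + Y + Z <= 1) for (X, Y, Z) uniform on the
# unit cube. The exact answer is the volume of the simplex, i.e. 1/6.
rng = np.random.default_rng(0)
pts = rng.uniform(size=(1_000_000, 3))    # uniform samples from [0, 1]^3
est = np.mean(pts.sum(axis=1) <= 1.0)     # fraction of samples in the event
print(est)  # close to 1/6 = 0.1667
```
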
Vector Notation
To make life easier, we use boldface letters to show vectors.
Vectors of random variables are shown with upper-case bold letters:

    X = [X_1 . . . X_n]

Vectors of real values are shown with lower-case bold letters:

    x = [x_1 . . . x_n]

So we can write the CDF, PMF, and PDF as:

    F_X(x)    P_X(x)    f_X(x)
Pairs of Random Vectors
Example: pairs of random vectors are useful when we wish to analyze the
joint distribution of two signals.

(Diagram: the input signal X_1, X_2, . . . , X_N passes through a filter h(t), producing the output Y_1, Y_2, . . . , Y_N.)
Joint CDF, PMF, PDF:

    P_{X,Y}(x, y) = P_{X_1,...,X_n,Y_1,...,Y_n}(x_1, . . . , x_n, y_1, . . . , y_n)

    F_{X,Y}(x, y) = F_{X_1,...,X_n,Y_1,...,Y_n}(x_1, . . . , x_n, y_1, . . . , y_n)

    f_{X,Y}(x, y) = f_{X_1,...,X_n,Y_1,...,Y_n}(x_1, . . . , x_n, y_1, . . . , y_n)

In other words, a pair of vectors acts like a long vector.
Marginal Probability Functions
Just like the previous case, the rule is simple:
For marginal PMF (PDF), take sum (integral) over unwanted
variable(s).
    P_X(x) = Σ_y Σ_z P_{X,Y,Z}(x, y, z)        P_{X,Y}(x, y) = Σ_z P_{X,Y,Z}(x, y, z)

    f_X(x) = ∫∫ f_{X,Y,Z}(x, y, z) dy dz       f_{X,Y}(x, y) = ∫ f_{X,Y,Z}(x, y, z) dz

For marginal CDF, take a limit over unwanted variable(s).

    F_X(x) = lim_{y→∞} lim_{z→∞} F_{X,Y,Z}(x, y, z)        F_{X,Y}(x, y) = lim_{z→∞} F_{X,Y,Z}(x, y, z)
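For a discrete joint PMF stored as an array, the summation rule above is just a sum over the unwanted axes. A small sketch with a toy (made-up) PMF, assuming NumPy:

```python
import numpy as np

# Marginalizing a discrete joint PMF stored as an array P[x, y, z]:
# sum over the unwanted axes. (Toy PMF, assumed for illustration only.)
rng = np.random.default_rng(1)
P = rng.uniform(size=(2, 3, 4))
P /= P.sum()                       # normalize so it is a valid joint PMF

P_XY = P.sum(axis=2)               # P_{X,Y}(x, y) = sum_z P_{X,Y,Z}(x, y, z)
P_X = P.sum(axis=(1, 2))           # P_X(x) = sum_y sum_z P_{X,Y,Z}(x, y, z)

assert np.isclose(P_X.sum(), 1.0)          # marginals are valid PMFs
assert np.allclose(P_XY.sum(axis=1), P_X)  # marginalizing in stages agrees
```
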
Example
Consider the probability distribution:

    f_X(x) = 6 exp(−a^t x)  for x ≥ 0,  and 0 otherwise,

where we have a = [1 2 3]^t.

1. What is the joint distribution of X_1, X_3?
2. What is the distribution of X_1?
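A numerical way to see the answer (an illustration, not from the slides): the pdf 6 exp(−x_1 − 2x_2 − 3x_3) on x ≥ 0 factors into independent exponentials with rates 1, 2, 3, so the marginal means should be 1, 1/2, 1/3. A sketch assuming NumPy:

```python
import numpy as np

# The pdf 6 exp(-(x1 + 2 x2 + 3 x3)) on x >= 0 factors as
# [1 e^{-x1}] [2 e^{-2 x2}] [3 e^{-3 x3}], i.e. independent exponentials
# with rates 1, 2, 3 -- so E[X_i] = 1/i for each component.
rng = np.random.default_rng(2)
n = 500_000
x = np.column_stack([rng.exponential(scale=1 / r, size=n) for r in (1, 2, 3)])

print(x.mean(axis=0))  # approximately [1.0, 0.5, 0.333]
```
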
Independence
Random variables X_1, . . . , X_n are independent if for all x_1, . . . , x_n

    P_{X_1,...,X_n}(x_1, . . . , x_n) = P_{X_1}(x_1) ⋯ P_{X_n}(x_n)    (discrete)

    f_{X_1,...,X_n}(x_1, . . . , x_n) = f_{X_1}(x_1) ⋯ f_{X_n}(x_n)    (continuous)

Example 1: Are X_1, X_2, X_3 independent if they are uniformly
distributed over the unit cube [0, 1] × [0, 1] × [0, 1]?

Example 2: Are X_1, X_2, X_3 independent if they are uniformly
distributed over the region defined by X ≥ 0 and X_1 + X_2 + X_3 ≤ 1?
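For Example 2, a simulation makes the dependence visible (a sketch, not the slides' solution): over the simplex, a large X_1 forces X_2 to be small, so the components should be negatively correlated, which rules out independence. Assuming NumPy:

```python
import numpy as np

# Example 2: sample uniformly over {x >= 0, x1 + x2 + x3 <= 1} by rejection
# sampling from the unit cube, then look at the sample correlation.
rng = np.random.default_rng(3)
pts = rng.uniform(size=(1_200_000, 3))
simplex = pts[pts.sum(axis=1) <= 1.0]     # keep points inside the region

corr = np.corrcoef(simplex[:, 0], simplex[:, 1])[0, 1]
print(corr)  # clearly negative, so X1 and X2 are dependent
```
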
More on Independence
Independent, identically distributed (iid): X_1, . . . , X_n are i.i.d.
if they are independent with identical marginal distributions.

    P_X(x_1, . . . , x_n) = Π_{i=1}^{n} P(x_i)

    f_X(x_1, . . . , x_n) = Π_{i=1}^{n} f(x_i)

Independence of Two Random Vectors: Random vectors X, Y
are independent if for all x, y

    P_{X,Y}(x, y) = P_X(x) P_Y(y)    (discrete)

    f_{X,Y}(x, y) = f_X(x) f_Y(y)    (continuous)

NOTE: In this case the components of X and Y need not be
independent among themselves.
Functions of Random Vectors
We start with a scalar function of a vector:
    W = g(X_1, . . . , X_n) = g(X)

Then the distribution of W is given as follows:

    P_W(w) = Σ_{g(x)=w} P_X(x)    (discrete)

    F_W(w) = ∫ ⋯ ∫_{g(x)≤w} f_X(x) dx    (continuous)
Example
The random vector X = [X_1 X_2 X_3]^t is distributed uniformly over
the region defined by X ≥ 0 and [1 1 1] X ≤ 1. Find the distribution of
W = X_1 + X_2.
Example
We draw n numbers uniformly distributed on [0, 1]. What is the
distribution of the maximum of these numbers? The minimum?
Maximum and Minimum of IID Variables
Consider n i.i.d. variables with PDF f_X(x) and CDF F_X(x). Then:

1. The CDF and PDF of Y = max{X_1, . . . , X_n} are

    F_Y(y) = [F_X(y)]^n        f_Y(y) = n [F_X(y)]^{n−1} f_X(y)

2. The CDF and PDF of W = min{X_1, . . . , X_n} are

    F_W(w) = 1 − [1 − F_X(w)]^n        f_W(w) = n [1 − F_X(w)]^{n−1} f_X(w)
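These formulas are easy to check empirically for the uniform case of the previous example, where F_X(y) = y on [0, 1]. A sketch assuming NumPy:

```python
import numpy as np

# For n i.i.d. Uniform(0,1) variables, F_Y(y) = y^n for the maximum and
# F_W(w) = 1 - (1 - w)^n for the minimum. Empirical check by simulation.
rng = np.random.default_rng(4)
n, trials = 5, 400_000
x = rng.uniform(size=(trials, n))
y, w = x.max(axis=1), x.min(axis=1)

print(np.mean(y <= 0.8), 0.8 ** 5)       # empirical vs. theoretical F_Y(0.8)
print(np.mean(w <= 0.2), 1 - 0.8 ** 5)   # empirical vs. theoretical F_W(0.2)
```
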
General Formula for Functions of RV
Consider a one-to-one vector-valued function Y = g(X).

(Diagram: a small region A in x-space is mapped by Y = g(X) to a region B in y-space.)
Then the differential probabilities must be the same. Thus:

    f_X(x) vol(A) = f_Y(y) vol(B)

The ratio of vol(B) to vol(A) is described by the Jacobian; therefore,

    f_Y(y) = (1 / |J(x)|) f_X(g^{−1}(y))
Jacobian
The Jacobian is defined as:

        ⎡ ∂Y_1/∂X_1   ∂Y_2/∂X_1   ⋯   ∂Y_n/∂X_1 ⎤
    J = ⎢ ∂Y_1/∂X_2   ∂Y_2/∂X_2   ⋯   ∂Y_n/∂X_2 ⎥
        ⎢     ⋮            ⋮                ⋮     ⎥
        ⎣ ∂Y_1/∂X_n   ∂Y_2/∂X_n   ⋯   ∂Y_n/∂X_n ⎦
In the formula, we need to use the absolute value of the determinant
of J, and then substitute x = g^{−1}(y), since we need (eventually) a
function of y.
NOTE: This is arrived at from the change of variables in multiple
integrals. Consult your advanced calculus text for details.
Linear Vector Functions
Theorem: If X is a continuous random vector and A is an invertible
matrix, then Y = AX+b has distribution:
    f_Y(y) = (1 / |det(A)|) f_X(A^{−1}(y − b))
Proof: Use the Jacobian.
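The theorem can be verified pointwise for a Gaussian X (an illustrative check, assuming NumPy): if X is standard normal, then Y = AX + b is Gaussian with mean b and covariance AA^t, so the change-of-variables formula and the direct Gaussian density must agree.

```python
import numpy as np

# Check f_Y(y) = f_X(A^{-1}(y - b)) / |det A| for Y = A X + b, where X is
# a 2-D standard normal; Y is then Gaussian with mean b, covariance A A^t.
def gauss_pdf(x, mu, C):
    d = x - mu
    q = d @ np.linalg.solve(C, d)              # (x - mu)^t C^{-1} (x - mu)
    n = len(mu)
    return np.exp(-0.5 * q) / np.sqrt((2 * np.pi) ** n * np.linalg.det(C))

A = np.array([[2.0, 1.0], [0.0, 1.0]])
b = np.array([1.0, -1.0])
y = np.array([0.5, 0.3])

lhs = gauss_pdf(np.linalg.solve(A, y - b), np.zeros(2), np.eye(2)) / abs(np.linalg.det(A))
rhs = gauss_pdf(y, b, A @ A.T)
assert np.isclose(lhs, rhs)                    # the two densities agree
```
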
Example
Consider the probability distribution:
    f_X(x) = 6 exp(−a^t x)  for x ≥ 0,  and 0 otherwise,

where a = [1 2 3]^t. Now consider Y = AX with A being invertible.
What is the pdf of Y?
Expected Value
Definition: The expected value of a vector X is the vector of
componentwise expected values:

    E[X] = μ_X = [E[X_1], . . . , E[X_n]]^t

We can similarly define the expected value of a random matrix, by
taking the expected value component-wise.
Cross Correlation and Cross Covariance
Definition: The cross-correlation of X, Y is a matrix R_{XY} whose
(i, j)-th element is R_{XY}(i, j) = E[X_i Y_j], in other words:

    R_{XY} = E[X Y^t]

Definition: The cross-covariance of random vectors X, Y is a matrix
C_{XY} whose (i, j)-th element is C_{XY}(i, j) = Cov(X_i, Y_j), in other words:

    C_{XY} = E[(X − μ_X)(Y − μ_Y)^t]
Correlation and Covariance
Definition: The correlation R_X is a matrix whose (i, j)-th element is
R_X(i, j) = E[X_i X_j], in other words:

    R_X = E[X X^t]

Definition: The covariance of random vector X is a matrix whose
(i, j)-th element is C_X(i, j) = Cov(X_i, X_j), in other words:

    C_X = E[(X − μ_X)(X − μ_X)^t]

Fact: Correlation and covariance are related thus:

    C_X = R_X − μ_X μ_X^t
Linear Relationships
If X has mean μ_X, correlation R_X, and covariance C_X, and if
Y = AX + b, then:

    μ_Y = A μ_X + b
    R_Y = A R_X A^t + (A μ_X) b^t + b (A μ_X)^t + b b^t
    C_Y = A C_X A^t

and

    R_{XY} = R_X A^t + μ_X b^t
    C_{XY} = C_X A^t
Gaussian Random Vectors
Definition: X is Gaussian if

    f_X(x) = (1 / ((2π)^{n/2} det(C_X)^{1/2})) exp( −(1/2) (x − μ_X)^t C_X^{−1} (x − μ_X) )

This distribution has mean μ_X and covariance C_X.
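A direct implementation of this density is short, and numerically integrating a 2-D example over a grid should give approximately 1 (an illustrative sketch with made-up parameters, assuming NumPy):

```python
import numpy as np

# Multivariate Gaussian density, checked by integrating a 2-D example
# on a grid (a simple Riemann sum; the tails outside [-6, 6] are negligible).
def gaussian_pdf(x, mu, C):
    d = x - mu
    q = np.einsum('...i,ij,...j->...', d, np.linalg.inv(C), d)
    n = mu.shape[0]
    return np.exp(-0.5 * q) / np.sqrt((2 * np.pi) ** n * np.linalg.det(C))

mu = np.array([0.5, -0.5])
C = np.array([[1.0, 0.3], [0.3, 0.5]])

t = np.linspace(-6, 6, 301)
xx, yy = np.meshgrid(t, t, indexing='ij')
grid = np.stack([xx, yy], axis=-1)          # shape (301, 301, 2)
vals = gaussian_pdf(grid, mu, C)
total = vals.sum() * (t[1] - t[0]) ** 2     # Riemann-sum approximation
print(total)  # approximately 1.0
```
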
Gaussian Properties
Theorem: Gaussian X has independent components iff C_X is
diagonal. I.e., for Gaussians, uncorrelated = independent.
Proof: This is a direct consequence of the exponent in the Gaussian
formula. If C_X = diag(σ_1^2, . . . , σ_n^2), then

    (x − μ_X)^t C_X^{−1} (x − μ_X) = Σ_{i=1}^{n} (x_i − μ_i)^2 / σ_i^2

    f_X(x) = (1 / ((2π)^{n/2} det(C_X)^{1/2})) exp( −(1/2) Σ_{i=1}^{n} (x_i − μ_i)^2 / σ_i^2 )

           = Π_{i=1}^{n} (1 / √(2π σ_i^2)) exp( −(x_i − μ_i)^2 / (2σ_i^2) )
Gaussian Properties
Theorem: The sum of two (or more) jointly Gaussian variables is a
Gaussian variable.
This is a special case of the following:
Theorem: If X is a Gaussian vector with mean μ_X and covariance
C_X, then

    Y = AX + b

is also a Gaussian vector with μ_Y = A μ_X + b and C_Y = A C_X A^t.
Proof: The basic idea is to write the CDF and show that it is Gaussian
with the prescribed parameters.
Standard Normal Vector
Definition: Z is a standard normal vector if it is Gaussian with
μ_Z = 0 and C_Z = I.

Theorem: For a Gaussian (μ_X, C_X) random vector, let A be a
square root of C_X, i.e., C_X = A A^t; then

    Z = A^{−1}(X − μ_X)

is a standard normal vector.
Proof: Use the previous result on the relation of linearly related
Gaussian vectors.
Fact: It is always possible to find such an A because C_X is positive
semi-definite.
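This whitening construction is easy to demonstrate with a Cholesky factor as the square root of C_X (a sketch with made-up parameters, assuming NumPy): the transformed samples should have approximately zero mean and identity covariance.

```python
import numpy as np

# Whitening: with A the Cholesky factor of C_X (so C_X = A A^t),
# Z = A^{-1}(X - mu_X) should have zero mean and identity covariance.
rng = np.random.default_rng(7)
mu_X = np.array([1.0, -2.0, 0.5])
C_X = np.array([[2.0, 0.6, 0.3],
                [0.6, 1.0, 0.2],
                [0.3, 0.2, 0.5]])
X = rng.multivariate_normal(mu_X, C_X, size=200_000)

A = np.linalg.cholesky(C_X)              # C_X = A A^t
Z = np.linalg.solve(A, (X - mu_X).T).T   # Z = A^{-1}(X - mu_X), per sample

assert np.allclose(Z.mean(axis=0), 0, atol=0.02)
assert np.allclose(np.cov(Z.T), np.eye(3), atol=0.03)
```
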