
Multivariate Methods Assignment Help

For any assignment-related queries, call us at +1 678 648 4277,
visit https://www.statisticsassignmentexperts.com/, or
email info@statisticsassignmentexperts.com
1. Consider a bivariate normal population with µ1 = 0, µ2 = 2, σ11 = 2, σ22 = 1, and
ρ12 = 0.5.
(a) Write out the bivariate normal density.
(b) Write out the squared generalized distance expression (x − µ)ᵀΣ⁻¹(x − µ)
as a function of x1 and x2.
(c) Determine (and sketch) the constant-density contour that contains 50% of
the probability.

Sol. (a) The multivariate normal density is defined by the following equation.
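As a sketch, substituting the given parameters (µ1 = 0, µ2 = 2, σ11 = 2, σ22 = 1,
ρ12 = 0.5, so that σ12 = ρ12√(σ11σ22) = √2/2 and 1 − ρ12² = 3/4), the density in (a)
and the squared generalized distance asked for in (b) work out to

f(x_1, x_2) = \frac{1}{2\pi\sqrt{\sigma_{11}\sigma_{22}(1-\rho_{12}^2)}}
              \exp\!\left\{-\frac{1}{2(1-\rho_{12}^2)}
              \left[\frac{x_1^2}{\sigma_{11}}
              - 2\rho_{12}\frac{x_1(x_2-\mu_2)}{\sqrt{\sigma_{11}\sigma_{22}}}
              + \frac{(x_2-\mu_2)^2}{\sigma_{22}}\right]\right\}
            = \frac{1}{2\pi\sqrt{1.5}}
              \exp\!\left\{-\frac{1}{3}\left[x_1^2 - \sqrt{2}\,x_1(x_2-2) + 2(x_2-2)^2\right]\right\},

(x-\mu)^T\Sigma^{-1}(x-\mu) = \frac{2}{3}\left[x_1^2 - \sqrt{2}\,x_1(x_2-2) + 2(x_2-2)^2\right].

The exponent of the density is −1/2 times this quadratic form, as it should be.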

(c) For α = 0.5, the solid ellipsoid of x = (x1, x2)ᵀ values satisfying
(x − µ)ᵀΣ⁻¹(x − µ) ≤ χ²₂(0.5) = c² has probability 50%. From the quantile function in
R we have χ²₂(0.5) = qchisq(0.5, df=2) = 1.3863, therefore c = 1.1774. The eigenvalues
of Σ are (λ1, λ2) = (2.3660, 0.6340), with eigenvectors e1 = (−0.8881, −0.4597)ᵀ and
e2 = (0.4597, −0.8881)ᵀ. Therefore the half-lengths of the ellipse axes are
c√λ1 = 1.8111 and c√λ2 = 0.9375. The contour is plotted in Figure 1.

Figure 1: Contour that contains 50% of the probability

2. Let X be N3(µ, Σ) with µᵀ = (2, −3, 1) and Σ =

(a) Find the distribution of 3X1 − 2X2 + X3.


(b) Relabel the variables if necessary, and find a 2 × 1 vector a such that X2 and
X2 − aᵀ(X1, X3)ᵀ are independent.

Sol. (a) Let a = (3, −2, 1)ᵀ; then aᵀX = 3X1 − 2X2 + X3. Therefore,
aᵀX ∼ N(aᵀµ, aᵀΣa),
where aᵀµ and aᵀΣa are obtained from the given µ and Σ.

(b) Let Y = X2 − a1X1 − a2X3, where now a = (a1, a2)ᵀ. Since we want X2 and Y to be
independent, we need Cov(X2, Y) = σ22 − a1σ12 − a2σ23 = −a1 − 2a2 + 3 = 0.
So any vector of the form a = (3 − 2c, c)ᵀ, for c ∈ R, works.
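Since the numerical values of aᵀµ and aᵀΣa are not reproduced here, the following is
a minimal R sketch of both parts; the matrix Sigma below is a hypothetical stand-in,
chosen only so that σ22 = 3, σ12 = 1, σ23 = 2, consistent with the relation
−a1 − 2a2 + 3 = 0 used above.

# Hypothetical covariance matrix (the one in the problem statement is not shown here);
# only its second row/column (sigma12 = 1, sigma22 = 3, sigma23 = 2) matters for (b).
Sigma <- matrix(c(1, 1, 1,
                  1, 3, 2,
                  1, 2, 2), nrow = 3, byrow = TRUE)
mu <- c(2, -3, 1)

# (a) distribution of a'X with a = (3, -2, 1)'
a <- c(3, -2, 1)
c(mean = sum(a * mu), variance = drop(t(a) %*% Sigma %*% a))

# (b) X2 and Y = X2 - a1*X1 - a2*X3 are uncorrelated (hence independent under
# joint normality) whenever a = (3 - 2c, c)'; take c = 1 as an example.
a1 <- 1; a2 <- 1
Sigma[2, 2] - a1 * Sigma[1, 2] - a2 * Sigma[2, 3]   # equals 0

Note that the mean aᵀµ = 3(2) − 2(−3) + 1 = 13 depends only on the given µ; the
variance aᵀΣa depends on the entries of Σ.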

3. Let X be distributed as N3(µ, Σ), where µᵀ = (1, −1, 2) and Σ =

Which of the following random variables are independent? Explain.

(a) X1 and X2
(b) X1 and X3
(c) X2 and X3
(d) (X1, X3) and X2
(e) X1 and X1 + 3X2 − 2X3

Sol. (a) Since σ12 = σ21 = 0, X1 and X2 are independent.

(b) Since σ13 = σ31 = −1 ≠ 0, X1 and X3 are not independent.

(c) Since σ23 = σ32 = 0, X2 and X3 are independent.
(d) We rearrange the covariance matrix and partition it so that (X1, X3) comes first
and X2 last (see the display after (e)). The off-diagonal block of the partitioned
matrix is Cov((X1, X3)ᵀ, X2) = (σ12, σ32)ᵀ = (0, 0)ᵀ, so it is clear that (X1, X3)
and X2 are independent.

(e) Cov(X1, X1 + 3X2 − 2X3) = σ11 + 3σ12 − 2σ13 = σ11 + 2 ≠ 0, so X1 and
X1 + 3X2 − 2X3 are not independent.
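As a sketch of the partition used in (d), reorder the variables as (X1, X3, X2) and
use only the entries quoted above (σ12 = σ32 = 0, σ13 = −1):

\operatorname{Cov}\begin{pmatrix} X_1 \\ X_3 \\ X_2 \end{pmatrix}
= \left(\begin{array}{cc|c}
  \sigma_{11} & \sigma_{13} & \sigma_{12} \\
  \sigma_{31} & \sigma_{33} & \sigma_{32} \\ \hline
  \sigma_{21} & \sigma_{23} & \sigma_{22}
  \end{array}\right)
= \left(\begin{array}{cc|c}
  \sigma_{11} & -1 & 0 \\
  -1 & \sigma_{33} & 0 \\ \hline
  0 & 0 & \sigma_{22}
  \end{array}\right),

so the off-diagonal block Cov((X1, X3)ᵀ, X2) is the zero vector, which under joint
normality is equivalent to independence.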

4. Refer to Exercise 3 and specify each of the following.


(a) The conditional distribution of X1, given that X3 = x3.
(b) The conditional distribution of X1, given that X2 = x2 and X3 = x3.
Sol. We use the following result.

Let X = (X1ᵀ, X2ᵀ)ᵀ ∼ N(µ, Σ) be partitioned with µ = (µ1ᵀ, µ2ᵀ)ᵀ and

Σ = ( Σ11  Σ12
      Σ21  Σ22 ),

where |Σ22| > 0. Then

X1 | X2 = x2 ∼ N(µ1 + Σ12Σ22⁻¹(x2 − µ2), Σ11 − Σ12Σ22⁻¹Σ21).

(a) Applying this result to X1 and X3,
X1 | X3 = x3 ∼ N(1 + (−1)(2)⁻¹(x3 − 2), 4 − (−1)(2)⁻¹(−1)), that is,
X1 | X3 = x3 ∼ N(−x3/2 + 2, 7/2).

(b) For X1 | X2 = x2, X3 = x3, partition with X1 first and (X2, X3)ᵀ second. Since
σ12 = 0, conditioning on X2 contributes nothing beyond (a), and the same calculation
gives X1 | X2 = x2, X3 = x3 ∼ N(−x3/2 + 2, 7/2).
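As a check, the conditional mean and variance can be computed directly from the
partitioned-covariance formula. The R sketch below uses the entries quoted in these
solutions (σ11 = 4, σ33 = 2, σ13 = −1, σ12 = σ23 = 0) together with a hypothetical
value for σ22, which drops out of the answer because σ12 = 0.

# Entries taken from the solutions above; sigma22 is a hypothetical placeholder
# (its value does not affect the conditional of X1 given X2 and X3, since sigma12 = 0).
sigma22 <- 5
Sigma <- matrix(c( 4,       0, -1,
                   0, sigma22,  0,
                  -1,       0,  2), nrow = 3, byrow = TRUE)
mu <- c(1, -1, 2)

S12 <- Sigma[1, 2:3, drop = FALSE]   # Cov(X1, (X2, X3))
S22 <- Sigma[2:3, 2:3]               # Var((X2, X3))

cond_mean <- function(x2, x3) drop(mu[1] + S12 %*% solve(S22) %*% (c(x2, x3) - mu[2:3]))
cond_var  <- drop(Sigma[1, 1] - S12 %*% solve(S22) %*% t(S12))

cond_mean(0, 4)   # 2 - 4/2 = 0
cond_var          # 3.5 = 7/2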

5. Let X1, X2, X3, and X4 be independent Np(µ, Σ) random vectors.

(a) Find the marginal distributions for each of the random vectors V1 and V2.

(b) Find the joint density of the random vectors V1 and V2 defined in (a).

Sol. (a) By Result 4.8 (linear combinations of independent normal random vectors),
V1 and V2 are each multivariate normal, and we have V1 ∼ Np(0, 1/4 Σ) and
V2 ∼ Np(0, 1/4 Σ).

(b) Also by Result 4.8, V1 and V2 are jointly multivariate normal, with a block
covariance matrix whose diagonal blocks are 1/4 Σ and whose off-diagonal blocks are
cΣ, where c is the sum of the products of the coefficients defining V1 and V2.
So we have the joint distribution of V1 and V2 as follows:
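The joint-distribution display is not reproduced here; as a sketch, the general form
given by Result 4.8 for linear combinations V1 = Σj cj Xj and V2 = Σj bj Xj of
mutually independent Xj ∼ Np(µ, Σ) is

\begin{pmatrix} V_1 \\ V_2 \end{pmatrix}
\sim N_{2p}\!\left(
\begin{pmatrix} \bigl(\sum_j c_j\bigr)\mu \\ \bigl(\sum_j b_j\bigr)\mu \end{pmatrix},\;
\begin{pmatrix} \bigl(\sum_j c_j^2\bigr)\Sigma & \bigl(\sum_j b_j c_j\bigr)\Sigma \\
                \bigl(\sum_j b_j c_j\bigr)\Sigma & \bigl(\sum_j b_j^2\bigr)\Sigma \end{pmatrix}
\right).

In this exercise Σj cj² = Σj bj² = 1/4 and the mean vectors are 0, matching the
marginals in (a); the value of c = Σj bj cj depends on the coefficients defining V1
and V2, which are not reproduced here.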

6. Find the maximum likelihood estimates of the 2×1 mean vector µ and the 2×2
covariance matrix Σ based on the random sample from a bivariate normal
population.

Sol. Since the random sample X1, X2, X3, X4 comes from a normal population, the
maximum likelihood estimates of µ and Σ are X̄ and (1/n) Σⁿᵢ₌₁ (Xi − X̄)(Xi − X̄)ᵀ.
Therefore, the estimates are obtained by substituting the observed data into these
two expressions.
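The data table and the resulting numerical estimates are not reproduced here; the
following is a minimal R sketch of the computation, using a small hypothetical
bivariate sample of size n = 4 in place of the original data.

# Hypothetical bivariate sample (the data matrix from the exercise is not shown here)
X <- matrix(c(3, 6,
              4, 4,
              5, 7,
              4, 7), ncol = 2, byrow = TRUE)
n <- nrow(X)

mu_hat    <- colMeans(X)            # MLE of mu: the sample mean vector
Sigma_hat <- (n - 1) / n * cov(X)   # MLE of Sigma: divisor n, not n - 1
mu_hat
Sigma_hat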

7. Let X1, X2, . . . , X20 be a random sample of size n = 20 from an N6(µ, Σ)


population. Specify each of the following completely.

(a) The distribution of (X1 − µ)ᵀΣ⁻¹(X1 − µ)

(b) The distributions of X̄ and √n(X̄ − µ)
(c) The distribution of (n − 1)S

Sol. (a) (X1 − µ)ᵀΣ⁻¹(X1 − µ) is distributed as χ²₆.


(b) X̄ is distributed as N6(µ, (1/20)Σ), and √n(X̄ − µ) is distributed as N6(0, Σ).
(c) (n − 1)S has a Wishart distribution: it is distributed as Z1Z1ᵀ + · · · + Z19Z19ᵀ,
where Zi ∼ N6(0, Σ). We write this as W6(19, Σ), i.e., a Wishart distribution with
dimension 6, 19 degrees of freedom, and covariance matrix Σ.
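As an illustration of (c), (n − 1)S can be generated either as a sum of outer products
or with the rWishart function in R's stats package; the identity matrix below is only
a stand-in for the unspecified Σ.

# Stand-in for Sigma (the true covariance matrix is not specified in the exercise)
p <- 6; n <- 20
Sigma <- diag(p)

# (n - 1)S as a sum of 19 outer products Z_i Z_i', with Z_i ~ N6(0, Sigma)
Z <- matrix(rnorm((n - 1) * p), ncol = p)   # valid here because Sigma is the identity
A <- t(Z) %*% Z

# An equivalent draw from the Wishart distribution W6(19, Sigma)
W <- rWishart(1, df = n - 1, Sigma = Sigma)[, , 1]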

8. Let X1, . . . , X60 be a random sample of size 60 from a four-variate normal


distribution having mean µ and covariance Σ. Specify each of the following
completely.

(a) The distribution of X̄

(b) The distribution of (X1 − µ)ᵀΣ⁻¹(X1 − µ)
(c) The distribution of n(X̄ − µ)ᵀΣ⁻¹(X̄ − µ)
(d) The approximate distribution of n(X̄ − µ)ᵀS⁻¹(X̄ − µ)

Sol. (a) X̄ is distributed as N4(µ, (1/60)Σ).

(b) (X1 − µ)ᵀΣ⁻¹(X1 − µ) is distributed as χ²₄.
(c) n(X̄ − µ)ᵀΣ⁻¹(X̄ − µ) is distributed as χ²₄.
(d) Since n = 60 is large relative to p = 4, n(X̄ − µ)ᵀS⁻¹(X̄ − µ) is approximately
distributed as χ²₄.
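A sketch of the reasoning behind the approximation in (d): the exact distribution of
this statistic is Hotelling's T², a scaled F distribution,

n(\bar{X}-\mu)^T S^{-1} (\bar{X}-\mu) \;\sim\; \frac{(n-1)p}{n-p}\,F_{p,\,n-p}
\;=\; \frac{59 \cdot 4}{56}\,F_{4,56},

and when n is large relative to p this scaled F distribution is close to χ²p, here χ²₄.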

9. Consider the annual rates of return (including dividends) on the Dow-Jones
industrial average for the years 1996-2005. These data, multiplied by 100, are

−0.6  3.1  25.3  −16.8  −7.1  −6.2  25.2  22.6  26.0

Use these 10 observations to complete the following.

(a) Construct a Q-Q plot. Do the data seem to be normally distributed? Explain.
(b) Carry out a test of normality based on the correlation coefficient rQ. Let the
significance level be α = 0.1.

Sol. (a) The Q-Q plot of these data is shown in Figure 2. All of the sample quantiles
lie close to the theoretical quantiles. However, Q-Q plots are not particularly
informative unless the sample size is moderate to large, for instance n ≥ 20: there
can be quite a bit of variability in the straightness of the Q-Q plot for small
samples, even when the observations are known to come from a normal population.

Figure 2: Normal Q-Q plot

(b) From (4-31) in the textbook, rQ is defined by
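The formula display is not reproduced here; as a sketch, (4-31) defines rQ as the
sample correlation between the ordered observations x(j) and the corresponding
standard normal quantiles q(j):

r_Q = \frac{\sum_{j=1}^{n}\bigl(x_{(j)}-\bar{x}\bigr)\bigl(q_{(j)}-\bar{q}\bigr)}
           {\sqrt{\sum_{j=1}^{n}\bigl(x_{(j)}-\bar{x}\bigr)^{2}}\,
            \sqrt{\sum_{j=1}^{n}\bigl(q_{(j)}-\bar{q}\bigr)^{2}}}.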

Using the information from the data, we have rQ = 0.9351; the R code for this
calculation is given in the Appendix. From Table 4.2 in the textbook, the critical
point for the test of normality at the α = 0.1 level of significance corresponding to
n = 9 lies between 0.9032 and 0.9351. Since rQ = 0.9351 is not below this critical
point, we do not reject the hypothesis of normality.

10. Exercise 1.2 gives the age x1, measured in years, as well as the selling price x2,
measured in thousands of dollars, for n = 10 used cars. These data are reproduced
as follows (see also the R code for Problem 10 in the Appendix):

age x1 (years):       1     2     3     3     4     5     6     8     9    11
price x2 ($1000s): 18.95 19.00 17.95 15.54 14.00 12.95  8.94  7.49  6.00  3.99

(a) Use the results of Exercise 1.2 to calculate the squared statistical distances
(xj − x̄)ᵀS⁻¹(xj − x̄), j = 1, 2, . . . , 10, where xjᵀ = (xj1, xj2).
(b) Using the distances in Part (a), determine the proportion of the observations
falling within the estimated 50% probability contour of a bivariate normal
distribution.
(c) Order the distances in Part (a) and construct a chi-square plot.
(d) Given the results in Parts (b) and (c), are these data approximately bivariate
normal? Explain.
Sol. (a) From Exercise 1.2 (and the data above) we have x̄ = (5.20, 12.48)ᵀ and the
sample covariance matrix S. The squared statistical distances
d²j = (xj − x̄)ᵀS⁻¹(xj − x̄), j = 1, . . . , 10, are calculated with the R code in the
Appendix.

(b) We plot the data points and the estimated 50% probability contour (the blue
ellipse) in Figure 3. Subjects 4, 5, 6, 8, and 9 fall within the estimated 50%
probability contour, so the observed proportion is 5/10 = 0.5.

Figure 3: Contour of a bivariate normal

(c) The squared distances in Part (a) are ordered as below. The chi-square plot is
shown in Figure 4.

Figure 4: Chi-square plot

(d) Given the results in Parts (b) and (c), we conclude that these data are
approximately bivariate normal: the observed proportion inside the 50% contour matches
the nominal value, and most of the points in the chi-square plot lie close to the
theoretical line.
Appendix

R code for Problem 1(c).

> library(ellipse)
> library(MASS)
> library(mvtnorm)
> set.seed(123)
>
> mu <- c(0,2)
> Sigma <- matrix(c(2,sqrt(2)/2,sqrt(2)/2,1), nrow=2, ncol=2)
> X <- mvrnorm(n=10000, mu=mu, Sigma=Sigma)
> lambda <- eigen(Sigma)$values
> Gamma <- eigen(Sigma)$vectors
> elps <- t(t(ellipse(Sigma, level=0.5, npoints=1000))+mu)   # 50% probability contour
> chi <- qchisq(0.5, df=2)
> c <- sqrt(chi)
> factor <- c*sqrt(lambda)   # half-lengths of the ellipse axes
> plot(X[,1], X[,2])
> lines(elps)
> points(mu[1], mu[2])
> # draw the two axes of the ellipse from the centre mu
> segments(mu[1],mu[2],factor[1]*Gamma[1,1]+mu[1],factor[1]*Gamma[2,1]+mu[2])
> segments(mu[1],mu[2],factor[2]*Gamma[1,2]+mu[1],factor[2]*Gamma[2,2]+mu[2])
R code for Problem 9.

> x <- c(-0.6, 3.1, 25.3, -16.8, -7.1, -6.2, 25.2, 22.6, 26.0)
> # (a)
> qqnorm(x)
> qqline(x)
> # (b)
> y <- sort(x)
> n <- length(y)
> p <- ((1:n)-0.5)/n   # plotting positions
> q <- qnorm(p)        # standard normal quantiles
> rQ <- cor(y,q)

R code for Problem 10.

> n <- 10
> x1 <- c(1,2,3,3,4,5,6,8,9,11)
> x2 <- c(18.95, 19.00, 17.95, 15.54, 14.00, 12.95, 8.94, 7.49, 6.00, 3.99)
> X <- cbind(x1,x2)
> Xbar <- colMeans(X)
> S <- cov(X)
> Sinv <- solve(S)
>
> # (a) squared statistical distances
> d <- diag(t(t(X)-Xbar)%*%Sinv%*%(t(X)-Xbar))
>
> # (b)
> library(ellipse)
> p <- 2
> elps <- t(t(ellipse(S, level=0.5, npoints=1000))+Xbar)   # estimated 50% contour
> plot(X[,1],X[,2],type="n")
> index <- d < qchisq(0.5,df=p)
> text(X[,1][index],X[,2][index],(1:n)[index],col="blue")
> text(X[,1][!index],X[,2][!index],(1:n)[!index],col="red")
> lines(elps,col="blue")
>
> # (c)
> names(d) <- 1:10
> sort(d)
> qqplot(qchisq(ppoints(500), df=p), d, main="",
+ xlab="Theoretical Quantiles", ylab="Sample Quantiles")
> qqline(d, distribution=function(x){qchisq(x, df=p)})
