
Final Assignment Econometrics II

Arpan Acharya

2023-02-03

Final Assignment
There are three datasets used in this assignment, stored in data1, data2 and data3
respectively. First, I load the required packages.
library(foreign)
library(moments)
library(readxl)
library(tidyverse)

## ── Attaching packages ─────────────────────────────────────── tidyverse 1.3.2 ──
## ✔ ggplot2 3.4.0 ✔ purrr 0.3.5
## ✔ tibble 3.1.8 ✔ dplyr 1.0.10
## ✔ tidyr 1.2.1 ✔ stringr 1.5.0
## ✔ readr 2.1.3 ✔ forcats 0.5.2
## ── Conflicts ────────────────────────────────────────── tidyverse_conflicts() ──
## ✖ dplyr::filter() masks stats::filter()
## ✖ dplyr::lag() masks stats::lag()

Reading the given data


setwd("C:\\Users\\pc\\Downloads")
data1 <- read_excel("Data1.xlsx")
data2 <- read_excel("Data2.xlsx")
data3 <- read_excel("Data3.xlsx")

Question no. 1


Initially, I create the matrices Y and X as described in the question.
y <- matrix(c(data1$y), nrow = 29)
x <- matrix (c(1, data1$X1, data1$X2 + data1$X3), nrow = 29)

## Warning in matrix(c(1, data1$X1, data1$X2 + data1$X3), nrow = 29): data length
## [59] is not a sub-multiple or multiple of the number of rows [29]

Now computing beta matrix as specified in the question


beta <- solve(t(x) %*% x) %*% t(x) %*% y
beta

## [,1]
## [1,] 3.1364865
## [2,] 0.6661833
## [3,] 2.3531720

For comparison, I also estimate the coefficients with the built-in lm() function; a sketch of obtaining them by numerical optimization follows the summary below.


model <- lm(y~data1$X1+data1$X2+data1$X3)
summary(model)

##
## Call:
## lm(formula = y ~ data1$X1 + data1$X2 + data1$X3)
##
## Residuals:
## Min 1Q Median 3Q Max
## -59.969 -27.346 6.977 31.657 44.523
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 66.5184 23.9905 2.773 0.0104 *
## data1$X1 0.5414 0.7803 0.694 0.4941
## data1$X2 -38.8830 89.9716 -0.432 0.6693
## data1$X3 1.7769 0.8627 2.060 0.0500 *
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 34.9 on 25 degrees of freedom
## Multiple R-squared: 0.6403, Adjusted R-squared: 0.5971
## F-statistic: 14.83 on 3 and 25 DF, p-value: 9.445e-06
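As referenced above, here is a minimal sketch (an illustration added here, assuming the same data1 columns) of obtaining the coefficients by numerically minimizing the residual sum of squares with optim():

# minimize the sum of squared residuals numerically; result should be close to coef(model)
ssr <- function(b) {
  e <- data1$y - (b[1] + b[2]*data1$X1 + b[3]*data1$X2 + b[4]*data1$X3)
  sum(e^2)
}
opt <- optim(par = c(0, 0, 0, 0), fn = ssr, method = "BFGS")
opt$par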

Question no. 2


plot(data1$X1, data1$y)

The plot against X1 suggests a normal distribution with a constant mean and variance.

plot(data1$X2, data1$y)

The plot against X2 suggests a uniform distribution with a constant mean and variance.

plot(data1$X3, data1$y)

The plot against X3 suggests a normal distribution with a trending mean and a constant variance.
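A minimal sketch (an addition for illustration, not part of the assignment's output) of inspecting the marginal distributions of the regressors directly with histograms:

# side-by-side histograms of the three regressors
par(mfrow = c(1, 3))
hist(data1$X1, main = "X1")
hist(data1$X2, main = "X2")
hist(data1$X3, main = "X3")
par(mfrow = c(1, 1))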

Question no. 3


model3a <- lm(y ~ X1 + X2 + X3, data = data1)
summary(model3a)

##
## Call:
## lm(formula = y ~ X1 + X2 + X3, data = data1)
##
## Residuals:
## Min 1Q Median 3Q Max
## -59.969 -27.346 6.977 31.657 44.523
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 66.5184 23.9905 2.773 0.0104 *
## X1 0.5414 0.7803 0.694 0.4941
## X2 -38.8830 89.9716 -0.432 0.6693
## X3 1.7769 0.8627 2.060 0.0500 *
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 34.9 on 25 degrees of freedom
## Multiple R-squared: 0.6403, Adjusted R-squared: 0.5971
## F-statistic: 14.83 on 3 and 25 DF, p-value: 9.445e-06

This is a level-level specification: a one-unit increase in a regressor changes y by that regressor's coefficient, holding the other regressors constant.

b)


model3b <- lm(log(y) ~ X1 + X2 + X3, data = data1)
summary(model3b)

##
## Call:
## lm(formula = log(y) ~ X1 + X2 + X3, data = data1)
##
## Residuals:
## Min 1Q Median 3Q Max
## -1.4418 -0.2056 0.0966 0.2351 0.7061
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 3.859504 0.315226 12.244 4.66e-12 ***
## X1 0.006653 0.010253 0.649 0.522
## X2 0.926089 1.182193 0.783 0.441
## X3 0.005016 0.011335 0.443 0.662
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 0.4586 on 25 degrees of freedom
## Multiple R-squared: 0.4939, Adjusted R-squared: 0.4332
## F-statistic: 8.133 on 3 and 25 DF, p-value: 0.0006008

This is a log-level specification: a one-unit increase in a regressor changes y by approximately 100 times that regressor's coefficient, in percent, holding the other regressors constant.

c)


model3c <- lm(y ~ log(X1) + log(X2) + log(X3), data = data1)
summary(model3c)

##
## Call:
## lm(formula = y ~ log(X1) + log(X2) + log(X3), data = data1)
##
## Residuals:
## Min 1Q Median 3Q Max
## -83.234 -31.626 2.963 30.230 84.153
##
## Coefficients: (1 not defined because of singularities)
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 132.229 28.060 4.712 7.18e-05 ***
## log(X1) 7.455 10.459 0.713 0.48235
## log(X2) 31.149 8.968 3.473 0.00182 **
## log(X3) NA NA NA NA
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 46.9 on 26 degrees of freedom
## Multiple R-squared: 0.3244, Adjusted R-squared: 0.2724
## F-statistic: 6.241 on 2 and 26 DF, p-value: 0.006114

This is a level-log specification: a 1 percent increase in a regressor changes y by approximately its coefficient divided by 100 units, holding the other regressors constant.

d)


model3d <- lm(log(y) ~ log(X1) + log(X2) + log(X3), data = data1)
summary(model3d)

##
## Call:
## lm(formula = log(y) ~ log(X1) + log(X2) + log(X3), data = data1)
##
## Residuals:
## Min 1Q Median 3Q Max
## -1.4210 -0.1790 0.1593 0.2546 1.1118
##
## Coefficients: (1 not defined because of singularities)
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 4.83300 0.31184 15.498 1.2e-14 ***
## log(X1) 0.05913 0.11624 0.509 0.61525
## log(X2) 0.34585 0.09966 3.470 0.00183 **
## log(X3) NA NA NA NA
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 0.5212 on 26 degrees of freedom
## Multiple R-squared: 0.3201, Adjusted R-squared: 0.2677
## F-statistic: 6.119 on 2 and 26 DF, p-value: 0.00664

This is a log-log specification: the coefficients are elasticities, so a 1 percent increase in a regressor changes y by approximately that coefficient in percent, holding the other regressors constant.

e)


data1$z <- rnorm(29,0,1)
model3e <- lm(y ~ X1 + X2 + X3 + z, data = data1)
summary(model3e)

##
## Call:
## lm(formula = y ~ X1 + X2 + X3 + z, data = data1)
##
## Residuals:
## Min 1Q Median 3Q Max
## -63.81 -24.32 6.45 22.87 49.88
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 67.0428 23.7294 2.825 0.00936 **
## X1 0.6968 0.7816 0.891 0.38154
## X2 -41.7872 89.0088 -0.469 0.64297
## X3 1.7727 0.8532 2.078 0.04860 *
## z -8.4557 6.7674 -1.249 0.22354
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 34.51 on 24 degrees of freedom
## Multiple R-squared: 0.6622, Adjusted R-squared: 0.606
## F-statistic: 11.76 on 4 and 24 DF, p-value: 1.972e-05

Question no. 4


#a-d
# simulate six AR(1) series y_t = alpha_0 + alpha_1*y_(t-1) + u_t, u_t ~ N(0, 3^2)
par(mfcol = c(3, 2))
alpha_1 <- c(1, 2, 0.1, 0.5, 0.8, 0.95)
alpha_0 <- c(1, 2, 0.1, 0.5, 0.8, 0.95)
Y <- matrix(nrow = 100, ncol = 6)
for (i in 1:6) {
  Y[1, i] <- 3
  for (j in 2:100) {
    Y[j, i] <- alpha_0[i] + alpha_1[i]*Y[j-1, i] + rnorm(1, 0, 3)
  }
  plot(Y[, i], type = "l", main = alpha_1[i])   # title the panel with its alpha_1
}
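For the stationary cases (|alpha_1| < 1), a minimal sketch (an alternative added for illustration, not the assignment's approach) of the same simulation using arima.sim():

# simulate a stationary AR(1) with alpha_1 = 0.8 and shift it by its
# unconditional mean alpha_0 / (1 - alpha_1)
y_sim <- arima.sim(model = list(ar = 0.8), n = 100, sd = 3) + 0.8 / (1 - 0.8)
plot(y_sim, type = "l", main = "AR(1), alpha_1 = 0.8")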
#e

data2$t <- c(1:28)

#Now, estimating the model

model4e <- lm(`Consumption as % of GDP`[2:28] ~ t[2:28] + `Consumption as % of GDP`[1:27] +
                `Remmitance as % of GDP`[2:28] + `Remmitance as % of GDP`[1:27],
              data = data2)
summary(model4e)

##
## Call:
## lm(formula = `Consumption as % of GDP`[2:28] ~ t[2:28] + `Consumption as % of GDP`[1:27] +
## `Remmitance as % of GDP`[2:28] + `Remmitance as % of GDP`[1:27],
## data = data2)
##
## Residuals:
## Min 1Q Median 3Q Max
## -4.590 -1.399 -0.137 1.078 5.340
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 91.87505 22.14617 4.149 0.00042 ***
## t[2:28] -0.34019 0.25928 -1.312 0.20303
## `Consumption as % of GDP`[1:27] -0.04658 0.24843 -0.188 0.85298
## `Remmitance as % of GDP`[2:28] 0.45698 0.23829 1.918 0.06822 .
## `Remmitance as % of GDP`[1:27] -0.02574 0.26326 -0.098 0.92300
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 2.624 on 22 degrees of freedom
## Multiple R-squared: 0.3864, Adjusted R-squared: 0.2748
## F-statistic: 3.463 on 4 and 22 DF, p-value: 0.02439
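A minimal sketch (assuming the same data2 column names) of the same lag structure written with dplyr::lag() instead of manual [2:28] / [1:27] index ranges:

# build the lagged series with dplyr::lag(); lm() drops the first (NA) row itself
d2 <- data2 %>%
  mutate(cons_lag = lag(`Consumption as % of GDP`),
         rem_lag  = lag(`Remmitance as % of GDP`))
model4e_alt <- lm(`Consumption as % of GDP` ~ t + cons_lag +
                    `Remmitance as % of GDP` + rem_lag, data = d2)
summary(model4e_alt)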

#g

model4g <- lm(`Consumption as % of GDP` ~ `Remmitance as % of GDP`, data = data2)
summary(model4g)

##
## Call:
## lm(formula = `Consumption as % of GDP` ~ `Remmitance as % of GDP`,
## data = data2)
##
## Residuals:
## Min 1Q Median 3Q Max
## -5.9681 -0.9062 -0.0252 1.3627 5.2219
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 86.67480 0.86204 100.546 < 2e-16 ***
## `Remmitance as % of GDP` 0.16544 0.04893 3.381 0.00229 **
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 2.577 on 26 degrees of freedom
## Multiple R-squared: 0.3054, Adjusted R-squared: 0.2787
## F-statistic: 11.43 on 1 and 26 DF, p-value: 0.002291

Question no. 5
Question no. 6
Now, we specify two different models
mod.rur <- glm(poor ~ hhsize, data = data3, subset = c(urbrur == "rural"),
               family = binomial(link = "logit"))

mod.urb <- glm(poor ~ hhsize, data = data3, subset = c(urbrur == "urban"),
               family = binomial(link = "logit"))
n <- 100

Now we calculate the probability and odds of being poor under the rural model.
i <- 1:n
y <- odds <- prob <- vector(length = n)
for (i in 1:n) {
  y[i] <- mod.rur$coefficients[1] + mod.rur$coefficients[2]*i
  prob[i] <- exp(y[i])/(1 + exp(y[i]))
}
odds <- prob/(1 - prob)
c <- 1
rur <- cbind(1:n, c, prob, odds)
rurdata <- data.frame(rur)

Similarly, we calculate the probability and odds of being poor under the urban model.


i <- 1:n
y <- odds <- prob <- vector(length = n)
for (i in 1:n) {
  y[i] <- mod.urb$coefficients[1] + mod.urb$coefficients[2]*i
  prob[i] <- exp(y[i])/(1 + exp(y[i]))
}
c <- 0
odds <- prob/(1 - prob)
urb <- cbind(1:n, c, prob, odds)
urbdata <- data.frame(urb)
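As a cross-check, a minimal sketch (an addition for illustration) of obtaining the same fitted probabilities from both models with predict():

# predicted probability of being poor at household sizes 1..n, via predict()
newdat <- data.frame(hhsize = 1:n)
prob_rur2 <- predict(mod.rur, newdata = newdat, type = "response")
prob_urb2 <- predict(mod.urb, newdata = newdat, type = "response")
head(cbind(hhsize = 1:n, rural = prob_rur2, urban = prob_urb2))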

Now we merge the two data frames. The column c in these data frames is a dummy variable that
separates urban households from rural households.
maindata <- rbind(rurdata, urbdata)

Now we are ready to plot the data. We rename the columns and recode the dummy into region
labels, which makes the graph clearer.
colnames(maindata) <- c("household.size", "dummy", "probability", "odds")
region = recode(maindata$dummy, '1' = "Rural", '0'= "Urban")
maindata <- add_column(maindata, region)

Now we plot the data to show the probability of being poor for rural and urban households separately.
maindata %>%
  ggplot(aes(household.size, probability, color = region)) +
  geom_jitter() +
  labs(x = "Household size", y = "probability of being poor",
       title = "Probability of being poor for rural and urban households given the size") +
  scale_color_manual(values = c("Red", "Dark Blue")) +
  theme_classic()

Now we plot the household-size distributions for rural and urban households in the same figure.
There are various ways of doing this; first we look at the frequencies.
data3 %>%
  ggplot(aes(x = hhsize, color = 'Black', fill = urbrur)) +
  geom_histogram(binwidth = 1,
                 alpha = 0.5,
                 color = "Black",
                 position = 'dodge') +
  scale_fill_manual(values = c("Dark Blue", "Red")) +
  theme_bw()
Now let us see the same plot for density.
data3 %>%
  ggplot(aes(x = hhsize, color = 'Black', fill = urbrur)) +
  geom_histogram(aes(y = ..density..),
                 binwidth = 1,
                 alpha = 0.5,
                 color = "Black",
                 position = 'dodge') +
  scale_fill_manual(values = c("Dark Blue", "Red")) +
  theme_bw()

## Warning: The dot-dot notation (`..density..`) was deprecated in ggplot2 3.4.0.
## ℹ Please use `after_stat(density)` instead.
Now we plot the odds
maindata %>%
ggplot(aes(household.size, odds, color = region))+
geom_jitter()+
labs(x = "Household size", y = "Odds")+
scale_color_manual(values = c("Red", "Dark Blue"))+
theme_classic()
Now we plot the log odds.
maindata %>%
ggplot(aes(household.size, log(odds), color = region))+
geom_jitter()+
labs(x = "Household size", y = "Log Odds")+
scale_color_manual(values = c("Red", "Dark Blue"))+
theme_classic()
Question no. 7
model7a <- glm(poor~hhsize, family = binomial(link = logit) ,data = data3)
summary(model7a)

##
## Call:
## glm(formula = poor ~ hhsize, family = binomial(link = logit),
## data = data3)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -2.2740 -0.6288 -0.5559 -0.4317 2.3095
##
## Coefficients:
## Estimate Std. Error z value Pr(>|z|)
## (Intercept) -2.86332 0.08631 -33.17 <2e-16 ***
## hhsize 0.26853 0.01454 18.47 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for binomial family taken to be 1)
##
## Null deviance: 5738.8 on 5987 degrees of freedom
## Residual deviance: 5361.9 on 5986 degrees of freedom
## AIC: 5365.9
##
## Number of Fisher Scoring iterations: 4

px <- model7a$fitted.values
odds <- px/(1-px)

log_odds <- log(odds)


par(mfcol = c(2,1))
plot(data3$hhsize, odds)
plot(data3$hhsize, log_odds)

plot(data3$hhsize, px)
Question no. 8
attach(data3)

## The following object is masked _by_ .GlobalEnv:
##
##     region

totcons <- log(data3$consumption)


#a
model8a <- lm(totcons~hhsize)
summary(model8a)

##
## Call:
## lm(formula = totcons ~ hhsize)
##
## Residuals:
## Min 1Q Median 3Q Max
## -2.0130 -0.4655 -0.0654 0.4103 3.4592
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 10.993259 0.019217 572.07 <2e-16 ***
## hhsize -0.111874 0.003636 -30.77 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 0.6491 on 5986 degrees of freedom
## Multiple R-squared: 0.1366, Adjusted R-squared: 0.1364
## F-statistic: 946.8 on 1 and 5986 DF, p-value: < 2.2e-16

#b

data3$u_d <- ifelse(urbrur == "urban", 1, 0)
data3$b_d <- ifelse(belt == "terai", 1, 0)
attach(data3)

## The following object is masked _by_ .GlobalEnv:
##
##     region

## The following objects are masked from data3 (pos = 3):
##
##     belt, consumption, district_name, hhsize, poor, region, urbrur,
##     xhnum, xhpsu

model8b <- lm(totcons ~ data3$b_d + data3$u_d + hhsize)
summary(model8b)

##
## Call:
## lm(formula = totcons ~ data3$b_d + data3$u_d + hhsize)
##
## Residuals:
## Min 1Q Median 3Q Max
## -1.82150 -0.37238 -0.01757 0.35005 2.80702
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 10.703438 0.018782 569.867 < 2e-16 ***
## data3$b_d -0.090239 0.015098 -5.977 2.4e-09 ***
## data3$u_d 0.665623 0.015479 43.001 < 2e-16 ***
## hhsize -0.092206 0.003206 -28.759 < 2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 0.5637 on 5984 degrees of freedom
## Multiple R-squared: 0.349, Adjusted R-squared: 0.3487
## F-statistic: 1069 on 3 and 5984 DF, p-value: < 2.2e-16
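A minimal sketch (an alternative, assuming the same data3 columns) that lets lm() build the dummies inline rather than via ifelse():

# dummies created inline in the formula; TRUE/FALSE are coerced to 1/0
model8b_alt <- lm(log(consumption) ~ I(belt == "terai") + I(urbrur == "urban") + hhsize,
                  data = data3)
summary(model8b_alt)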

#c
model8c <- lm(totcons ~ data3$b_d+u_d+I(data3$b_d*u_d)+hhsize)
summary(model8c)

##
## Call:
## lm(formula = totcons ~ data3$b_d + u_d + I(data3$b_d * u_d) +
## hhsize)
##
## Residuals:
## Min 1Q Median 3Q Max
## -1.90664 -0.37193 -0.02231 0.33069 2.72062
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 10.633448 0.018983 560.156 < 2e-16 ***
## data3$b_d 0.064792 0.017935 3.613 0.000306 ***
## u_d 0.838943 0.018933 44.311 < 2e-16 ***
## I(data3$b_d * u_d) -0.482244 0.031463 -15.327 < 2e-16 ***
## hhsize -0.091752 0.003145 -29.170 < 2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 0.553 on 5983 degrees of freedom
## Multiple R-squared: 0.3736, Adjusted R-squared: 0.3732
## F-statistic: 892.2 on 4 and 5983 DF, p-value: < 2.2e-16

#d
model8d <- lm(totcons ~ data3$b_d + u_d + I(data3$b_d*u_d) + hhsize +
                I(hhsize*u_d + I(data3$b_d*hhsize)))
summary(model8d)

##
## Call:
## lm(formula = totcons ~ data3$b_d + u_d + I(data3$b_d * u_d) +
## hhsize + I(hhsize * u_d + I(data3$b_d * hhsize)))
##
## Residuals:
## Min 1Q Median 3Q Max
## -1.90599 -0.37167 -0.02396 0.33353 2.73001
##
## Coefficients:
##                                            Estimate Std. Error t value Pr(>|t|)    
## (Intercept)                               10.704642   0.026610 402.281  < 2e-16 ***
## data3$b_d                                 -0.028190   0.030257  -0.932 0.351540    
## u_d                                        0.750566   0.029912  25.092  < 2e-16 ***
## I(data3$b_d * u_d)                        -0.483956   0.031431 -15.398  < 2e-16 ***
## hhsize                                    -0.106812   0.005047 -21.165  < 2e-16 ***
## I(hhsize * u_d + I(data3$b_d * hhsize))    0.019193   0.005033   3.813 0.000138 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 0.5524 on 5982 degrees of freedom
## Multiple R-squared: 0.3751, Adjusted R-squared: 0.3746
## F-statistic: 718.3 on 5 and 5982 DF, p-value: < 2.2e-16

plot(hhsize,totcons)

plot(b_d,totcons)
plot(u_d, totcons)
Question no. 9
#creating data for y = 2 + 0.1*t + u
#For T = 1000
T <- 1000
t <- c(1:1000)
y <- vector(length = 1000)
y <- 2 + 0.1*t + rnorm(1000,0,sqrt(10))
plot(y,type="l")

#estimating null model y = beta + u


model9a <- lm(y~1)
summary(model9a)

##
## Call:
## lm(formula = y ~ 1)
##
## Residuals:
## Min 1Q Median 3Q Max
## -56.183 -25.523 0.238 24.502 58.488
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 52.0677 0.9179 56.72 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 29.03 on 999 degrees of freedom

#now testing for independence: u(t) = a + b*y(t-1) + e(t)


aux19a <- lm(model9a$residuals[2:T]~y[1:T-1])
summary(aux19a)

##
## Call:
## lm(formula = model9a$residuals[2:T] ~ y[1:T - 1])
##
## Residuals:
## Min 1Q Median 3Q Max
## -14.5541 -2.9640 0.1376 3.0354 15.2042
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -51.333844 0.291100 -176.3 <2e-16 ***
## y[1:T - 1] 0.987882 0.004889 202.1 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 4.478 on 997 degrees of freedom
## Multiple R-squared: 0.9762, Adjusted R-squared: 0.9761
## F-statistic: 4.084e+04 on 1 and 997 DF, p-value: < 2.2e-16

#heterogeneous mean
aux29a <- lm(model9a$residuals~t)
summary(aux29a)

##
## Call:
## lm(formula = model9a$residuals ~ t)
##
## Residuals:
## Min 1Q Median 3Q Max
## -10.0597 -2.0256 -0.0911 2.0623 10.5955
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -4.999e+01 2.046e-01 -244.4 <2e-16 ***
## t 9.988e-02 3.541e-04 282.1 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 3.232 on 998 degrees of freedom
## Multiple R-squared: 0.9876, Adjusted R-squared: 0.9876
## F-statistic: 7.958e+04 on 1 and 998 DF, p-value: < 2.2e-16
#heterogeneous variance
aux39a <- lm(I(model9a$residuals^2) ~ t)
summary(aux39a)

##
## Call:
## lm(formula = I(model9a$residuals^2) ~ t)
##
## Residuals:
## Min 1Q Median 3Q Max
## -843.1 -682.8 -210.6 525.2 2588.2
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 851.28438 48.57859 17.524 <2e-16 ***
## t -0.01904 0.08408 -0.226 0.821
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 767.5 on 998 degrees of freedom
## Multiple R-squared: 5.14e-05, Adjusted R-squared: -0.0009506
## F-statistic: 0.0513 on 1 and 998 DF, p-value: 0.8209

#testing for normality. Initially finding skewness and kurtosis


s9a <- skewness(y)
k9a <- kurtosis(y)

jb9a <- (T/6)*((s9a^2) +(1/4)*(k9a-3)^2 )


ifelse(jb9a>5.99,"Non Normal", "Normal")

## [1] "Non Normal"

#b

T <- 1000
t <- c(1:1000)
y <- vector(length = 1000)
y <- 2 + 0.1*t - 0.01*(t^2) + rnorm(1000,0,sqrt(50))
plot(y,type="l")
#estimating null model y = delta0 + delta1*t
model9b <- lm(y~t)
summary(model9b)

##
## Call:
## lm(formula = y ~ t)
##
## Residuals:
## Min 1Q Median 3Q Max
## -1663.0 -573.4 205.1 674.3 850.8
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 1673.73260 47.23321 35.44 <2e-16 ***
## t -9.91022 0.08175 -121.23 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 746.3 on 998 degrees of freedom
## Multiple R-squared: 0.9364, Adjusted R-squared: 0.9363
## F-statistic: 1.47e+04 on 1 and 998 DF, p-value: < 2.2e-16

#now testing for independence: u(t) = a + b1*t + b2*y(t-1) + e(t)


aux19b <- lm(model9b$residuals[2:T]~t[2:T]+y[1:T-1])
summary(aux19b)
##
## Call:
## lm(formula = model9b$residuals[2:T] ~ t[2:T] + y[1:T - 1])
##
## Residuals:
## Min 1Q Median 3Q Max
## -31.074 -6.755 0.072 6.803 32.747
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -1.673e+03 9.502e-01 -1761 <2e-16 ***
## t[2:T] 9.889e+00 4.327e-03 2285 <2e-16 ***
## y[1:T - 1] 9.999e-01 4.229e-04 2364 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 9.946 on 996 degrees of freedom
## Multiple R-squared: 0.9998, Adjusted R-squared: 0.9998
## F-statistic: 2.795e+06 on 2 and 996 DF, p-value: < 2.2e-16

#heterogeneous mean
aux29b <- lm(model9b$residuals~t)
summary(aux29b)

##
## Call:
## lm(formula = model9b$residuals ~ t)
##
## Residuals:
## Min 1Q Median 3Q Max
## -1663.0 -573.4 205.1 674.3 850.8
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 1.180e-12 4.723e+01 0 1
## t -2.615e-15 8.175e-02 0 1
##
## Residual standard error: 746.3 on 998 degrees of freedom
## Multiple R-squared: 5.496e-31, Adjusted R-squared: -0.001002
## F-statistic: 5.485e-28 on 1 and 998 DF, p-value: 1

#heterogeneous variance
aux39b <- lm(I(model9b$residuals^2)~ t)
summary(aux39b)

##
## Call:
## lm(formula = I(model9b$residuals^2) ~ t)
##
## Residuals:
## Min 1Q Median 3Q Max
## -556167 -430660 -133494 119817 2209131
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 5.551e+05 3.763e+04 14.75 <2e-16 ***
## t 1.295e+00 6.513e+01 0.02 0.984
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 594600 on 998 degrees of freedom
## Multiple R-squared: 3.963e-07, Adjusted R-squared: -0.001002
## F-statistic: 0.0003955 on 1 and 998 DF, p-value: 0.9841

#testing for normality


s9b <- skewness(y)
k9b <- kurtosis(y)

jb9b <- (T/6)*((s9b^2) +(1/4)*(k9b-3)^2 )


ifelse(jb9b>5.99,"Non Normal", "Normal")

## [1] "Non Normal"

Question no. 10
#creating data
T <- 1000
t <- c(1:1000)
y <- vector(length = 1000)
y[1] <- 2
for (i in 2:T){
y[i] <- 2 + 0.1*t[i] + 0.95*y[i-1] + rnorm(1,0,sqrt(50))
}
plot(y,type="l")
#estimating null model
model10 <- lm(y~1)
summary(model10)

##
## Call:
## lm(formula = y ~ 1)
##
## Residuals:
## Min 1Q Median 3Q Max
## -1000.02 -494.43 -7.29 483.17 997.06
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 1002.02 18.15 55.2 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 574 on 999 degrees of freedom

#Markov(1) test
aux110 <- lm(model10$residuals[2:T]~y[1:T-1])
summary(aux110)

##
## Call:
## lm(formula = model10$residuals[2:T] ~ y[1:T - 1])
##
## Residuals:
## Min 1Q Median 3Q Max
## -25.0750 -4.4411 0.3368 4.5955 24.9411
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -9.999e+02 4.462e-01 -2241 <2e-16 ***
## y[1:T - 1] 9.999e-01 3.868e-04 2585 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 7.007 on 997 degrees of freedom
## Multiple R-squared: 0.9999, Adjusted R-squared: 0.9999
## F-statistic: 6.682e+06 on 1 and 997 DF, p-value: < 2.2e-16
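The auxiliary regression above mirrors a Breusch-Godfrey test; a minimal sketch, assuming the lmtest package is installed (it is not loaded earlier in this document):

# Breusch-Godfrey test for first-order serial correlation in the residuals of model10
library(lmtest)
bgtest(model10, order = 1)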

#heterogeneous mean
aux210 <- lm(model10$residuals~t)
summary(aux210)

##
## Call:
## lm(formula = model10$residuals ~ t)
##
## Residuals:
## Min 1Q Median 3Q Max
## -67.658 -12.709 0.612 15.524 55.547
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -9.940e+02 1.348e+00 -737.3 <2e-16 ***
## t 1.986e+00 2.333e-03 851.2 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 21.3 on 998 degrees of freedom
## Multiple R-squared: 0.9986, Adjusted R-squared: 0.9986
## F-statistic: 7.245e+05 on 1 and 998 DF, p-value: < 2.2e-16

#heterogeneous variance
aux310 <- lm(I(model10$residuals^2) ~ t)
summary(aux310)

##
## Call:
## lm(formula = I(model10$residuals^2) ~ t)
##
## Residuals:
## Min 1Q Median 3Q Max
## -329445 -273297 -88292 216753 679927
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 320102.22 18753.84 17.069 <2e-16 ***
## t 18.04 32.46 0.556 0.578
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 296300 on 998 degrees of freedom
## Multiple R-squared: 0.0003095, Adjusted R-squared: -0.0006921
## F-statistic: 0.309 on 1 and 998 DF, p-value: 0.5784

#normality
s10 <- skewness(y)
k10 <- kurtosis(y)

jb10 <- (T/6)*((s10^2) +(1/4)*(k10-3)^2 )


ifelse(jb10>5.99,"Non Normal", "Normal")

## [1] "Non Normal"

#linearity
aux410 <- lm(model10$residuals[2:T]~y[1:T-1]+I(y[1:T-1]^2))
summary(aux410)

##
## Call:
## lm(formula = model10$residuals[2:T] ~ y[1:T - 1] + I(y[1:T -
## 1]^2))
##
## Residuals:
## Min 1Q Median 3Q Max
## -25.0906 -4.4419 0.3483 4.5827 24.9099
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -9.999e+02 6.823e-01 -1465.47 <2e-16 ***
## y[1:T - 1] 9.998e-01 1.562e-03 639.97 <2e-16 ***
## I(y[1:T - 1]^2) 6.778e-08 7.509e-07 0.09 0.928
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 7.011 on 996 degrees of freedom
## Multiple R-squared: 0.9999, Adjusted R-squared: 0.9999
## F-statistic: 3.338e+06 on 2 and 996 DF, p-value: < 2.2e-16

#homoskedasticity
aux510 <- lm(I(model10$residuals[2:T]^2)~y[1:T-1]+I(y[1:T-1]^2))
summary(aux510)

##
## Call:
## lm(formula = I(model10$residuals[2:T]^2) ~ y[1:T - 1] + I(y[1:T -
## 1]^2))
##
## Residuals:
## Min 1Q Median 3Q Max
## -37209 -3695 -173 3556 45471
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 9.995e+05 7.919e+02 1262 <2e-16 ***
## y[1:T - 1] -1.999e+03 1.813e+00 -1102 <2e-16 ***
## I(y[1:T - 1]^2) 9.993e-01 8.715e-04 1147 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 8137 on 996 degrees of freedom
## Multiple R-squared: 0.9992, Adjusted R-squared: 0.9992
## F-statistic: 6.58e+05 on 2 and 996 DF, p-value: < 2.2e-16

Question no. 11
#q no 11 -----
#creating data
T <- 1000
t <- c(1:1000)
y <- x <- vector(length = 1000)
x <- rnorm(T,25,sqrt(35))
y <- 20 + 2.5*x + 0.8*(x^2) + rnorm(1000,0,sqrt(5))

plot(y,type="l")
#estimating null model
model11 <- lm(y~x)
summary(model11)

##
## Call:
## lm(formula = y ~ x)
##
## Residuals:
## Min 1Q Median 3Q Max
## -33.70 -24.17 -14.54 8.15 379.60
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -446.8145 5.3939 -82.84 <2e-16 ***
## x 42.2750 0.2112 200.16 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 39.21 on 998 degrees of freedom
## Multiple R-squared: 0.9757, Adjusted R-squared: 0.9757
## F-statistic: 4.006e+04 on 1 and 998 DF, p-value: < 2.2e-16

#now testing for independence: u(t) = a + b1*x(t) + b2*x(t-1) + e(t)


aux1.11 <- lm(model11$residuals[2:T]~x[2:T]+x[1:T-1])
summary(aux1.11)
##
## Call:
## lm(formula = model11$residuals[2:T] ~ x[2:T] + x[1:T - 1])
##
## Residuals:
## Min 1Q Median 3Q Max
## -34.43 -24.12 -14.48 9.13 378.91
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -5.2431484 7.5413247 -0.695 0.487
## x[2:T] -0.0005503 0.2112866 -0.003 0.998
## x[1:T - 1] 0.2124021 0.2113988 1.005 0.315
##
## Residual standard error: 39.22 on 996 degrees of freedom
## Multiple R-squared: 0.001013, Adjusted R-squared: -0.0009934
## F-statistic: 0.5048 on 2 and 996 DF, p-value: 0.6038

#heterogeneous mean
aux2.11 <- lm(model11$residuals~x+t)
summary(aux2.11)

##
## Call:
## lm(formula = model11$residuals ~ x + t)
##
## Residuals:
## Min 1Q Median 3Q Max
## -35.62 -23.97 -14.63 8.08 381.08
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 3.189047 5.963822 0.535 0.593
## x -0.019761 0.211741 -0.093 0.926
## t -0.005390 0.004306 -1.252 0.211
##
## Residual standard error: 39.2 on 997 degrees of freedom
## Multiple R-squared: 0.001569, Adjusted R-squared: -0.0004336
## F-statistic: 0.7835 on 2 and 997 DF, p-value: 0.4571

#heterogeneous variance
aux3.11 <- lm(I(model11$residuals^2) ~ x + t)
summary(aux3.11)

##
## Call:
## lm(formula = I(model11$residuals^2) ~ x + t)
##
## Residuals:
## Min 1Q Median 3Q Max
## -1673 -1364 -1039 -768 142238
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 1786.3509 950.7302 1.879 0.0605 .
## x -12.5706 33.7549 -0.372 0.7097
## t 0.1209 0.6865 0.176 0.8602
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 6249 on 997 degrees of freedom
## Multiple R-squared: 0.000181, Adjusted R-squared: -0.001825
## F-statistic: 0.09025 on 2 and 997 DF, p-value: 0.9137

#normality
s.11 <- skewness(y)
k.11 <- kurtosis(y)

jb.11 <- (T/6)*((s.11^2) +(1/4)*(k.11-3)^2 )


ifelse(jb.11>5.99,"Non Normal", "Normal")

## [1] "Non Normal"

#linearity
aux4.11 <- lm(model11$residuals~x+I(x^2))
summary(aux4.11)

##
## Call:
## lm(formula = model11$residuals ~ x + I(x^2))
##
## Residuals:
## Min 1Q Median 3Q Max
## -6.3391 -1.3830 -0.0897 1.5679 6.6745
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 466.305244 0.871037 535.3 <2e-16 ***
## x -39.742054 0.070712 -562.0 <2e-16 ***
## I(x^2) 0.799531 0.001403 569.9 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 2.17 on 997 degrees of freedom
## Multiple R-squared: 0.9969, Adjusted R-squared: 0.9969
## F-statistic: 1.624e+05 on 2 and 997 DF, p-value: < 2.2e-16

#homoskedasticity
aux5.11 <- lm(I(model11$residuals^2)~x+I(x^2))
summary(aux5.11)
##
## Call:
## lm(formula = I(model11$residuals^2) ~ x + I(x^2))
##
## Residuals:
## Min 1Q Median 3Q Max
## -5534 -2093 535 2022 97359
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 56736.088 1693.281 33.51 <2e-16 ***
## x -4690.149 137.464 -34.12 <2e-16 ***
## I(x^2) 94.095 2.727 34.50 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 4219 on 997 degrees of freedom
## Multiple R-squared: 0.5442, Adjusted R-squared: 0.5433
## F-statistic: 595.3 on 2 and 997 DF, p-value: < 2.2e-16
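These auxiliary regressions on x and x^2 correspond to a White-type test; a minimal sketch, assuming the lmtest package is installed:

# Breusch-Pagan / White-type heteroskedasticity test with x and x^2 in the variance equation
library(lmtest)
bptest(model11, varformula = ~ x + I(x^2))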

Question no. 12
#creating data
T <- 1000
t <- c(1:1000)
y <- x <- vector(length = 1000)
x <- rnorm(T,25,sqrt(35))
d_t <- ifelse(t<=(T/2),1,0)
y[1] <-20 + 100*d_t[1] + 2*x[1] + rnorm(1,0,sqrt(15))
y[2] <- 20 + 100*d_t[2] + 2*x[2] + 0.85*y[1] + rnorm(1,0,sqrt(15))
for (i in 3:T){
y[i] <- 20 + 100*d_t[i] + 2*x[i] + 0.85*y[i-1] - 0.1*y[i-2] +
rnorm(1,0,sqrt(15))
}
plot(y,type="l")
#estimating null model
model12 <- lm(y[2:T]~x[2:T]+y[1:T-1])
summary(model12)

##
## Call:
## lm(formula = y[2:T] ~ x[2:T] + y[1:T - 1])
##
## Residuals:
## Min 1Q Median 3Q Max
## -102.426 -4.288 -0.148 4.084 135.305
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -48.467157 1.582737 -30.62 <2e-16 ***
## x[2:T] 2.019866 0.054224 37.25 <2e-16 ***
## y[1:T - 1] 0.995379 0.001591 625.60 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 10.07 on 996 degrees of freedom
## Multiple R-squared: 0.9975, Adjusted R-squared: 0.9975
## F-statistic: 1.969e+05 on 2 and 996 DF, p-value: < 2.2e-16

#now testing for independence: u(t) = a + b1*x(t) + b2*x(t-1) + e(t)


aux1.12 <- lm(model12$residuals[2:T]~x[2:T]+x[1:T-1])
summary(aux1.12)
##
## Call:
## lm(formula = model12$residuals[2:T] ~ x[2:T] + x[1:T - 1])
##
## Residuals:
## Min 1Q Median 3Q Max
## -104.521 -3.426 0.227 3.633 111.409
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 17.65819 1.68770 10.463 < 2e-16 ***
## x[2:T] -0.29755 0.04643 -6.409 2.26e-10 ***
## x[1:T - 1] -0.41012 0.04634 -8.850 < 2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 8.625 on 995 degrees of freedom
## (1 observation deleted due to missingness)
## Multiple R-squared: 0.1054, Adjusted R-squared: 0.1036
## F-statistic: 58.61 on 2 and 995 DF, p-value: < 2.2e-16

#heterogeneous mean
aux2.12 <- lm(model12$residuals[2:T]~x[2:T]+y[1:T-1]+t[2:T])
summary(aux2.12)

##
## Call:
## lm(formula = model12$residuals[2:T] ~ x[2:T] + y[1:T - 1] + t[2:T])
##
## Residuals:
## Min 1Q Median 3Q Max
## -96.893 -4.154 0.039 4.448 93.249
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 24.181863 2.408337 10.041 < 2e-16 ***
## x[2:T] -0.290956 0.046556 -6.250 6.10e-10 ***
## y[1:T - 1] -0.019129 0.002617 -7.309 5.51e-13 ***
## t[2:T] -0.015599 0.001820 -8.572 < 2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 8.649 on 994 degrees of freedom
## (1 observation deleted due to missingness)
## Multiple R-squared: 0.1014, Adjusted R-squared: 0.09868
## F-statistic: 37.39 on 3 and 994 DF, p-value: < 2.2e-16

#heterogeneous variance
aux3.12 <- lm(I(model12$residuals[2:T]^2) ~ x[2:T]+y[1:T-1]+t[2:T])
summary(aux3.12)
##
## Call:
## lm(formula = I(model12$residuals[2:T]^2) ~ x[2:T] + y[1:T - 1] +
## t[2:T])
##
## Residuals:
## Min 1Q Median 3Q Max
## -188.8 -100.6 -48.3 5.5 10805.9
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 463.1729 163.8864 2.826 0.00481 **
## x[2:T] 1.8428 3.1681 0.582 0.56093
## y[1:T - 1] -0.4783 0.1781 -2.686 0.00735 **
## t[2:T] -0.3928 0.1238 -3.172 0.00156 **
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 588.6 on 994 degrees of freedom
## (1 observation deleted due to missingness)
## Multiple R-squared: 0.01037, Adjusted R-squared: 0.007382
## F-statistic: 3.471 on 3 and 994 DF, p-value: 0.0157

#normality
s.12 <- skewness(y)
k.12 <- kurtosis(y)

jb.12 <- (T/6)*((s.12^2) +(1/4)*(k.12-3)^2 )


ifelse(jb.12>5.99,"Non Normal", "Normal")

## [1] "Non Normal"

#linearity
aux4.12 <- lm(model12$residuals[2:T] ~ x[2:T] + y[1:T-1] + I(x[2:T]^2) + I(y[1:T-1]^2))
summary(aux4.12)

##
## Call:
## lm(formula = model12$residuals[2:T] ~ x[2:T] + y[1:T - 1] + I(x[2:T]^2) +
## I(y[1:T - 1]^2))
##
## Residuals:
## Min 1Q Median 3Q Max
## -100.642 -4.085 -0.093 3.951 103.676
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 1.404e+01 7.235e+00 1.941 0.0526 .
## x[2:T] 1.248e-01 2.958e-01 0.422 0.6733
## y[1:T - 1] -5.956e-02 3.125e-02 -1.906 0.0569 .
## I(x[2:T]^2) -8.271e-03 5.865e-03 -1.410 0.1587
## I(y[1:T - 1]^2) 6.206e-05 3.246e-05 1.912 0.0562 .
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 8.942 on 993 degrees of freedom
## (1 observation deleted due to missingness)
## Multiple R-squared: 0.04046, Adjusted R-squared: 0.03659
## F-statistic: 10.47 on 4 and 993 DF, p-value: 2.622e-08

#homoskedasticity
aux5.12 <- lm(I(model12$residuals^2) ~ x[2:T]+y[1:T-1]+I(x[2:T]^2))
summary(aux5.12)

##
## Call:
## lm(formula = I(model12$residuals^2) ~ x[2:T] + y[1:T - 1] + I(x[2:T]^2))
##
## Residuals:
## Min 1Q Median 3Q Max
## -134.0 -102.4 -76.8 -37.0 18168.1
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -50.84193 345.34716 -0.147 0.883
## x[2:T] 17.22808 27.32142 0.631 0.528
## y[1:T - 1] -0.09618 0.13067 -0.736 0.462
## I(x[2:T]^2) -0.35237 0.54166 -0.651 0.515
##
## Residual standard error: 826.1 on 995 degrees of freedom
## Multiple R-squared: 0.001031, Adjusted R-squared: -0.001981
## F-statistic: 0.3422 on 3 and 995 DF, p-value: 0.7948
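For comparison, a minimal sketch (an addition for illustration) of re-estimating with the variables actually in the data-generating process: the structural dummy d_t, x, and both lags of y:

# the DGP includes d_t and two lags of y; re-estimate with those terms included
model12_full <- lm(y[3:T] ~ d_t[3:T] + x[3:T] + y[2:(T-1)] + y[1:(T-2)])
summary(model12_full)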
