
Question 3.1

R_{t+1} = \frac{P_{t+1} + D_{t+1}}{P_t} = \frac{P_{t+1}/D_{t+1} + 1}{P_t/D_t} \cdot \frac{D_{t+1}}{D_t}

Log-linearizing, define:
r_t = \log(R_t), \qquad d_t = \log(D_t), \qquad p_t = \log(P_t)

r_{t+1} = \log\!\left(\frac{P_{t+1}}{D_{t+1}} + 1\right) - (p_t - d_t) + \Delta d_{t+1}

Thus:

r_{t+1} = \log\!\left(1 + e^{p_{t+1} - d_{t+1}}\right) - (p_t - d_t) + \Delta d_{t+1}
Taking a first-order Taylor expansion of the term \log\!\left(1 + e^{p_{t+1} - d_{t+1}}\right) around the point P/D = e^{p - d}:

\log\!\left(1 + e^{p_{t+1} - d_{t+1}}\right) \approx \log\!\left(1 + e^{p - d}\right) + \frac{e^{p - d}}{1 + e^{p - d}}\left(p_{t+1} - d_{t+1} - (p - d)\right)
Setting:

\rho = \frac{e^{p - d}}{1 + e^{p - d}}

k = \log\!\left(1 + \frac{P}{D}\right) - \rho (p - d) = \log\!\left(1 + e^{p - d}\right) - \rho (p - d)
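
As a rough numerical illustration (not part of the original solution), the sketch below computes \rho and k for an assumed average price-dividend ratio of 25; any plausible annual P/D gives a \rho close to one.

## Illustrative only: assumed average price-dividend ratio of 25
PD_example  <- 25
rho_example <- PD_example / (1 + PD_example)   # e^(p-d)/(1+e^(p-d)) with e^(p-d) = P/D
k_example   <- log(1 + PD_example) - rho_example * log(PD_example)
rho_example   # about 0.96
k_example     # about 0.16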

So we can substitute \rho and k to get:

r_{t+1} = \log\!\left(1 + e^{p_{t+1} - d_{t+1}}\right) - (p_t - d_t) + \Delta d_{t+1}

r_{t+1} \approx k + \rho (p - d) + \rho\left(p_{t+1} - d_{t+1} - (p - d)\right) - (p_t - d_t) + \Delta d_{t+1}

Hence, collecting the terms in (p - d), we can derive the following approximation:

r_{t+1} \approx k + \rho (p_{t+1} - d_{t+1}) + \Delta d_{t+1} - (p_t - d_t)
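
The following sketch (added for illustration; the numbers are hypothetical) checks the accuracy of this approximation for one pair of price-dividend ratios near the expansion point, reusing rho_example and k_example from above.

## Illustrative check of the log-linear approximation (hypothetical values)
pd_t  <- log(24)    # p_t - d_t
pd_t1 <- log(27)    # p_{t+1} - d_{t+1}
dd_t1 <- 0.03       # Delta d_{t+1}
exact_r  <- log(1 + exp(pd_t1)) - pd_t + dd_t1
approx_r <- k_example + rho_example * pd_t1 + dd_t1 - pd_t
c(exact = exact_r, approx = approx_r)   # the two values agree to about three decimals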
Question 3.2
By rearranging identity 4, we get:

d_t - p_t = -k + r_{t+1} - \Delta d_{t+1} + \rho (d_{t+1} - p_{t+1})

Then, to find the analytical relationship between the coefficients b_r, b_d, and b_y, we take the variance of d_t - p_t:
\operatorname{var}(d_t - p_t) = \operatorname{cov}(d_t - p_t,\, d_t - p_t)

\operatorname{var}(d_t - p_t) = \operatorname{cov}(d_t - p_t,\, r_{t+1}) - \operatorname{cov}(d_t - p_t,\, \Delta d_{t+1}) + \rho\, \operatorname{cov}(d_t - p_t,\, d_{t+1} - p_{t+1})

Dividing both sides by \operatorname{var}(d_t - p_t):

1 = \frac{\operatorname{cov}(d_t - p_t,\, r_{t+1}) - \operatorname{cov}(d_t - p_t,\, \Delta d_{t+1}) + \rho\, \operatorname{cov}(d_t - p_t,\, d_{t+1} - p_{t+1})}{\operatorname{var}(d_t - p_t)}

Then, from the three forecasting regressions of r_{t+1}, \Delta d_{t+1}, and d_{t+1} - p_{t+1} on d_t - p_t, the regression coefficients are:


b_r = \frac{\operatorname{cov}(d_t - p_t,\, r_{t+1})}{\operatorname{var}(d_t - p_t)}

b_d = \frac{\operatorname{cov}(d_t - p_t,\, \Delta d_{t+1})}{\operatorname{var}(d_t - p_t)}

b_y = \frac{\operatorname{cov}(d_t - p_t,\, d_{t+1} - p_{t+1})}{\operatorname{var}(d_t - p_t)}

Recognizing each covariance term in the relationship above as one of these regression coefficients, we obtain:

1 = b_r - b_d + \rho\, b_y
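
Because this is an in-sample accounting identity, it can be verified directly on simulated data. The sketch below is added for illustration and is not part of the original answer; the parameter values are arbitrary. It generates series that satisfy the log-linear identity by construction and checks that the three regression slopes obey the relationship.

## Simulation sketch: verify 1 = b_r - b_d + rho*b_y (illustrative, assumed parameters)
set.seed(1)
T_sim <- 500
rho_s <- 0.96                                      # assumed rho
k_s   <- 0.15                                      # assumed constant k
dp    <- numeric(T_sim)                            # dp_t = d_t - p_t
dp[1] <- -3.2
for (t in 2:T_sim) dp[t] <- -0.13 + 0.96 * dp[t - 1] + rnorm(1, sd = 0.15)
dd <- rnorm(T_sim, mean = 0.02, sd = 0.10)         # Delta d_{t+1}
r  <- k_s - rho_s * dp[-1] + dd[-1] + dp[-T_sim]   # returns implied by the identity

x   <- dp[-T_sim]                                  # d_t - p_t
b_r <- coef(lm(r      ~ x))[2]
b_d <- coef(lm(dd[-1] ~ x))[2]
b_y <- coef(lm(dp[-1] ~ x))[2]
b_r - b_d + rho_s * b_y                            # equals 1 up to machine precision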

Question 3.3
library(readxl)
library(dplyr)

##
## Attaching package: 'dplyr'

## The following objects are masked from 'package:stats':
##
##     filter, lag

## The following objects are masked from 'package:base':
##
##     intersect, setdiff, setequal, union
data_3<-read_xls("ps2_data.xls")
colnames(data_3)[4]<-"Return_inc_dividends"
colnames(data_3)[5]<-"Return_exc_dividends"
colnames(data_3)[6]<-"Rf"

## Compute ratios (the log excess returns, the dividend growth, D/P)
data_3 <- data_3 %>%
  filter(year > 1946 & year < 2010) %>%
  mutate(DP = (1 + Return_inc_dividends) / (1 + Return_exc_dividends) - 1) %>%
  mutate(Excess_r = log(Return_inc_dividends + 1) - log(Rf + 1)) %>%
  mutate(Div_Growth = (DP / lag(DP)) * (Return_exc_dividends + 1))
data_3$Div_Growth[is.na(data_3$Div_Growth)] <- 1
data_3 <- data_3 %>%
  mutate(Acc_Div_Growth = cumprod(Div_Growth))
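
To see why these constructions recover D/P and dividend growth, note that (assuming the standard definitions 1 + R^{inc}_{t+1} = (P_{t+1} + D_{t+1})/P_t for the return including dividends and 1 + R^{exc}_{t+1} = P_{t+1}/P_t for the return excluding dividends):

\frac{1 + R^{inc}_{t+1}}{1 + R^{exc}_{t+1}} - 1 = \frac{P_{t+1} + D_{t+1}}{P_{t+1}} - 1 = \frac{D_{t+1}}{P_{t+1}},
\qquad
\frac{D_{t+1}/P_{t+1}}{D_t/P_t} \cdot \frac{P_{t+1}}{P_t} = \frac{D_{t+1}}{D_t}

so DP is the dividend-price ratio, Div_Growth is gross dividend growth, and cumprod turns the latter into the dividend level index Acc_Div_Growth.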

## Regression relationships from part ii

## 'tail' means at t+1, 'head' means at t
Regress_r <- lm(tail(data_3$Excess_r, -1) ~ log(head(data_3$DP, -1)))
Regress_d <- lm(log(tail(data_3$Acc_Div_Growth, -1) / head(data_3$Acc_Div_Growth, -1))
                ~ log(head(data_3$DP, -1)))
Regress_y <- lm(log(tail(data_3$DP, -1)) ~ log(head(data_3$DP, -1)))

## Beta estimates from the direct regressions
sum_r <- summary(Regress_r)
beta_r <- sum_r$coefficients[2, 1]
sum_d <- summary(Regress_d)
beta_d <- sum_d$coefficients[2, 1]
sum_y <- summary(Regress_y)
beta_y <- sum_y$coefficients[2, 1]

## Implied beta_r calculation using the part ii relationship

## First, we need to calculate rho from the sample mean price-dividend ratio
## (this overwrites data_3 with a one-row summary; the full data are not needed below)
data_3 <- data_3 %>%
  summarise(PD = mean(1 / DP))
rho <- data_3$PD / (data_3$PD + 1)
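
This uses the part i definition of \rho evaluated at the sample average price-dividend ratio: with P/D = e^{p - d},

\rho = \frac{e^{p - d}}{1 + e^{p - d}} = \frac{P/D}{1 + P/D}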

## Then, we can get b_r from the relationship in part ii
implied_b_r <- 1 + beta_d - rho * beta_y

## Comparison of the implied return beta and the regression-estimated beta by RMSE;
## implied_b_r is the predicted value, beta_r is the actual value
## (with a single pair of values, the RMSE reduces to the absolute difference)
rmse <- sqrt((implied_b_r - beta_r)^2)
print(rmse)

## [1] 0.01043867
To sum up, after comparing the implied beta with the estimated regression beta, we find that the RMSE is about 0.01, which is very close to 0 and provides strong evidence for the relationship in part ii. Hence, this comparison strengthens the belief in stock market predictability.
