ACTL2102 Final Notes
Markov Process
o Pr( X(tn) <= xn | X(tn-1) = xn-1, ..., X(t0) = x0 ) = Pr( X(tn) <= xn | X(tn-1) = xn-1 )
o Implication; only the current state influences the future
o Markov chain is just a Markov process with a discrete index set
and a discrete state space
o Hence for a Markov chain; Pr( Xt+1 = xt+1 | Xt = xt, ..., X0 = x0 ) =
Pr( Xt+1 = xt+1 | Xt = xt )
o Conditional probabilities; Pij(t,t+1) = Pr( Xt+1 = j | Xt = i ), also
known as one-step transition probabilities
o If these probabilities do not depend on t, the transition
probabilities are stationary (time-homogeneous) and we write Pij
Classification of States
o Absorbing State if Pii = 1
o State j is accessible from state i if Pij^n > 0 for some n >= 0
o States i and j communicate if each is accessible from the other
(you can move to the other state and then return)
o If two states communicate, then they are in the same class
Period of States
o d(i) is the greatest common divisor of all n >= 1 such that Pii^n > 0
o If Pii^n = 0 for all n >= 1, then d(i) = 0, meaning it is impossible
to return to the state
o A state with period 1 is aperiodic (i.e. in the long run the chain
can return to the state at every step)
o Period is a class property, therefore period of all states in a
class is the same
Limiting probabilities
o If state j is transient (i.e. fj < 1), then lim Pij^n = 0
o For an irreducible and ergodic MC, lim Pij^n = πj exists and the
limiting probability is independent of the initial state i
o The limiting probability is also the probability of the process
being in that state
o Also equals the long run proportion of time the MC is in that
state
o In matrix form; 0 = (P − I)^T π
o To solve this system, must also use the constraint that the
limiting probabilities sum to 1
o πj = 1/mjj, where mjj is the expected number of transitions until
the MC, started in state j, returns to state j
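The solve above can be sketched in numpy; the 2-state transition matrix is made up for illustration:

```python
import numpy as np

# Hypothetical 2-state transition matrix (rows sum to 1)
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

n = P.shape[0]
# The system 0 = (P - I)^T pi has rank n-1, so replace one equation
# with the normalisation constraint sum(pi) = 1
A = (P - np.eye(n)).T
A[-1, :] = 1.0
b = np.zeros(n)
b[-1] = 1.0

pi = np.linalg.solve(A, b)
print(pi)   # limiting probabilities, here [4/7, 3/7]
```

Any row can be overwritten with the sum-to-1 constraint, since the system is rank-deficient by exactly one.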
Time Reversible MC
o Used for Monte Carlo Simulations and finding limiting
probabilities
o A MC is time reversible if Qij = Pij for all i and j
o Qij = Pr( Xt = j | Xt+1 = i ) is the transition probability of the
reversed chain
o For a time reversible MC, the rate at which the process goes from
state i to state j is the same as the rate at which it goes from j to i
(i.e. πi * Pij = πj * Pji)
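A birth-and-death chain (transitions only between neighbouring states) is always time reversible; a small numpy check of the detailed balance condition πi*Pij = πj*Pji, with made-up transition probabilities:

```python
import numpy as np

# Hypothetical 3-state birth-death chain: moves only to adjacent states
P = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.6, 0.4]])

# Stationary distribution = left eigenvector of P for eigenvalue 1
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi = pi / pi.sum()

# Detailed balance: the matrix of flows pi_i * P_ij must be symmetric
flows = pi[:, None] * P
balance = np.allclose(flows, flows.T)
print(pi, balance)   # pi = [2/7, 10/21, 5/21], balance = True
```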
Branching Process
o If each individual independently produces j offspring with
probability pj, then the mean number of offspring per individual is
μ = Σ j*pj, and starting from one individual the expected size of
generation n is μ^n
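A quick simulation sketch of a branching process; the offspring distribution (p0 = 0.25, p1 = 0.25, p2 = 0.5) is made up, chosen so that the extinction probability, the smallest root of s = Σ pj*s^j, is exactly 0.5:

```python
import random

random.seed(1)

# Hypothetical offspring distribution: P(0)=0.25, P(1)=0.25, P(2)=0.5,
# so mu = 1.25 > 1 and the extinction probability solves
# s = 0.25 + 0.25*s + 0.5*s^2, giving s = 0.5
offspring_probs = [0.25, 0.25, 0.5]

def next_generation(size):
    # Each current individual independently produces 0, 1 or 2 offspring
    return sum(random.choices([0, 1, 2], weights=offspring_probs, k=size))

def extinct_by(generations, start=1):
    size = start
    for _ in range(generations):
        if size == 0:
            return True
        size = next_generation(size)
    return size == 0

trials = 1000
p_ext = sum(extinct_by(20) for _ in range(trials)) / trials
print(p_ext)   # estimate of the extinction probability (near 0.5)
```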
Important properties
o Distribution of waiting times for Poisson processes follows
Gamma(n, λ), where n is the number of events and λ is the rate at
which events occur
o For independent exponentials, min(X1, X2, ..., Xn) follows an
exponential with rate = sum of all the λi
o Probability that Xi is the smallest is λi divided by the sum of all
the rates
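The last two properties can be checked by simulation; the rates here are arbitrary:

```python
import random

random.seed(0)

# Three independent exponentials with made-up rates
rates = [1.0, 2.0, 3.0]

trials = 20000
wins = [0, 0, 0]
for _ in range(trials):
    draws = [random.expovariate(lam) for lam in rates]
    wins[draws.index(min(draws))] += 1

# P(X_i is the smallest) = rate_i / sum of rates = 1/6, 2/6, 3/6
estimates = [w / trials for w in wins]
print(estimates)
```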
Poisson Process
o Special counting process where N(0) = 0, with independent
increments + stationary increments, and the number of events in any
interval of length t following a Poisson distribution with mean λ*t
o Alternative definition; N(0) = 0, independent & stationary
increments, and the following conditions are satisfied;
Pr( N(h) = 1 ) = λh + o(h)
Pr( N(h) >= 2 ) = o(h)
Inter-arrival Time;
o Tn is the time between the (n-1)th and nth events. It follows an
exponential RV with rate λ.
o In a Poisson process, the inter-arrival times are i.i.d.
Exponential(λ) and, since the exponential is memoryless, the time
until the next claim is independent of the time elapsed since the
last claim
Waiting Time;
o Waiting time = arrival time of the nth event (i.e. Sn = T1 + T2 +
... + Tn)
o Follows Gamma(n, λ)
o Convolutions of exponential variables; when the rates of T1, T2, ...
are different
Sn is said to have a hypoexponential distribution
When n = 2 with rates λ1 ≠ λ2, the probability density function is
f(t) = (λ1*λ2 / (λ2 − λ1)) * ( exp(−λ1*t) − exp(−λ2*t) )
General case (all rates distinct);
f(t) = Σi Cn,i * λi * exp(−λi*t), where Cn,i = Πj≠i λj/(λj − λi),
a weighted combination of the individual pdfs
Arrival Times distribution; given N(t) = n, the arrival times are
distributed as the order statistics of n i.i.d. Uniform(0, t)
variables; f(s1, s2, ..., sn | N(t) = n) = n!/t^n for 0 < s1 < ... < sn < t
Sj = arrival time of the jth event (cumulative time for j events to
occur)
Number of claims distribution; given N(t) = n, the number of events
in (0, s] with s < t follows Binomial(n, s/t)
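This gives a two-step way to simulate a Poisson path on (0, t]: draw N(t), then place the arrival times as sorted uniforms. A sketch with made-up parameters:

```python
import random

random.seed(42)

lam, t = 2.0, 10.0

# Step 1: draw N(t) by accumulating Exponential(lam) inter-arrival
# times until the running clock passes t
n = 0
clock = random.expovariate(lam)
while clock <= t:
    n += 1
    clock += random.expovariate(lam)

# Step 2: given N(t) = n, the arrival times are distributed as the
# order statistics of n independent Uniform(0, t) draws
arrivals = sorted(random.uniform(0, t) for _ in range(n))
print(n, arrivals[:3])
```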
First holding time; the time spent in a state before the first jump is
memoryless
o Follows an exponential distribution with parameter vi, the rate at
which the process makes a transition when in state i
o E(Ti) = 1/vi
o Pr( Ti > s ) = exp(−vi*s); the occupancy probability
o PDF of the holding time; vi*exp(−vi*t)
o Occupation time; total time the process spends in state j during
(0, t), given that it starts in state i
o Calculating occupation time; integrate the transition probability,
i.e. the integral from 0 to t of Pij(s) ds
Note that all Kolmogorov equations must satisfy the condition
P(0) = Identity Matrix
Limiting probabilities (Pj) for a MJP are independent of the initial
state and exist if the following conditions are met;
o All states communicate, i.e. starting in state i there is positive
probability of ever being in state j, for all i and j
o The chain is positive recurrent; the mean time to return to any
state is finite
WEEK 5 APPLICATIONS
Kolmogorov's Equations
o Backward equation; P'(t) = Q*P(t); forward equation; P'(t) = P(t)*Q,
where Q is the generator matrix of transition rates
o Non-homogeneous case; the transition rates may depend on time,
Q = Q(t)
General case;
o Transition rates for the forward equation in components;
P'ij(t) = Σk≠j Pik(t)*qkj − vj*Pij(t)
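A numerical sketch of the forward equation P'(t) = P(t)*Q, Euler-stepped from P(0) = I; the 2-state generator is made up:

```python
import numpy as np

# Hypothetical generator: leave state 0 at rate 1, state 1 at rate 2
# (off-diagonals are transition rates, rows sum to 0)
Q = np.array([[-1.0, 1.0],
              [2.0, -2.0]])

# Kolmogorov forward equation P'(t) = P(t) Q, starting from P(0) = I,
# integrated with small Euler steps up to t = 10
h, steps = 1e-4, 100_000
P = np.eye(2)
for _ in range(steps):
    P = P + h * (P @ Q)

print(P)   # each row approaches the limiting distribution [2/3, 1/3]
```

Because the rows of Q sum to 0, each Euler step P(I + hQ) preserves row sums, so P(t) stays a proper transition matrix.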
Common Situations
o Simulating Standard Normal; given a simulated X distributed as |Z|
and U ~ Uniform(0,1); if U < 0.5, set Z = −X, otherwise set Z = X
o Generating a Normal RV with mean μ and variance σ^2; simulate a
standard normal Z and set X = μ + σZ
o Simulating log-normal RV Y; Simulate Normal RV X then set Y
= exp(X)
o Gamma distribution (n, λ); generate n exponential variables and
then add them
Generate n random numbers U1, ..., Un from Uniform (0,1)
Set X = (−1/λ) * Σ log(Ui)
o Simulating a Poisson RV using the inverse transform formula;
o Simulating chi-squared; for odd degrees of freedom n, simulate a
chi-squared distribution with n − 1 degrees of freedom and then add
Z^2, where Z is a simulated value from the standard normal
distribution
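The Gamma(n, λ) recipe above, sketched with the inverse transform for each exponential (the parameters are arbitrary):

```python
import math
import random

random.seed(7)

def gamma_rv(n, lam):
    # Sum of n i.i.d. Exponential(lam) draws, each via inverse
    # transform X = -(1/lam) * log(U), U ~ Uniform(0,1)
    return sum(-math.log(1.0 - random.random()) / lam for _ in range(n))

n, lam = 3, 2.0
samples = [gamma_rv(n, lam) for _ in range(50_000)]
mean = sum(samples) / len(samples)
print(mean)   # should be near n / lam = 1.5
```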
Variance reduction can be achieved using the following;
o Using the antithetic variables technique to reduce variance
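A sketch of antithetic variables on a toy integral E[e^U] = e − 1 with U ~ Uniform(0,1); pairing each draw U with 1 − U gives negatively correlated values and a smaller variance for the same number of function evaluations:

```python
import math
import random

random.seed(3)

r = 20_000

# Plain Monte Carlo with 2r independent draws
plain = [math.exp(random.random()) for _ in range(2 * r)]

# Antithetic: r pairs (U, 1-U), averaged within each pair
antithetic = []
for _ in range(r):
    u = random.random()
    antithetic.append((math.exp(u) + math.exp(1.0 - u)) / 2)

def mean_var(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return m, v

m1, v1 = mean_var(plain)
m2, v2 = mean_var(antithetic)
print(m1, m2)   # both near e - 1
print(v1, v2)   # the antithetic per-pair variance is far smaller
```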
Number of Simulations
o Selecting the number of simulations r so that the estimate is
within a desired accuracy d of the true value
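A sketch of the usual rule: with a pilot estimate s of the standard deviation, choose r so the 95% confidence half-width z*s/sqrt(r) is at most d; the numbers below are made up:

```python
import math

z = 1.96    # 97.5% standard normal quantile (95% two-sided CI)
s = 2.5     # standard deviation estimated from a pilot run (assumed)
d = 0.5     # desired accuracy (CI half-width)

# Half-width z*s/sqrt(r) <= d  <=>  r >= (z*s/d)^2
r = math.ceil((z * s / d) ** 2)
print(r)   # 97 simulations needed
```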
General Procedure
o Remove or model trend and seasonal effects
o Analyse the residual series Nt by choosing a probability model,
estimating unknown parameters, checking the model for goodness of
fit, and using the fitted model to enhance our understanding
Differencing method
o Differencing at lag 1; ∇Xt = Xt − Xt−1 = (1 − B)Xt, where B is the
backshift operator; removes a linear trend
Seasonal Differencing
o Differencing at lag d; ∇d Xt = Xt − Xt−d = (1 − B^d)Xt removes a
seasonal component of period d
o Linear process representation; Xt = ψ(B)*Zt
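A numpy sketch of seasonal differencing on a made-up series with a linear trend plus a period-4 seasonal pattern; here (1 − B^4) removes both at once:

```python
import numpy as np

# Made-up series: linear trend 2t plus a repeating period-4 pattern
t = np.arange(40)
x = 2.0 * t + np.tile([5.0, -1.0, 3.0, 0.0], 10)

# Seasonal differencing at lag 4: (1 - B^4) x_t = x_t - x_{t-4}
seasonal_diff = x[4:] - x[:-4]
print(seasonal_diff[:5])   # constant 8.0 = slope 2.0 * lag 4
```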
Properties of stationarity;
o Covariance structure does not change over time
o Process can be represented by a linear process
o If {Xt} is stationary and Yt = ψ(B) * Xt with absolutely summable
coefficients, then {Yt} is also stationary
o Weak stationarity means E(Xt) = μ is constant and Cov(Xt, Xt+h)
depends only on the lag h, not on t
Analogous to the causality condition; if the roots of θ(z) lie
outside the unit circle, then the process is invertible