Solutions to Selected Problems

Chapter 1. Preliminaries
1. For all A ∈ FS and all t ≥ 0, A ∩ {T ≤ t} = (A ∩ {S ≤ t}) ∩ {T ≤ t}, since {T ≤ t} ⊂ {S ≤ t}. Since A ∩ {S ≤ t} ∈ Ft and {T ≤ t} ∈ Ft, we get A ∩ {T ≤ t} ∈ Ft. Thus FS ⊂ FT.
2. Let Ω = ℕ and F = P(ℕ) be the power set of the natural numbers. Let Fn = σ({2}, {3}, . . . , {n + 1}) for all n. Then (Fn)_{n≥1} is a filtration. Let S = 4 − 1_{{3}} (so that S = 3 on {3} and S = 4 elsewhere) and T = 4. Then S ≤ T, and

{ω : S(ω) = n} = {3} if n = 3, Ω \ {3} if n = 4, and ∅ otherwise;
{ω : T(ω) = n} = Ω if n = 4, and ∅ otherwise.

Hence {S = n} ∈ Fn and {T = n} ∈ Fn for all n, so S and T are both stopping times. However, {ω : T − S = 1} = {ω : 1_{{3}}(ω) = 1} = {3} ∉ F1. Therefore T − S is not a stopping time.
3. Observe that {Tn ≤ t} ∈ Ft and {Tn < t} ∈ Ft for all n ∈ ℕ, t ∈ ℝ+, since each Tn is a stopping time and we assume the usual hypotheses. Then:
(1) sup_n Tn is a stopping time, since ∀t ≥ 0, {sup_n Tn ≤ t} = ∩_n {Tn ≤ t} ∈ Ft.
(2) inf_n Tn is a stopping time, since {inf_n Tn < t} = ∪_n {Tn < t} ∈ Ft.
(3) lim sup_{n→∞} Tn is a stopping time, since lim sup_{n→∞} Tn = inf_m sup_{n≥m} Tn (apply (1) and (2)).
(4) lim inf_{n→∞} Tn is a stopping time, since lim inf_{n→∞} Tn = sup_m inf_{n≥m} Tn (apply (1) and (2)).
(b) By (a), Mt ∈ L^p ⊂ L¹. For t ≥ s ≥ 0, E(Mt|Fs) = E(E(X|Ft)|Fs) = E(X|Fs) = Ms a.s., so {Mt} is a martingale. Next, we show that {Mt} is continuous. By Jensen's inequality, for p > 1,

E|M^n_t − Mt|^p = E|E(M^n_∞ − X|Ft)|^p ≤ E|M^n_∞ − X|^p, ∀t ≥ 0.   (1)

It follows that sup_t E|M^n_t − Mt|^p ≤ E|M^n_∞ − X|^p → 0 as n → ∞. Fix an arbitrary ε > 0. By Chebyshev's and Doob's inequalities,

P(sup_t |M^n_t − Mt| > ε) ≤ (1/ε^p) E(sup_t |M^n_t − Mt|^p) ≤ (p/(p−1))^p sup_t E|M^n_t − Mt|^p / ε^p → 0.   (2)

Therefore M^n converges to M uniformly in probability. There exists a subsequence {n_k} such that M^{n_k} converges uniformly to M with probability 1. Then M is continuous, since for almost all ω it is a uniform limit of continuous paths.
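(For the subsequence step in detail: choose n_k such that P(sup_t |M^{n_k}_t − Mt| > 2^{−k}) ≤ 2^{−k}; then Σ_k P(sup_t |M^{n_k}_t − Mt| > 2^{−k}) < ∞, so by the Borel–Cantelli lemma, almost surely sup_t |M^{n_k}_t − Mt| ≤ 2^{−k} for all but finitely many k, which is the asserted a.s. uniform convergence.)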
6. Let p(n) denote the probability mass function of the Poisson distribution with parameter λt, and assume λt is an integer, as given. Using |x| = x + 2x^−, where x^− = max(−x, 0), and E(Nt − λt) = 0,

E|Nt − λt| = E(Nt − λt) + 2E(Nt − λt)^− = 2E(Nt − λt)^− = 2 Σ_{n=0}^{λt} (λt − n) p(n)
= 2e^{−λt} Σ_{n=0}^{λt} (λt − n) (λt)^n/n! = 2λt e^{−λt} [Σ_{n=0}^{λt} (λt)^n/n! − Σ_{n=0}^{λt−1} (λt)^n/n!] = 2e^{−λt} (λt)^{λt}/(λt − 1)!   (3)

E(Nt − Ns)² = E(N_{t−s})² = Var(N_{t−s}) + (EN_{t−s})² = λ(t − s)[1 + λ(t − s)].   (4)
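As a quick sanity check of (3), take λt = 1: the formula gives E|Nt − λt| = 2e^{−1} ≈ 0.7358, which matches the direct computation E|N − 1| = P(N = 0) + Σ_{n≥2} (n − 1)P(N = n) = e^{−1} + (EN − 1 + P(N = 0)) = 2e^{−1} for N Poisson with mean 1.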
Let Xi = (N^i_t − t)/i. Then {Xi}_i is a sequence of independent random variables such that EXi = 0 and Var(Xi) = t/i², hence Σ_{i=1}^∞ Var(Xi) < ∞. Therefore Kolmogorov's convergence criterion implies that Σ_{i=1}^∞ Xi converges almost surely. On the other hand, Σ_{i=1}^∞ t/i = ∞. Therefore Σ_{s≤t} ΔMs = Σ_{i=1}^∞ N^i_t/i = ∞.
10. (a) Let Nt = Σ_i (1/i)(N^i_t − t) and Lt = Σ_i (1/i)(L^i_t − t). As we showed in exercise 9(a), N and L are well defined in the L² sense. Then, by the linearity of the L² space, M is also well defined in the L² sense, since

Mt = Σ_i (1/i)[(N^i_t − t) − (L^i_t − t)] = Σ_i (1/i)(N^i_t − t) − Σ_i (1/i)(L^i_t − t) = Nt − Lt.   (7)

Both terms on the right side are martingales which change only by jumps, as shown in exercise 9(b). Hence Mt is a martingale which changes only by jumps.
(b) First we show that, given two independent Poisson processes N and L, Σ_{s>0} ΔNs ΔLs = 0 a.s., i.e. N and L almost surely do not jump simultaneously. Let {Tn}_{n≥1} be the sequence of jump times of the process L. Then Σ_{s>0} ΔNs ΔLs = Σ_n ΔN_{Tn}, and we want to show that Σ_n ΔN_{Tn} = 0 a.s. Since ΔN_{Tn} ≥ 0, it is enough to show that EΔN_{Tn} = 0 for all n ≥ 1.

Fix n ≥ 1 and let µ_{Tn} be the probability measure on ℝ+ induced by Tn. By conditioning,

E(ΔN_{Tn}) = E[E(ΔN_{Tn} | Tn)] = ∫_0^∞ E(ΔN_{Tn} | Tn = t) µ_{Tn}(dt) = ∫_0^∞ E(ΔNt) µ_{Tn}(dt),   (8)

where the last equality is by the independence of N and Tn. Since ΔNt ∈ L¹ and P(ΔNt = 0) = 1 by problem 25, EΔNt = 0 for every t, and hence EΔN_{Tn} = 0.

Next we show that the previous claim holds even when there are countably many Poisson processes. Assume that there exist countably many independent Poisson processes {N^i}_{i≥1}. Let A ⊂ Ω be the set on which at least two of the processes {N^i}_{i≥1} jump simultaneously, and let Ω_{ij} denote the set on which N^i and N^j do not jump simultaneously. Then P(Ω_{ij}) = 1 for i ≠ j by the previous assertion. Since A ⊂ ∪_{i>j} Ω^c_{ij}, P(A) ≤ Σ_{i>j} P(Ω^c_{ij}) = 0. Therefore, almost surely, no two processes jump simultaneously.

Going back to the main proof, by (a) and the fact that N and L do not jump simultaneously, for all t > 0,

Σ_{s≤t} |ΔMs| = Σ_{s≤t} |ΔNs| + Σ_{s≤t} |ΔLs| = ∞ a.s.   (9)
11. Continuity: We use the notation adopted in Example 2 of Section 4 (p. 33). Assume E|U1| < ∞. By the independence of the {Ui}, an elementary inequality, Markov's inequality, and the properties of the Poisson process, we observe

lim_{s→t} P(|Zt − Zs| > ε) = lim_{s→t} Σ_k P(|Zt − Zs| > ε | Nt − Ns = k) P(Nt − Ns = k)
≤ lim_{s→t} Σ_k P(Σ_{i=1}^k |Ui| > ε) P(Nt − Ns = k) ≤ lim_{s→t} Σ_k k P(|U1| > ε/k) P(Nt − Ns = k)   (10)
≤ lim_{s→t} (E|U1|/ε) Σ_k k² P(Nt − Ns = k) = lim_{s→t} (E|U1|/ε) [λ(t − s) + {λ(t − s)}²] = 0.
Independent increments: Let F be the distribution function of U1. Using the independence of the {Uk} and the strong Markov property of N, for arbitrary t ≥ s ≥ 0,

E(e^{iu(Zt − Zs) + ivZs}) = E(e^{iu Σ_{k=Ns+1}^{Nt} Uk + iv Σ_{k=1}^{Ns} Uk})
= E[E(e^{iu Σ_{k=Ns+1}^{Nt} Uk + iv Σ_{k=1}^{Ns} Uk} | Fs)]
= E[e^{iv Σ_{k=1}^{Ns} Uk} E(e^{iu Σ_{k=Ns+1}^{Nt} Uk} | Fs)] = E(e^{iv Σ_{k=1}^{Ns} Uk}) E(e^{iu Σ_{k=Ns+1}^{Nt} Uk})   (11)
= E(e^{ivZs}) E(e^{iu(Zt − Zs)}).
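For reference, iterating the same conditioning argument gives the standard formula for the characteristic function of a compound Poisson process (writing φ_U for the characteristic function of U1):

E(e^{iuZt}) = Σ_k φ_U(u)^k e^{−λt}(λt)^k/k! = exp[λt(φ_U(u) − 1)],

from which the stationarity of the increments is also immediate.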
12. By exercise 11, a compound Poisson process is a Lévy process and has stationary and independent increments. Moreover,

E|Zt − λtEU1| ≤ E(E(|Zt| | Nt)) + λtE|U1| ≤ Σ_{n=1}^∞ E(Σ_{i=1}^n |Ui| | Nt = n) P(Nt = n) + λtE|U1|
= E|U1| ENt + λtE|U1| = 2λtE|U1| < ∞,   (14)

E(Zt − EU1 λt | Fs) = Zs − EU1 λs a.s., and {Zt − EU1 λt}_{t≥0} is a martingale.
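The martingale property asserted in the last line follows from the independence and stationarity of the increments together with Wald's identity:

E(Zt − Zs | Fs) = E(Zt − Zs) = E(Z_{t−s}) = E(N_{t−s}) EU1 = λ(t − s) EU1,

and subtracting EU1 λ(t − s) from both sides gives E(Zt − EU1 λt | Fs) = Zs − EU1 λs.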
where Z′t = ∫_ℝ x Nt(·, dx) and β = α − ∫_{|x|≥1} x ν(dx). By theorem 43, E(e^{iuZ′t}) = exp(−t ∫_ℝ (1 − e^{iux}) ν(dx)). Z′t is a compound Poisson process (see problem 11). Its arrival rate (intensity) is λ, since E(∫_ℝ Nt(·, dx)) = t ∫_ℝ ν(dx) = λt. Since Zt is a martingale, EZ′t = −βt. Since EZ′t = E(∫_ℝ x Nt(·, dx)) = t ∫_ℝ x ν(dx), we get β = −∫_ℝ x ν(dx). It follows that Zt = Z′t − λt ∫_ℝ x (ν/λ)(dx) is a compensated compound Poisson process. Then problem 12 shows that EU1 = ∫_ℝ x (ν/λ)(dx). It follows that the distribution of the jumps is µ = (1/λ)ν.
Then

E(Zt | Fs) = E(Zs + Σ_{i=1}^∞ Ui 1_{s<Ti≤t} | Fs) = Zs + Σ_{i=1}^∞ E(Ui 1_{s<Ti≤t} | Fs)
= Zs + Σ_{i=1}^∞ E(Ui E(1_{s<Ti≤t} | Fs ∨ σ(Ui : i ≥ 1)) | Fs) = Zs + Σ_{i=1}^∞ E(Ui E(1_{s<Ti≤t} | Fs) | Fs)   (18)
= Zs + Σ_{i=1}^∞ E(Ui | Fs) E(1_{Ti≤t} | Fs) 1_{Ti>s} = Zs a.s.,

since E(Ui | Fs)1_{Ti>s} = 0 a.s. Note that if we drop the assumption ENt < ∞, Zt is not a martingale in general (though it is a local martingale).
Zt = Σ_k βk (N^k_t − αk t)   (20)

To verify that Zt ∈ L², we observe that E(Σ_{k=m}^n βk (N^k_t − αk t))² = t Σ_{k=m}^n βk² αk → 0 as n, m → ∞, since Σ_{k=1}^∞ βk² αk < ∞. Therefore Zt is the limit of a Cauchy sequence in L², and since the L² space is complete, Zt ∈ L².
16. Let Ft be the natural filtration of Bt, satisfying the usual hypotheses. By the stationary increments property of Brownian motion and symmetry,

Wt = B_{1−t} − B1 =_d B1 − B_{1−t} =_d B_{1−(1−t)} = Bt   (21)

(equalities in distribution). This shows that Wt is Gaussian. W has stationary increments because Wt − Ws = B_{1−t} − B_{1−s} =_d B_{t−s} for s ≤ t. Let Gt be the natural filtration of W. Then Gt = σ(−(B1 − B_{1−s}) : 0 ≤ s ≤ t). By the independent increments property of B, Gt is independent of F_{1−t}. For s > t, Ws − Wt = −(B_{1−t} − B_{1−s}) ∈ F_{1−t}, and hence is independent of Gt.
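One can also check directly that W has the Brownian covariance: for 0 ≤ s ≤ t ≤ 1,

E[Ws Wt] = E[(B_{1−s} − B1)(B_{1−t} − B1)] = (1 − t) − (1 − s) − (1 − t) + 1 = s = s ∧ t,

which, together with Gaussianity and continuity, characterizes standard Brownian motion on [0, 1].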
17. a) Fix ε > 0 and ω such that X·(ω) has a sample path which is right continuous with left limits. Suppose there exist infinitely many jumps larger than ε, at times {sn}_{n≥1} ⊂ [0, t]. (If there are uncountably many such jumps, we can arbitrarily choose countably many of them.) Since [0, t] is compact, there exists a subsequence {s_{nk}}_{k≥1} converging to a cluster point s* ∈ [0, t]. Clearly we can take a further subsequence converging to s* monotonically, either from above or from below. To simplify notation, suppose ∃{sn}_{n≥1} ↑ s*. (The other case, {sn}_{n≥1} ↓ s*, is similar.) Since X has a left limit at s*, there exists δ > 0 such that s ∈ (s* − δ, s*) implies |Xs − X_{s*−}| < ε/3 and |X_{s−} − X_{s*−}| < ε/3. However, for sn ∈ (s* − δ, s*),

|X_{sn} − X_{s*−}| = |X_{sn−} − X_{s*−} + ΔX_{sn}| ≥ |ΔX_{sn}| − |X_{sn−} − X_{s*−}| > 2ε/3.   (22)

This is a contradiction, and the claim is shown.
b) By a), for each n there are only finitely many jumps of size larger than 1/n. But J = {s ∈ [0, t] : |ΔXs| > 0} = ∪_{n=1}^∞ {s ∈ [0, t] : |ΔXs| > 1/n}. We see that the cardinality of J is countable.
18. By the corollary to theorem 36 and theorem 37, we immediately see that J^ε and Z − J^ε are Lévy processes. By the Lévy–Khintchine formula (theorem 43), we see that ψ_{J^ε} ψ_{Z−J^ε} = ψ_Z. Thus J^ε and Z − J^ε are independent. (For an alternative rigorous solution that does not rely on the Lévy–Khintchine formula, see the proof of theorem 36.)
19. Let Tn = inf{t > 0 : |Xt| > n}. Then Tn is a stopping time. Let Sn = Tn 1_{{X0≤n}}. We have {Sn ≤ t} = {Tn ≤ t, X0 ≤ n} ∪ {X0 > n} = {Tn ≤ t} ∪ {X0 > n} ∈ Ft, so Sn is a stopping time. Since X is continuous, Sn → ∞ and |X^{Sn}| 1_{{Sn>0}} ≤ n for all n ≥ 1. Therefore X is locally bounded.
20. Let H be an arbitrary unbounded random variable (e.g. normal), and let T be a strictly positive random variable, independent of H, such that P(T ≥ t) > 0 for all t > 0 and P(T < ∞) = 1 (e.g. exponential). Define the process Zt = H1_{{t≥T}}. Then Z is a càdlàg process, adapted to its natural filtration, with Z0 = 0. Suppose there exists a sequence of stopping times Tn ↑ ∞ such that each Z^{Tn} is bounded by some Kn ∈ ℝ. Observe that

Z^{Tn}_t = H if Tn ∧ t ≥ T, and Z^{Tn}_t = 0 otherwise.   (23)
21. a) Let a = (1 − t)^{−1} ∫_t^1 Y(s) ds and let Mt = Y(ω)1_{(0,t)}(ω) + a 1_{[t,1)}(ω). For arbitrary B ∈ B([0,1]), {ω : Mt(ω) ∈ B} = ((0, t) ∩ {Y ∈ B}) ∪ ({a ∈ B} ∩ [t, 1)). Now (0, t) ∩ {Y ∈ B} ⊂ (0, t) and hence is in Ft, while {a ∈ B} ∩ [t, 1) is either [t, 1) or ∅, depending on B, and in either case is in Ft. Therefore Mt is adapted.

Pick A ∈ Ft. Suppose A ⊂ (0, t). Then clearly E(Mt : A) = E(Y : A). Suppose A ⊃ (t, 1). Then

E(Mt : A) = E(Y : A ∩ (0, t)) + E((1/(1 − t)) ∫_t^1 Y(s) ds : (t, 1))
= E(Y : A ∩ (0, t)) + ∫_t^1 Y(s) ds = E(Y : A ∩ (0, t)) + E(Y : (t, 1)) = E(Y : A).   (24)
c) From b), Mt(ω) = Y(ω)1_{(0,t)}(ω) + (1/(1 − α)) Y(t) 1_{[t,1)}(ω). Fix ω ∈ (0, 1). Since 0 < α < 1/2, Y/(1 − α) > Y and Y is increasing on (0, 1). Therefore

sup_{0<t<1} Mt(ω) = sup_{0<t≤ω} (Y(t)/(1 − α)) ∨ sup_{ω<t<1} Y(ω) = (Y(ω)/(1 − α)) ∨ Y(ω) = Y(ω)/(1 − α).   (25)

For each ω ∈ (0, 1), Mt(ω) = Y(ω) for all t > ω, and in particular M∞(ω) = Y(ω). Therefore

‖sup_t Mt‖_{L²} = (1/(1 − α)) ‖Y‖_{L²} = (1/(1 − α)) ‖M∞‖_{L²}.   (26)
22. a) By a simple computation, (1/(1 − t)) ∫_t^1 s^{−1/2} ds = 2/(1 + √t), and the claim clearly holds.
b) Since T is a stopping time, {T > ε} ∈ Fε, and hence {T > ε} ⊂ (0, ε] or {T > ε} ⊃ (ε, 1), for any ε > 0. Assume {T > ε} ⊂ (0, ε] for all ε > 0. Then T(ω) ≤ ε on (ε, 1) for all ε > 0, and it follows that T ≡ 0, a contradiction. Therefore there exists ε0 such that {T > ε0} ⊃ (ε0, 1). Fix ε ∈ (0, ε0). Then {T > ε} ⊃ {T > ε0} ⊃ (ε0, 1). On the other hand, since T is a stopping time, {T > ε} ⊂ (0, ε] or {T > ε} ⊃ (ε, 1); since {T > ε} ⊃ (ε0, 1) rules out the first alternative, we conclude that {T > ε} ⊃ (ε, 1) for all ε ∈ (0, ε0). For every ε ∈ (0, ε0) there exists δ > 0 such that ε − δ > 0 and {T > ε − δ} ⊃ (ε − δ, 1); in particular, T(ε) > ε − δ. Letting δ ↓ 0, we see that T(ε) ≥ ε on (0, ε0).
d) By a), |Mt(ω)| ≤ ω^{−1/2} ∨ 2, so M has bounded paths for each ω. If M_T 1_{{T>0}} were a bounded random variable, then M_T would be bounded as well, since M0 = 2. However, from c), M_T ∉ L² unless T ≡ 0 a.s.; hence M_T 1_{{T>0}} is unbounded.
23. Let M be a positive local martingale and {Tn} its fundamental sequence. Then, for t ≥ s ≥ 0, E(M^{Tn}_t 1_{{Tn>0}} | Fs) = M^{Tn}_s 1_{{Tn>0}}. Applying Fatou's lemma, E(Mt | Fs) ≤ Ms a.s. Therefore a positive local martingale is a supermartingale. By Doob's supermartingale convergence theorem, a positive supermartingale converges almost surely to M∞ ∈ L¹ and is closable. Then, by Doob's optional stopping theorem, E(M_T | F_S) ≤ M_S a.s. for all stopping times S ≤ T < ∞. If equality holds in the last inequality for all S ≤ T, then M is clearly a martingale, since deterministic times 0 ≤ s ≤ t are also stopping times. Therefore any positive local martingale that is not a true martingale provides the desired example. For a concrete example, see the example at the beginning of Section 5, Chapter 1 (p. 37).
Next we show G_{T−} ⊂ F¹. First, G0 ⊂ F¹, since Z^{T−}_0 = Z0. Fix t > 0. We want to show that for all A ∈ Gt, A ∩ {t < T} ∈ F¹. Let Λ = {A : A ∩ {t < T} ∈ F¹} and let

Π = {∩_{i=1}^n {Z_{si} ≤ xi} : n ∈ ℕ+, 0 ≤ si ≤ t, xi ∈ ℝ} ∪ N.   (27)

Then Π is a π-system and σ(Π) = Gt. Observe that N ⊂ F¹ and

(∩_{i=1}^n {Z_{si} ≤ xi}) ∩ {t < T} = (∩_{i=1}^n {Z^{T−}_{si} ≤ xi}) ∩ {t < T} ∈ F¹,   (28)

so Π ⊂ Λ. By Dynkin's theorem (the π–λ theorem), Gt = σ(Π) ⊂ Λ, and the claim is shown.
Observe that L is a λ-system (Dynkin system) and contains all null sets, since G∞ does. Let

C = {∩_{j=1}^n {Z_{tj} ∈ Bj} : n ∈ ℕ, tj ∈ [0, ∞), Bj ∈ B(ℝ)}.   (30)

Then C is a π-system such that σ(C) ∨ N = G∞. Therefore, by Dynkin's theorem, provided that C ⊂ L, we get σ(C) ⊂ L and thus G∞ ⊂ L. For arbitrary A ∈ G_T ⊂ G∞, 1_A = E[1_A | G_T] = E[1_A | H] ∈ H, and hence A ∈ H.
For this, let t0 = 0 and t_{n+1} = ∞, and write

E[∏_{j=1}^n 1_{{Z_{tj} ∈ Bj}} | G_T] = Σ_{k=1}^{n+1} 1_{{T∈[t_{k−1}, t_k)}} ∏_{j<k} 1_{{Z_{tj∧T} ∈ Bj}} E[∏_{j≥k} 1_{{Z_{tj∨T} ∈ Bj}} | G_T].

Let ξ := ∏_{j≥k} 1_{{Z_{(tj∨T)−T} ∈ Bj}} ∈ G∞. Then, by the strong Markov property of Z,

E[∏_{j≥k} 1_{{Z_{tj∨T} ∈ Bj}} | G_T] = E[ξ ∘ θ_T | G_T] = E_{Z_T}[ξ].   (32)

This verifies (31) and completes the proof. (This solution is due to Jason Swanson.)
c) Since G_{T−} ⊂ G_T and Z_T 1_{{T<∞}} ∈ G_T, we have σ(G_{T−}, Z_T) ⊂ G_T if T < ∞ a.s. To show the converse, observe that Z^T_t = Z^{T−}_t + ΔZ_T 1_{{t≥T}}. Since Z^{T−}_t, Z_{T−}, 1_{{t≥T}} ∈ G_{T−} for all t ≥ 0, Z^T_t ∈ σ(G_{T−}, Z_T) for all t ≥ 0. Therefore G_T = σ(G_{T−}, Z_T).
25. Let Z be a Lévy process. By definition, a Lévy process is continuous in probability, i.e. for all t, lim_n P(|Zt − Z_{t−1/n}| > ε) = 0. For all ε > 0 and t > 0, {|ΔZt| > ε} ⊂ ∪_m ∩_{n≥m} {|Zt − Z_{t−1/n}| > ε/2}. Therefore P(|ΔZt| > ε) ≤ lim inf_n P(|Zt − Z_{t−1/n}| > ε/2) = 0 for every ε > 0, and hence P(ΔZt = 0) = 1.
26. To apply the results of exercises 24 and 25, we first show the following almost trivial lemma.

Lemma. Let T be a stopping time and t ∈ ℝ+. If T ≡ t, then G_T = G_t and G_{T−} = G_{t−}.

Fix t > 0. By exercise 25, Zt = Z_{t−} a.s. By exercise 24(d), G_t = G_{t−}, since t is a bounded stopping time.
28. Observe that the equation in theorem 38 depends only on the existence of a sequence of simple functions approximating f1_Λ ≥ 0 and on the convergence of both sides of E{Σ_j aj N^{Λj}_t} = t Σ_j aj ν(Λj). For this, f1_Λ ∈ L¹ is enough. (Note that we need f1_Λ ∈ L² to show the second equation.)
29. Let M be a Lévy process and a local martingale. M has a representation of the form

Mt = Bt + ∫_{|x|≤1} x [Nt(·, dx) − tν(dx)] + αt + ∫_{|x|>1} x Nt(·, dx).   (34)

The first two terms are martingales. Therefore, without loss of generality, we can assume that

Mt = αt + ∫_{|x|>1} x Nt(·, dx).   (35)

Mt has only finitely many jumps on each interval [0, t], by exercise 17(a). Let {Tn}_{n≥1} be the sequence of jump times of M. Then P(Tn < ∞) = 1 for all n and Tn ↑ ∞. We can express M as the sum of a compound Poisson process and a drift term (see the example on p. 33):

Mt = Σ_{i=1}^∞ Ui 1_{{t≥Ti}} − αt.   (36)

By exercise 24,

Therefore,

E|M_{Sn∧T1}| = E|U1 1_{{Sn≥T1}} − α(Sn ∧ T1)| ≥ E|U1 1_{{Sn≥T1}}| − αE|Sn ∧ T1|
= E[E(|U1| 1_{{Sn≥T1}} | σ(T1) ∨ N)] − αE|Sn ∧ T1|
= E[1_{{Sn≥T1}} E(|U1| | σ(T1) ∨ N)] − αE|Sn ∧ T1|   (39)
≥ E|U1| P(Sn ≥ T1) − αET1 = ∞.
30. Let Tz = inf{s > 0 : Zs ≥ z}. Then Tz is a stopping time and P(Tz < ∞) = 1. Define a coordinate map by ω(t) = Zt(ω), and let R = inf{s < t : Zs ≥ z}. We let

Ys(ω) = 1 if s ≤ t and ω(t − s) < z − y, and 0 otherwise;
Y′s(ω) = 1 if s ≤ t and ω(t − s) > z + y, and 0 otherwise,   (40)

so that

Y_R ∘ θ_R(ω) = 1 if R ≤ t and Zt < z − y, and 0 otherwise;
Y′_R ∘ θ_R(ω) = 1 if R ≤ t and Zt > z + y, and 0 otherwise.   (41)

By taking expectations,
Chapter 2. Semimartingales and Stochastic Integrals
1. Let x0 ∈ ℝ be a point of discontinuity of f. Without loss of generality, we can assume that f is right continuous with left limits and that Δf(x0) = d > 0. Since inf{t : Bt = x0} < ∞ a.s., by the strong Markov property of B we can assume x0 = 0. Almost every Brownian path has no point of decrease (or increase) and is continuous, so B visits x0 = 0 and changes sign infinitely many times on [0, ε] for every ε > 0. Therefore Yt = f(Bt) has infinitely many jumps (each of magnitude on the order of d) on any compact interval, almost surely. Therefore

Σ_{s≤t} (ΔYs)² = ∞ a.s.
4. Assume that f(0) = 0. Let Mt = B_{f(t)} and Gt = F_{f(t)}, where B is a one-dimensional standard Brownian motion and {Ft} is its corresponding filtration. Then

E[Mt | Gs] = E[B_{f(t)} | F_{f(s)}] = B_{f(s)} = Ms.

If f(0) > 0, then we only need to add a constant process At to B_{f(t)}, such that 2At M0 + A²t = −B²_{f(0)} for each ω, to get the desired result.
6. Pick an arbitrary t0 > 1. Let X^n_t = 1_{(t0−1/n, ∞)}(t) for n ≥ 1, and let Xt = 1_{[t0,∞)}(t), Yt = 1_{[t0,∞)}(t). Then X^n, X, Y are finite variation processes and semimartingales, and lim_n X^n_t = Xt almost surely. But

lim_n [X^n, Y]_{t0} = 0 ≠ 1 = [X, Y]_{t0}.
7. Observe that

[X^n, Z] = [H^n · Y, Z] = H^n · [Y, Z],   [X, Z] = [H · Y, Z] = H · [Y, Z],

and [Y, Z] is a semimartingale. Then, by the continuity of the stochastic integral, H^n → H in ucp implies H^n · [Y, Z] → H · [Y, Z], and hence [X^n, Z] → [X, Z] in ucp.

Therefore,

[X, X] = [M, M] + 2[M, A] + [A, A] = [M, M].
10. X² is a P-semimartingale, and hence a Q-semimartingale by theorem 2. By the corollary of theorem 15, (X− · X)^Q is indistinguishable from (X− · X)^P. Then, by the definition of quadratic variation,

[X, X]^P = X² − 2(X− · X)^P = X² − 2(X− · X)^Q = [X, X]^Q

up to an evanescent set.
11. a) Λ = [−2, 1] is a closed set and B has continuous paths, so by theorem 4 of chapter 1, T(ω) = inf{t : Bt ∉ Λ} is a stopping time.

b) Mt is a uniformly integrable martingale, since Mt = E[B_T | Ft]; clearly M is continuous, and N is a continuous martingale as well. By theorem 23, [M, M]_t = [B, B]^T_t = t ∧ T and [N, N]_t = [−B, −B]^T_t = t ∧ T. Thus [M, M] = [N, N]. However, P(Mt > 1) = 0 ≠ P(Nt > 1), so M and N do not have the same law.
12. Fix t ∈ ℝ+ and ω ∈ Ω. Without loss of generality, we can assume that X(ω) has a càdlàg path and that Σ_{s≤t} |ΔXs(ω)| < ∞. Then on [0, t] the continuous part of X is bounded by continuity, and the jump part of X is bounded by hypothesis, so {Xs}_{s≤t} is bounded. Let K = [inf_{s≤t} Xs(ω), sup_{s≤t} Xs(ω)] ⊂ ℝ. Then f, f′, f″ are bounded on K. Since Σ_{s≤t} {f(Xs) − f(X_{s−}) − f′(X_{s−})ΔXs} is an absolutely convergent series (see the proof of Itô's formula, theorem 32), it suffices to show that Σ_{s≤t} |f′(X_{s−})ΔXs| < ∞. By hypothesis,

Σ_{s≤t} |f′(X_{s−})ΔXs| ≤ sup_{x∈K} |f′(x)| Σ_{s≤t} |ΔXs| < ∞.

Since this is true for all t ∈ ℝ+ and almost all ω ∈ Ω, the claim is shown.
13. By definition, the stochastic integral is a continuous linear mapping J_X : S_ucp → D_ucp (Section 4). By continuity, H^n → H in the ucp topology implies H^n · X → H · X in the ucp topology.
14. Let  = 1 + A, to simplify notation. Fix ω ∈ Ω and let Z = Â^{−1} · X. Then Z∞(ω) < ∞ by hypothesis, and  · Z = X. Applying integration by parts to X =  · Z and then dividing both sides by  yields

Xt/Ât = Zt − (1/Ât) ∫_0^t Zs dÂs.

Since Z∞ = lim_{t→∞} Zt exists and is finite, for any ε > 0 there exists τ such that |Zt − Z∞| < ε for all t ≥ τ. Then, for t > τ,

(1/Ât) ∫_0^t Zs dÂs = (1/Ât) ∫_0^τ Zs dÂs + (1/Ât) ∫_τ^t (Zs − Z∞) dÂs + Z∞ (Ât − Âτ)/Ât.

Let us evaluate the right-hand side. As t → ∞, the first term goes to 0, while the last term converges to Z∞ by hypothesis. By the construction of τ, the second term is smaller in absolute value than ε(Ât − Âτ)/Ât, which converges to ε. Since this is true for all ε > 0,

lim_{t→∞} (1/Ât) ∫_0^t Zs dÂs = Z∞,

and therefore Xt/(1 + At) = Zt − (1/Ât) ∫_0^t Zs dÂs → Z∞ − Z∞ = 0 as t → ∞.
15. If M0 = 0, then by theorem 42 there exists a Brownian motion B such that Mt = B_{[M,M]_t}. By the law of the iterated logarithm,

lim_{t→∞} Mt/[M, M]_t = lim_{t→∞} B_{[M,M]_t}/[M, M]_t = lim_{τ→∞} Bτ/τ = 0.
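(The form of the law of the iterated logarithm used here, for a standard Brownian motion B: lim sup_{τ→∞} Bτ/√(2τ log log τ) = 1 a.s., and by symmetry the corresponding lim inf is −1, so Bτ/τ → 0 a.s. The last step also uses the assumption, implicit in the statement, that [M, M]_t → ∞ as t → ∞.)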
17. Since the integral is taken over [t − 1/n, t] and Y is adapted, X^n_t is an adapted process as well. For t > s ≥ 1/n such that |t − s| ≤ 1/n,

|X^n_t − X^n_s| = n |∫_s^t Yτ dτ − ∫_{s−1/n}^{t−1/n} Yτ dτ| ≤ 2n(t − s) sup_{s−1/n ≤ τ ≤ t} |Yτ|.

Since X^n is constant on [0, t] and [1, ∞), let π be a partition of [t, 1] and let M = sup_{0 ≤ τ ≤ 1} |Yτ| < ∞. Then, for each n, the total variation of X^n is finite, since

Σ_π |X^n_{t_{i+1}} − X^n_{t_i}| ≤ Σ_π 2nM(t_{i+1} − t_i) = 2nM(1 − t).
18. B has continuous paths almost surely. Fix ω such that B·(ω) is continuous. Then lim_{n→∞} X^n_t(ω) = Bt(ω) by exercise 17. By theorem 37, the solution of dZ^n_s = Z^n_{s−} dX^n_s, Z^n_0 = 1, is

Z^n_t = exp(X^n_t − ½[X^n, X^n]_t) = exp(X^n_t),

since X^n is a continuous finite variation process. On the other hand, Zt = exp(Bt − t/2), and hence lim_n Z^n_t = exp(Bt) ≠ Zt: the limit of the solutions is not the solution driven by the limit.
19. A^n is clearly càdlàg and adapted. By the periodicity and symmetry of the sine function,

∫_0^{π/2} |dA^n_s| = (1/n) ∫_0^{π/2} |d sin(ns)| = (1/n) · n ∫_0^{π/2} d sin(s) = 1.
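Note that, at the same time, sup_s |A^n_s| ≤ 1/n → 0, so (assuming, as the computation above suggests, A^n_t = (1/n) sin(nt)) A^n converges to 0 uniformly while its total variation stays equal to 1: convergence in total variation is strictly stronger than uniform convergence.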
20. Applying Itô's formula to u ∈ C²(ℝ³ − {0}) and Bt ∈ ℝ³\{0} for all t a.s.,

u(Bt) = u(x) + ∫_0^t ∇u(Bs) · dBs + ½ ∫_0^t Δu(Bs) ds = 1/‖x‖ + Σ_{i=1}^3 ∫_0^t ∂i u(Bs) dB^i_s,

since u is harmonic away from the origin, and Mt = u(Bt) is a local martingale. This solves (a). Fix 1 ≤ α ≤ 3 and observe that E(u(B0)^α) < ∞. Let p, q be positive numbers such that 1/p + 1/q = 1. Then

E^x(u(Bt)^α) = ∫_{ℝ³} (1/‖y‖^α) (2πt)^{−3/2} e^{−‖x−y‖²/(2t)} dy
= ∫_{B(0;1)∩B^c(x;δ)} (1/‖y‖^α) (2πt)^{−3/2} e^{−‖x−y‖²/(2t)} dy + ∫_{ℝ³\{B(0;1)∩B^c(x;δ)}} (1/‖y‖^α) (2πt)^{−3/2} e^{−‖x−y‖²/(2t)} dy
≤ sup_{y∈B(0;1)∩B^c(x;δ)} [(2πt)^{−3/2} e^{−‖x−y‖²/(2t)}] · ∫_{B(0;1)} (1/‖y‖^α) dy
+ (∫_{ℝ³\{B(0;1)∩B^c(x;δ)}} (1/‖y‖^{αp}) dy)^{1/p} (∫_{ℝ³\{B(0;1)∩B^c(x;δ)}} ((2πt)^{−3/2} e^{−‖x−y‖²/(2t)})^q dy)^{1/q}.

Pick p > 3/α > 1. Then the first term goes to 0 as t → ∞; in particular it is finite for all t ≥ 0. The first factor of the second term is finite, while the second factor goes to 0 as t → ∞, since

∫_{ℝ³} (2πt)^{−3q/2} e^{−q‖x−y‖²/(2t)} dy = (2πt)^{−3q/2} (2πt/q)^{3/2} = (2πt)^{−3(q−1)/2} q^{−3/2},

which is finite for all t > 0 and tends to 0 as t → ∞. (b) is shown by taking α = 1, and (c) by taking α = 2. It also shows that

lim_{t→∞} E^x(u(Bt)²) = 0.
22. We claim that, for all integers k ≥ 1,

(A∞ − As)^k = k! ∫_s^∞ ∫_{s1}^∞ · · · ∫_{s_{k−1}}^∞ dA_{s_k} · · · dA_{s_2} dA_{s_1},   (48)

and prove it by induction. For k = 1, equation (48) clearly holds. Assume that it holds for k = n. Then

(n + 1)! ∫_s^∞ ∫_{s1}^∞ · · · ∫_{s_n}^∞ dA_{s_{n+1}} · · · dA_{s_2} dA_{s_1} = (n + 1) ∫_s^∞ (A∞ − A_{s1})^n dA_{s1} = (A∞ − As)^{n+1},   (49)

where the last equality uses the continuity of A.
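The last equality in (49) is the change of variables available because A is continuous: d[−(A∞ − Au)^{n+1}] = (n + 1)(A∞ − Au)^n dAu, so that ∫_s^∞ (A∞ − Au)^n dAu = (A∞ − As)^{n+1}/(n + 1).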
24. a)

∫_0^t (1/(1 − s)) dβs = ∫_0^t (1/(1 − s)) dBs − ∫_0^t ((B1 − Bs)/(1 − s)) (1/(1 − s)) ds
= ∫_0^t (1/(1 − s)) dBs − B1 ∫_0^t (1/(1 − s)²) ds + ∫_0^t (Bs/(1 − s)²) ds
= ∫_0^t (1/(1 − s)) dBs − B1 ∫_0^t (1/(1 − s)²) ds + ∫_0^t Bs d(1/(1 − s))
= Bt/(1 − t) − B1 (1/(1 − t) − 1)
= (Bt − B1)/(1 − t) + B1,

where the fourth equality uses integration by parts, ∫_0^t Bs d(1/(1 − s)) = Bt/(1 − t) − ∫_0^t (1/(1 − s)) dBs (the bracket [1/(1 − ·), B]_t vanishes, since 1/(1 − s) is continuous and of finite variation). Rearranging terms, we have the desired result.
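Equivalently, rearranging the identity just obtained gives Bt = (1 − t) ∫_0^t (1 − s)^{−1} dβs + tB1, exhibiting B on [0, 1) as the usual Brownian bridge-type construction driven by the Brownian motion β.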
b) Using integration by parts, since [1 − s, ∫_0^s (1 − u)^{−1} dβu]_t = 0,

Xt = (1 − t) ∫_0^t (1/(1 − s)) dβs = ∫_0^t (1 − s)(1/(1 − s)) dβs + ∫_0^t (∫_0^s (1/(1 − u)) dβu) (−1) ds
= βt − ∫_0^t (Xs/(1 − s)) ds.
d(e^{αt} Xt) = αe^{αt} Xt dt + e^{αt} dXt = αe^{αt} Xt dt + e^{αt} (−αXt dt + σ dBt) = σe^{αt} dBt.
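Integrating this identity from 0 to t and multiplying by e^{−αt} gives the explicit Ornstein–Uhlenbeck solution (assuming, as the computation indicates, the underlying equation dXt = −αXt dt + σ dBt):

Xt = e^{−αt} X0 + σ ∫_0^t e^{−α(t−s)} dBs.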
28. By the law of the iterated logarithm, lim_{t→∞} Bt/t = 0 a.s. In particular, for any ε ∈ (0, 1/2) and almost every ω there exists t0(ω) such that t > t0(ω) implies Bt(ω)/t < 1/2 − ε. Then

lim_{t→∞} E(B)t = lim_{t→∞} exp[t(Bt/t − 1/2)] ≤ lim_{t→∞} e^{−εt} = 0 a.s.
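(Here E(B)t = exp(Bt − t/2) is the stochastic (Doléans-Dade) exponential of B, which is what permits writing E(B)t = exp[t(Bt/t − 1/2)] above.)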
29. E(X)^{−1} = E(−X + [X, X]) by the corollary of theorem 38. This implies that E(X)^{−1} is the solution of the stochastic differential equation

E(X)^{−1}_t = 1 + ∫_0^t E(X)^{−1}_{s−} d(−Xs + [X, X]s),

which is the desired result. Note that the continuity assumed in the corollary is not necessary if we instead assume ΔXs ≠ −1, so that E(X)^{−1} is well defined.
30. a) By Itô's formula and the continuity of M, Mt = 1 + ∫_0^t Ms dBs. B is a locally square integrable local martingale and M ∈ L, so by theorem 20, M is also a locally square integrable local martingale.
b)

[M, M]_t = ∫_0^t Ms² ds = ∫_0^t e^{2Bs − s} ds.
c) Ee^{Bt} is calculated above using the density function. Alternatively, using the result of b),

Ee^{Bt} = E(Mt e^{t/2}) = e^{t/2} EM0 = e^{t/2}.
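For comparison, the direct computation via the density: with Bt ~ N(0, t),

Ee^{Bt} = ∫_{−∞}^∞ e^x (2πt)^{−1/2} e^{−x²/(2t)} dx = e^{t/2} ∫_{−∞}^∞ (2πt)^{−1/2} e^{−(x−t)²/(2t)} dx = e^{t/2},

completing the square x − x²/(2t) = t/2 − (x − t)²/(2t).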
31. Pick A ∈ F such that P(A) = 0 and fix t ≥ 0. Then A ∩ {Rt ≤ s} ∈ Fs, since Fs contains all P-null sets; hence A ∈ Gt = F_{Rt}, and {Gt} is complete. If tn ↓ t, then by the right continuity of R, R_{tn} ↓ Rt, and ∩_n G_{tn} = ∩_n F_{R_{tn}} = F_{Rt} = Gt by the right continuity of {Ft} and exercise 4, chapter 1. Thus {Gt} satisfies the usual hypotheses.
32. If M has càdlàg paths and R is right continuous, then M̄t = M_{Rt} has càdlàg paths. M̄t = M_{Rt} ∈ F_{Rt} = Gt, so M̄ is adapted to {Gt}. For all 0 ≤ s ≤ t, Rs ≤ Rt < ∞. Since M is a uniformly integrable martingale, the optional sampling theorem yields E[M̄t | Gs] = E[M_{Rt} | F_{Rs}] = M_{Rs} = M̄s. So M̄ is a G-martingale.
Chapter 3. Semimartingales and Decomposable Processes
1. Let {Ti}_{i=1}^n be a finite set of predictable stopping times. For each i, Ti has an announcing sequence {T_{i,j}}_{j=1}^∞. Let Sk := max_i T_{i,k} and Rk := min_i T_{i,k}. Then Sk and Rk are stopping times, and {Sk} and {Rk} are announcing sequences for the maximum and the minimum of {Ti}_i, respectively.
3. Let S, T be two predictable stopping times. Then S ∧ T and S ∨ T are predictable stopping times, as shown in exercise 1. In addition, Λ = {S ∨ T = S ∧ T}. Therefore, without loss of generality, we can assume that S ≤ T. Let {Tn} be an announcing sequence of T and let Rn = Tn + n1_{{Tn≥S}}. Then Rn is a stopping time, since {Rn ≤ t} = ({Tn ≤ t} ∩ {Tn < S}) ∪ ({Tn + n ≤ t} ∩ {Tn ≥ S}) ∈ Ft. The sequence {Rn} is increasing and lim Rn = T_Λ = S_Λ.
4. For each X ∈ L, define a new process X^n by X^n = Σ_{k∈ℕ} X_{k/2^n} 1_{[k/2^n, (k+1)/2^n)}. Since each summand is an optional process (see exercise 6), X^n is an optional process. As a mapping on the product space, X is a pointwise limit of the X^n, so X is optional. Then, by the definition, P ⊂ O.
5. It suffices to show that all càdlàg adapted processes are progressively measurable (then we can apply a monotone class argument). For a càdlàg process X on [0, t], define a new process X^n by putting X^n_u = X_{kt/2^n} for u ∈ [(k−1)t/2^n, kt/2^n), k ∈ {1, . . . , 2^n}. Then, on Ω × [0, t],

{X^n ∈ B} = ∪_k ({ω : X_{kt/2^n}(ω) ∈ B} × [(k−1)t/2^n, kt/2^n)) ∈ Ft ⊗ B([0, t]),   (50)

and X^n → X pointwise by right continuity, so X restricted to Ω × [0, t] is Ft ⊗ B([0, t])-measurable.
7. Pick a set A ∈ F_{Sn}. Then A = A ∩ {Sn < T} ∈ F_{T−}, by theorem 7 and the definition of {Sn}_n. (Note: the proof of theorem 7 does not require theorem 6.) Since this is true for all n, ∨_n F_{Sn} ⊂ F_{T−}. To show the converse, let Π = {B ∩ {t < T} : B ∈ Ft}. Then Π is closed under finite intersections, and B ∩ {t < T} = ∪_n (B ∩ {t < Sn}). Since (B ∩ {t < Sn}) ∩ {Sn ≤ t} = ∅ ∈ Ft, we get B ∩ {t < Sn} ∈ F_{Sn}. Therefore B ∩ {t < T} ∈ ∨_n F_{Sn} and Π ⊂ ∨_n F_{Sn}. Then, by Dynkin's theorem, F_{T−} ⊂ ∨_n F_{Sn}.
8. Let {Sn} be a sequence of stopping times increasing to T, with Sn ≤ T for all n. Then F_{Sn} ⊂ F_T and ∨_n F_{Sn} ⊂ F_T. By the same argument as in exercise 7, F_{T−} ⊂ ∨_n F_{Sn}. Since F_{T−} = F_T by hypothesis, we have the desired result. (Note: {Sn} is not necessarily an announcing sequence, since Sn = T is possible; therefore ∨_n F_{Sn} ≠ F_{T−} in general.)
9. Let X be a Lévy process, G its completed natural filtration, and T a stopping time. By exercise 24(c) of chapter 1, G_T = σ(G_{T−}, X_T). Since X jumps only at totally inaccessible times (a consequence of theorem 4), X_T = X_{T−} a.s. for every predictable stopping time T. Therefore, if T is a predictable time, G_T = σ(G_{T−}, X_T) = σ(G_{T−}, X_{T−}) = G_{T−}, since X_{T−} ∈ G_{T−}. Therefore the completed natural filtration of a Lévy process is quasi-left-continuous.
10. As given in the hint, [M, A] is a local martingale. Let {Tn} be a localizing sequence, that is, [M, A]^{Tn} is a martingale for each n. Then E([X, X]^{Tn}_t) = E([M, M]^{Tn}_t) + E([A, A]^{Tn}_t). Since quadratic variations are non-decreasing processes, letting n → ∞ and applying monotone convergence yields E([X, X]_t) = E([M, M]_t) + E([A, A]_t).
11. By the Kunita–Watanabe inequality for square bracket processes, [X + Y, X + Y]^{1/2} ≤ [X, X]^{1/2} + [Y, Y]^{1/2}. It follows that [X + Y, X + Y] ≤ 2([X, X] + [Y, Y]). This implies that [X + Y, X + Y] is locally integrable, so the sharp bracket process ⟨X + Y, X + Y⟩ exists. Since the sharp bracket process is the compensator (predictable projection) of the square bracket process, we obtain the polarization identity for sharp bracket processes from the one for square bracket processes, namely

⟨X, Y⟩ = ½ (⟨X + Y, X + Y⟩ − ⟨X, X⟩ − ⟨Y, Y⟩).   (51)

Then the rest of the proof is exactly the same as that of theorem 25, chapter 2, except that we replace square bracket processes by sharp bracket processes.
12. Since a continuous finite variation process always has locally integrable variation, without loss of generality we can assume that the value of A changes only by jumps; thus A can be represented as At = Σ_{s≤t} ΔAs. Assume that C· = ∫_0^· |dAs| is predictable. Then Tn = inf{t > 0 : Ct ≥ n} is a predictable time, since it is the debut of a right-closed predictable set. Let {T_{n,m}}_m be an announcing sequence of Tn for each n, and define Sn = sup_{1≤k≤n} T_{k,n}. Then {Sn} is a sequence of stopping times increasing to ∞ with Sn < Tn, and hence C_{Sn} ≤ n. Thus EC_{Sn} ≤ n and C is locally integrable. To prove that C is predictable, we introduce two standard results.

Lemma. Suppose that Γ is the union of the graphs of a sequence of predictable times. Then there exists a sequence of predictable times {Tn} such that Γ ⊂ ∪_n [Tn] and [Tn] ∩ [Tm] = ∅ for n ≠ m.

Proof. Let {Sn}_n be a sequence of predictable stopping times such that Γ ⊂ ∪_n [Sn]. Put T1 = S1 and, for n ≥ 2, Bn = ∩_{k=1}^{n−1} {Sk ≠ Sn} and Tn = (Sn)_{Bn}. Then Bn ∈ F_{Sn−}, Tn is predictable, [Tn] ∩ [Tm] = ∅ when n ≠ m, and Γ ⊂ ∪_{n≥1} [Tn]. (Note: by the definition of the graph, [Tn] ∩ [Tm] = ∅ even if P(Tn = Tm = ∞) > 0, as long as [Tn] and [Tm] are disjoint in Ω × ℝ+.)
Lemma. Let X be a càdlàg, adapted, predictable process. Then there exists a sequence of strictly positive predictable times {Tn} such that {ΔX ≠ 0} ⊂ ∪_n [Tn].

Proof. Let S^{1/k}_{n+1} = inf{t > S^{1/k}_n : |X_{S^{1/k}_n} − Xt| > 1/k or |X_{S^{1/k}_n} − X_{t−}| > 1/k}. Since X is predictable, one can show by induction that each S^{1/k}_n is a predictable stopping time. In addition, {ΔX ≠ 0} ⊂ ∪_{n,k≥1} [S^{1/k}_n]. Then, by the previous lemma, there exists a sequence of predictable stopping times {Tn} such that ∪_{n,k≥1} [S^{1/k}_n] ⊂ ∪_n [Tn], and hence {ΔX ≠ 0} ⊂ ∪_n [Tn].
Proof of the main claim: Combining the two lemmas, we see that {ΔA ≠ 0} is contained in a union of a sequence of disjoint graphs of predictable stopping times. Since At = Σ_{s≤t} ΔAs converges absolutely for each ω, the sum is invariant under changes of the order of summation. Therefore At = Σ_n ΔA_{Sn} 1_{{Sn≤t}}, where each Sn is a predictable time. Each ΔA_{Sn} 1_{{Sn≤t}} is a predictable process, since Sn is predictable and ΔA_{Sn} ∈ F_{Sn−}. This clearly implies that each |ΔA_{Sn}| 1_{{Sn≤t}} is predictable, and then C is predictable as well.
Note: as for the second lemma, the following more general claim holds.

Lemma. Let X be a càdlàg adapted process. Then X is predictable if and only if X satisfies the following two conditions: (1) there exists a sequence of strictly positive predictable times {Tn} such that {ΔX ≠ 0} ⊂ ∪_n [Tn]; (2) for each predictable time T, X_T 1_{{T<∞}} ∈ F_{T−}.
13. Let {Ti}_i be the jump times of the counting process, and define Si = Ti − T_{i−1}. Then, by the corollary to theorem 23, the compensator A is given by

At = Σ_{i≥1} [Σ_{j=1}^i φj(Sj) + φ_{i+1}(t − Ti)] 1_{{Ti ≤ t < T_{i+1}}},   φi(s) = ∫_0^s (−1/Fi(u−)) dFi(u),   (52)

where Fi(u) = P(Si > u). For each ω, it is clear that if each Fi has a density, so that dFi(u) = fi(u) du, then At is absolutely continuous. Conversely, if some Fk does not admit a density, then on [T_{k−1}, Tk), At is not absolutely continuous.
15. By the uniqueness of the Doob–Meyer decomposition, it suffices to show that Nt − µλt is a martingale, since µλt is clearly a predictable process with finite variation. Let Ct be the Poisson process associated with Nt. Then, by the independent and stationary increments property of the compound Poisson process,

E[Nt − Ns | Fs] = E[Σ_{i=1}^{C_{t−s}} Ui] = Σ_{k=1}^∞ E[Σ_{i=1}^{C_{t−s}} Ui | C_{t−s} = k] P(C_{t−s} = k) = µ Σ_{k=1}^∞ k P(C_{t−s} = k) = µλ(t − s).   (53)

Hence E[Nt − µλt | Fs] = Ns − µλs, and Nt − µλt is a martingale.
18. There exists a disjoint sequence of predictable times {Tn} such that A = {t > 0 : Δ⟨M, M⟩t ≠ 0} ⊂ ∪_n [Tn] (see the discussion in the solution of exercise 12 for details); in particular, every jump time of ⟨M, M⟩ is a predictable time. Let T be a predictable time such that Δ⟨M, M⟩_T ≠ 0, and let N = [M, M] − ⟨M, M⟩. Then N is a martingale with finite variation, since ⟨M, M⟩ is the compensator of [M, M]. Since T is predictable, E[N_T | F_{T−}] = N_{T−}. On the other hand, since {Ft} is a quasi-left-continuous filtration, N_{T−} = E[N_T | F_{T−}] = E[N_T | F_T] = N_T. This implies that Δ⟨M, M⟩_T = Δ[M, M]_T = (ΔM_T)². Recall that M itself is a martingale, so M_T = E[M_T | F_T] = E[M_T | F_{T−}] = M_{T−} and ΔM_T = 0. Therefore Δ⟨M, M⟩_T = 0, and ⟨M, M⟩ is continuous.
19. By theorem 36, X is a special semimartingale. Then by theorem 34 it has a unique decomposition X = M + A, such that M is a local martingale and A is a predictable finite variation process. Let X = N + C be an arbitrary decomposition of X. Then M − N = C − A, which implies that A is the compensator of C. It suffices to show that a local martingale with finite variation has locally integrable variation. Set Y = M − N and Zt = ∫_0^t |dYs|. Let {Sn} be a fundamental sequence of Y and set Tn = Sn ∧ n ∧ inf{t : Zt > n}. Then Y_{Tn} ∈ L¹ (see the proof of theorem 38) and Z_{Tn} ≤ n + |Y_{Tn}| ∈ L¹. Thus Y = M − N has locally integrable variation. Then C is the sum of two processes with locally integrable variation, and the claim holds.
20. Let {Tn} be an increasing sequence of stopping times such that each X^{Tn} is a special semimartingale, as in the statement. Then, by theorem 37, (X^{Tn})*_t = sup_{s≤t} |X^{Tn}_s| is locally integrable; namely, there exists an increasing sequence of stopping times {Rn} such that ((X^{Tn})*)^{Rn} is integrable. Let Sn = Tn ∧ Rn. Then {Sn} is an increasing sequence of stopping times such that (X*)^{Sn} is integrable. Then X* is locally integrable, and by theorem 37, X is a special semimartingale.
21. Since Q ∼ P, dQ/dP > 0 and Z > 0. Clearly M ∈ L¹(Q) if and only if MZ ∈ L¹(P). By the generalized Bayes formula,

E_Q[Mt | Fs] = E_P[Mt Zt | Fs] / E_P[Zt | Fs] = E_P[Mt Zt | Fs] / Zs,   t ≥ s.   (57)

Thus E_P[Mt Zt | Fs] = Ms Zs if and only if E_Q[Mt | Fs] = Ms.
Yt = E[Xt | Ft] = E[Mt | Ft] + E[At | Ft]. Since E[E[Mt | Ft] | Fs] = E[Mt | Fs] = E[E[Mt | Gs] | Fs] = E[Ms | Fs], the process E[Mt | Ft] is an {Ft}-martingale. Therefore

Var_τ(Y) = E[Σ_{i=1}^n |E[E[A_{ti} | F_{ti}] − E[A_{ti+1} | F_{ti+1}] | F_{ti}]|] = E[Σ_{i=1}^n |E[A_{ti} − A_{ti+1} | F_{ti}]|].   (59)

For every τ and ti, E[|E[A_{ti} − A_{ti+1} | F_{ti}]|] ≤ E[|E[A_{ti} − A_{ti+1} | G_{ti}]|]. Therefore Var(X) < ∞ (w.r.t. {Gt}) implies Var(Y) < ∞ (w.r.t. {Ft}), and Y is an ({Ft}, P)-quasimartingale.
Lemma. Let N be a local martingale. If E([N, N]^{1/2}_∞) < ∞, or alternatively N ∈ H¹, then N is a uniformly integrable martingale.

Once we accept this lemma, it suffices to show that the local martingale [A, M] is in H¹. By the Kunita–Watanabe inequality, [A, M]^{1/2}_∞ ≤ [A, A]^{1/4}_∞ [M, M]^{1/4}_∞. Then, by Hölder's inequality,

E([A, M]^{1/2}_∞) ≤ (E[A, A]^{1/2}_∞)^{1/2} (E[M, M]^{1/2}_∞)^{1/2}.   (61)

By hypothesis, (E[A, A]^{1/2}_∞)^{1/2} < ∞. By the BDG inequality and the fact that M is a bounded martingale, E([M, M]^{1/2}_∞) ≤ c1 E[M*_∞] < ∞ for some positive constant c1. This completes the proof.
24. Since we assume that the usual hypotheses hold throughout this book (see page 3), let F′t = σ(T ∧ s : s ≤ t) and redefine Ft by Ft = ∩_ε F′_{t+ε} ∨ N. Since {T < t} = {T ∧ t < t} ∈ Ft, T is an F-stopping time. Let G = {Gt} be the smallest filtration such that T is a stopping time, that is, the natural filtration of the process Xt = 1_{{T≤t}}. Then G ⊂ F.

For the converse, observe that {T ∧ s ∈ B} = ({T ≤ s} ∩ {T ∈ B}) ∪ ({T > s} ∩ {s ∈ B}), where {s ∈ B} is ∅ or Ω, {T ≤ s} ∩ {T ∈ B} ∈ Gs, and {T > s} ∈ Gs. Therefore, for all t, T ∧ s (for every s ≤ t) is Gt-measurable. Hence F′t ⊂ Gt, which shows F ⊂ G, and therefore F = G. (Note: we assume that G satisfies the usual hypotheses as well.)
25. Recall that {(ω, t) : ΔAt(ω) ≠ 0} is contained in a union of graphs of a sequence of disjoint predictable times; in particular, we can assume that a jump time of a predictable process is a predictable time (see the discussion in the solution of exercise 12). For any predictable time T such that E[ΔZ_T | F_{T−}] = 0,
26. Without loss of generality, we can assume that A0 = 0, since E[ΔA0 | F_{0−}] = E[ΔA0 | F0] = ΔA0 = 0. For any finite-valued stopping time S, E[A_S] ≤ E[A_∞], since A is an increasing process. Observe that A∞ ∈ L¹ because A is a process with integrable variation. Therefore {A_S}_S is uniformly integrable and A is of class (D). Applying theorem 11 (the Doob–Meyer decomposition) to −A, we see that M = A − Ã is a uniformly integrable martingale. Then, for any predictable time T,

0 = E[ΔM_T | F_{T−}] = E[ΔA_T | F_{T−}] − E[ΔÃ_T | F_{T−}] = 0 − E[ΔÃ_T | F_{T−}] a.s.   (63)

Since Ã is predictable, ΔÃ_T ∈ F_{T−}, and hence ΔÃ_T = E[ΔÃ_T | F_{T−}] = 0 a.s.
27. Assume A is continuous. Consider an arbitrary increasing sequence of stopping times {Tn} ↑ T, where T is a finite stopping time. M is a uniformly integrable martingale by theorem 11 and the hypothesis that Z is a supermartingale of class (D). Then ∞ > EZ_T = −EA_T, and in particular A_T ∈ L¹, with A_T ≥ A_{Tn} for each n. Therefore Doob's optional sampling theorem, Lebesgue's dominated convergence theorem, and the continuity of A yield

lim_n E[Z_T − Z_{Tn}] = lim_n E[M_T − M_{Tn}] − lim_n E[A_T − A_{Tn}] = −E[lim_n (A_T − A_{Tn})] = 0.   (64)

Therefore Z is regular.

Conversely, suppose that Z is regular and assume that A is not continuous at a time T. Since A is predictable, so are A− and ΔA; in particular, T can be taken to be a predictable time, so there exists an announcing sequence {Tn} ↑ T. Since Z is regular,

0 = lim_n E[Z_T − Z_{Tn}] = lim_n E[M_T − M_{Tn}] − lim_n E[A_T − A_{Tn}] = −E[ΔA_T].   (65)

Since A is an increasing process, ΔA_T ≥ 0; therefore ΔA_T = 0 a.s. This is a contradiction, and thus A is continuous.
31. Let T be an arbitrary F^µ stopping time and Λ = {ω : X_T(ω) ≠ X_{T−}(ω)}. Then, by Meyer's theorem, T = T_Λ ∧ T_{Λ^c}, where T_Λ is a totally inaccessible time and T_{Λ^c} is a predictable time. By the continuity of X, Λ = ∅ and T_Λ = ∞. Therefore T = T_{Λ^c}. It follows that all stopping times are predictable and there are no totally inaccessible stopping times.
32. By exercise 31, the standard Brownian space supports only predictable stopping times, since Brownian motion is clearly a strong Markov Feller process. Since O = σ([S, T) : S, T stopping times, S ≤ T) and P = σ([S, T) : S, T predictable times, S ≤ T), if all stopping times are predictable then O = P.
35. E(M)t ≥ 0 and E(M)t = exp[B^τ_t − ½(t ∧ τ)] ≤ e, so E(M) is a bounded local martingale and hence a martingale. If E(−M) were a uniformly integrable martingale, E(−M)∞ would exist with E[E(−M)∞] = 1. By the law of the iterated logarithm, lim_{t→∞} exp(−Bt − t/2) 1_{{τ=∞}} = 0 a.s., so E(−M)∞ = exp(−1 − τ/2) 1_{{τ<∞}} (using Bτ = 1 on {τ < ∞}). Then

E[E(−M)∞] = E[exp(−1 − τ/2) 1_{{τ<∞}}] ≤ e^{−1} < 1.   (66)

This implies that E(−M) is not a uniformly integrable martingale.
Pick X_T ∈ F_T. Assume first that X_T is bounded, and let Mt = E[X_T | F_{t∧T}]. Since X_T is bounded, and in particular in L¹, this process is well defined. Then M is a martingale with M_T = X_T, so X_T = M_T = M_{T−} + ΔM_T, where M_{T−} is the left-continuous process M_{t−} evaluated at T. Clearly ΔM_T ∈ σ{ΔM_T : M a martingale}, and since {M_{t−}} is a predictable process, M_{T−} ∈ F_{T−}. Thus X_T = M_T ∈ F_{T−} ∨ σ{ΔM_T : M a martingale}. For unbounded X_T, set X^n_T = X_T 1_{{|X_T|<n}}. Then X^n_T → X_T a.s., while X^n_T ∈ F_{T−} ∨ σ{ΔM_T : M a martingale} for each n. Then X_T ∈ F_{T−} ∨ σ{ΔM_T : M a martingale}, and therefore F_T ⊂ F_{T−} ∨ σ{ΔM_T : M a martingale}.