Zhongmin QIAN
Mathematical Institute, University of Oxford
22 November 2005
Contents

1 Brownian motion
  1.1 Probability space
  1.2 Brownian motion
      1.2.1 The scaling property
      1.2.2 Markov property and finite-dimensional distributions
      1.2.3 The reflection principle
      1.2.4 Martingale property
  1.3 Quadratic variational processes

2 Itô's calculus
  2.1 Introduction
  2.2 Stochastic integrals for simple processes
  2.3 Stochastic integrals for adapted processes
      2.3.1 The space of square-integrable martingales
      2.3.2 Stochastic integrals as martingales
      2.3.3 Summary of main properties
  2.4 Stochastic integrals along martingales
  2.5 Stopping times, local martingales
      2.5.1 Technique of localization
      2.5.2 Integration theory for semimartingales
  2.6 Itô's formula
      2.6.1 Itô's formula for BM
      2.6.2 Proof of Itô's formula
  2.7 Selected applications of Itô's formula
      2.7.1 Lévy's characterization of Brownian motion
      2.7.2 Time-changes of Brownian motion
      2.7.3 Stochastic exponentials
      2.7.4 Exponential inequality
      2.7.5 Girsanov's theorem
Introduction
A stochastic differential equation is simply a differential equation perturbed by random noise (whose intensity \sigma(t, X_t) depends on the time t and the position X_t), and therefore has the following form:
\frac{dX_t}{dt} = A(t, X_t) + \sigma(t, X_t)\,\dot{W}_t .
The central limit theorem in probability theory suggests that \dot{W}_t should have a normal distribution and, for the sake of simplicity, that its values at different times should be independent. Such random noise can be ideally modelled via Brownian motion (W_t)_{t\ge 0}, a mathematical model describing the random movements of pollen particles in a liquid, observed by R. Brown in 1827. The mathematical model for Brownian motion and the description of its distribution were derived by Albert Einstein in a short paper, "On the motion of small particles suspended in liquids at rest required by the molecular-kinetic theory of heat", published in 1905, Annalen der Physik 17, 549-560. A few years earlier, in 1900, L. Bachelier had submitted his Ph.D. thesis, in which he used Brownian motion to model stock markets. His results were published in a paper titled "Théorie de la spéculation", Ann. Sci. École Norm. Sup. 17 (1900), 21-86, which is the first paper devoted to applications of Brownian motion to finance.
On the other hand, the first mathematical construction of Brownian motion was only achieved in 1923, when N. Wiener published his article "Differential space", J. Math. Phys. 2, 132-174. Fruitful results and many unusual features of Brownian motion were revealed mainly by Paul Lévy in the 1930s and 1940s. Among them, P. Lévy showed that almost surely t \mapsto W_t is nowhere differentiable, and therefore the time-derivative of Brownian motion, \dot{W}_t, does not exist. It is thus necessary to rewrite the previous differential equation in the differential form
dX_t = A(t, X_t)\, dt + \sigma(t, X_t)\, dW_t ,
in which the term \sigma(t, X_t)\, dW_t does not exist in the ordinary sense. It was Itô who, in the 1940s, first established an integration theory with respect to Brownian motion, and therefore the theory of stochastic differential equations. Among the manifold applications and connections with PDEs, one of the most remarkable applications of Itô's theory is in the theory of finance. Although Itô's theory never caught the attention of the Fields medal committee, it was brought worldwide recognition when H. Markowitz, W. Sharpe and M. Miller were awarded the 1990 Nobel Prize, and M. Scholes and R. Merton the 1997 Nobel Prize, both in Economics.
This course covers the core part of Itô's calculus: it provides the necessary background in stochastic analysis for those who are interested in stochastic modelling and its applications, including the theory of finance, stochastic control and filtering, etc. Students who are majoring in (pure and applied) analysis, differential geometry, functional analysis, harmonic analysis, mathematical physics and PDEs will find this course relevant to their interests.
References:
Below is a list of textbooks and monographs on stochastic differential equations and related topics. Items 3 and 4 are recommended reading for this course.
1. N. Ikeda and S. Watanabe: Stochastic Differential Equations and Diffusion Processes. North-Holland (1981).
2. I. Karatzas and S. E. Shreve: Brownian Motion and Stochastic Calculus. Graduate Texts in Mathematics, Springer-Verlag (1988).
3. F. C. Klebaner: Introduction to Stochastic Calculus with Applications. Imperial College Press (1998).
4. B. Øksendal: Stochastic Differential Equations. 6th Edition, Universitext, Springer (2003).
5. D. Revuz and M. Yor: Continuous Martingales and Brownian Motion. Springer-Verlag (1991).
6. L. C. G. Rogers and D. Williams: Diffusions, Markov Processes and Martingales, Volume 2: Itô Calculus. Cambridge Mathematical Library, Cambridge University Press (2000).
7. S. E. Shreve: Stochastic Calculus for Finance II: Continuous-Time Models. Springer Finance Textbook, Springer (2004).
Web page:
http://www.maths.ox.ac.uk/qianz/ private/SDE05.htm
Chapter 1
Brownian motion
Let us first set up our working framework and introduce several notions.
X^{-1}(B) = \{ \omega : X(\omega) \in B \}
E|X| \equiv \int_\Omega |X(\omega)|\, P(d\omega) < \infty
X_t = n \quad \text{if } T_n \le t < T_{n+1} .
does not make sense, unless additional conditions on (X_t)_{t\ge 0} are imposed. In particular, a function like \sup_{t\in K} X_t may not be a random variable.
are measurable.
Definition 1.1.6 Two stochastic processes X = (X_t)_{t\ge 0} and Y = (Y_t)_{t\ge 0} are equivalent (each is a modification of the other) if for every t \ge 0 we have
P\{ \omega : X_t(\omega) = Y_t(\omega) \} = 1 .
1. P\{B_0 = 0\} = 1.
2. (B_t)_{t\ge 0} possesses independent increments: for any 0 \le t_0 < t_1 < \cdots < t_n the random variables
B_{t_1} - B_{t_0},\; B_{t_2} - B_{t_1},\; \ldots,\; B_{t_n} - B_{t_{n-1}}
are independent.
p(t-s, x) = \frac{1}{(2\pi(t-s))^{d/2}}\, e^{-\frac{|x|^2}{2(t-s)}} ; \quad x \in \mathbb{R}^d .
In other words, since
\int_{\mathbb{R}^d} p(t, x, z)\, p(s, z, y)\, dz = p(t+s, x, y) ,
(P_t)_{t\ge 0} is a semigroup on C_b(\mathbb{R}^d). (P_t)_{t\ge 0} is called the heat semigroup in \mathbb{R}^d: if f \in C_b^2(\mathbb{R}^d), then u(t, x) = (P_t f)(x) solves the heat equation
\left( \frac{1}{2}\Delta - \frac{\partial}{\partial t} \right) u(t, x) = 0 ; \qquad u(0, \cdot) = f ,
where \Delta = \sum_i \frac{\partial^2}{\partial x_i^2} is the Laplace operator.
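To make the semigroup concrete, here is a minimal numerical sketch (in Python, under the illustrative assumptions d = 1 and f = cos, for which P_t f is known in closed form): it estimates (P_t f)(x) = E f(x + B_t) by Monte Carlo and compares it with e^{-t/2} cos x, a solution of the heat equation above.

import numpy as np

# Monte Carlo sketch: (P_t f)(x) = E[f(x + B_t)] with B_t ~ N(0, t) in d = 1.
# For f = cos the exact value is (P_t f)(x) = exp(-t/2) * cos(x), since
# u(t, x) = exp(-t/2) cos(x) solves du/dt = (1/2) u''.
rng = np.random.default_rng(0)
t, x, n = 0.7, 0.3, 10**6
samples = x + np.sqrt(t) * rng.standard_normal(n)   # samples of x + B_t
mc = np.cos(samples).mean()                          # Monte Carlo estimate
exact = np.exp(-t / 2) * np.cos(x)
print(mc, exact)                                     # the two values agree closely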
The connection between Brownian motion and the Laplace operator (and hence with harmonic analysis) is demonstrated through the following identity:
E|B_t - B_s|^p = \frac{1}{\sqrt{2\pi|t-s|}} \int_{\mathbb{R}} |x|^p \exp\left( -\frac{|x|^2}{2|t-s|} \right) dx .
We thus have
E|B_t - B_s|^p = \frac{(\sqrt{|t-s|})^p}{\sqrt{2\pi}} \int_{\mathbb{R}} |x|^p \exp\left( -\frac{|x|^2}{2} \right) dx = c_p\, |t-s|^{p/2} ,
where
c_p = \frac{1}{\sqrt{2\pi}} \int_{\mathbb{R}} |x|^p \exp\left( -\frac{|x|^2}{2} \right) dx .
In particular, since for a standard normal variable Z one has E|Z|^{2n} = (2n-1)!! = \frac{(2n)!}{2^n n!},
E(B_t - B_s)^{2n} = \frac{(2n)!}{2^n n!}\, |t-s|^n .
H = \bigcup_{N=1}^{\infty} \bigcap_{l=1}^{\infty} \bigcup_{n=l}^{\infty} \bigcup_{j=1}^{N2^{n}} \left\{ \left| X_{j/2^{n}} - X_{(j-1)/2^{n}} \right| \ge \frac{1}{2^{n/8}} \right\} ,
A_l = \bigcup_{n=l}^{\infty} \bigcup_{j=1}^{N2^{n}} \left\{ \left| X_{j/2^{n}} - X_{(j-1)/2^{n}} \right| \ge \frac{1}{2^{n/8}} \right\} .
For each n,
P\left( \bigcup_{j=1}^{N2^{n}} \left\{ \left| X_{j/2^{n}} - X_{(j-1)/2^{n}} \right| \ge \frac{1}{2^{n/8}} \right\} \right)
\le \sum_{j=1}^{N2^{n}} P\left( \left| X_{j/2^{n}} - X_{(j-1)/2^{n}} \right| \ge \frac{1}{2^{n/8}} \right)
= N 2^{n}\, P\left( \left| X_{1/2^{n}} \right| \ge \frac{1}{2^{n/8}} \right)
\le N 2^{n}\, 2^{n/2}\, E\left| X_{1/2^{n}} \right|^{4}
= N 2^{n}\, 2^{n/2} \cdot 3 \left( \frac{1}{2^{n}} \right)^{2}
= 3N\, \frac{1}{2^{n/2}} ,
so that
P(A_l) \le \sum_{n=l}^{\infty} P\left( \bigcup_{j=1}^{N2^{n}} \left\{ \left| X_{j/2^{n}} - X_{(j-1)/2^{n}} \right| \ge \frac{1}{2^{n/8}} \right\} \right)
\le \sum_{n=l}^{\infty} 3N\, \frac{1}{2^{n/2}}
= \frac{3\sqrt{2}\, N}{\sqrt{2}-1} \cdot \frac{1}{2^{l/2}} .
Therefore
P\left( \bigcap_{l=1}^{\infty} A_l \right) = \lim_{l\to\infty} P(A_l) \le \lim_{l\to\infty} \frac{3\sqrt{2}\, N}{\sqrt{2}-1} \cdot \frac{1}{2^{l/2}} = 0 .
It follows that P(H) = 0, thus P(H^{c}) = 1. On the other hand, by De Morgan's law,
H^{c} = \bigcap_{N=1}^{\infty} \bigcup_{l=1}^{\infty} \bigcap_{n=l}^{\infty} \bigcap_{j=1}^{N2^{n}} \left\{ \omega : \left| X_{j/2^{n}}(\omega) - X_{(j-1)/2^{n}}(\omega) \right| < \frac{1}{2^{n/8}} \right\}
and thus, if \omega \in H^{c}, then for any N there is an l such that for any n > l and for all j = 1, \ldots, N2^{n} we have
\left| X_{j/2^{n}}(\omega) - X_{(j-1)/2^{n}}(\omega) \right| < \frac{1}{2^{n/8}} .
We may thus show that for any \omega \in H^{c} and t \ge 0 the limit of X_s(\omega) exists as s \to t along the dyadic numbers, i.e. as s \to t and s \in D. Moreover, D is dense in [0, \infty), thus for any t \in [0, \infty) we may define
B_t(\omega) = \lim_{s\in D,\, s\to t} X_s(\omega) \quad \text{if } \omega \in H^{c}
Remark 1.2.9 To convince yourself why the law of large numbers for BM is true, we may look at a special way t \to \infty through the natural numbers, namely
\lim_{n\to\infty} \frac{B_n}{n} = \lim_{n\to\infty} \frac{X_1 + \cdots + X_n}{n} ,
where X_i = B_i - B_{i-1}. Notice that (X_i) is a sequence of independent random variables with identical distribution N(0, 1), so that by the strong law of large numbers
\frac{X_1 + \cdots + X_n}{n} \to E X_1 = 0 \quad \text{almost surely.}
In order to handle the general case t \to \infty, we may write t = [t] + r_t, where [t] is the integer part of t and r_t \in [0, 1). Then
\frac{B_t}{t} = \frac{B_t - B_{[t]}}{t} + \frac{[t]}{t} \cdot \frac{B_{[t]}}{[t]} ;
the second term tends to 0 since, as t \to \infty, [t]/t \to 1 and B_{[t]}/[t] \to 0. To see why
\frac{B_t - B_{[t]}}{t} \to 0
as t \to \infty, we need the following Gaussian tail estimate for BM (see Section 1.2.3 below):
P\left\{ \omega : \sup_{t\in[0,T]} |B_t(\omega)| \ge R \right\} \le 2\sqrt{\frac{2}{\pi}} \int_{R/\sqrt{T}}^{\infty} e^{-x^2/2}\, dx \le 2\exp\left( -\frac{R^2}{2T} \right) \quad \text{for all } R > 0 .
For more detail, see D. Stroock: Probability Theory: An Analytic View, page
180-181.
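For illustration, here is a small simulation sketch (in Python; the time grid is an assumption, and discretisation slightly underestimates the true supremum) comparing the empirical probability with the bound above.

import numpy as np

# Compare the empirical probability P( sup_{t<=T} |B_t| >= R ), computed on a
# discrete grid, with the bound 2 exp(-R^2 / (2T)) quoted above.
rng = np.random.default_rng(1)
T, R, n_steps, n_paths = 1.0, 2.0, 1000, 20000
dt = T / n_steps
increments = np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
paths = np.cumsum(increments, axis=1)        # B_{dt}, B_{2 dt}, ..., B_T
sup_abs = np.abs(paths).max(axis=1)          # sup of |B| over the grid
print((sup_abs >= R).mean(), 2 * np.exp(-R**2 / (2 * T)))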
and
p(t, x, y) = \frac{1}{(2\pi t)^{d/2}} \exp\left( -\frac{|x-y|^2}{2t} \right)
is the Gaussian kernel. To make this statement more precise, let us recall two notions from probability theory.
Conditional Expectation
The concept of conditional expectation is perhaps the most important one in the theory of probability. The properties studied in probability theory, such as independence, the Markov property and the martingale property, can be re-phrased in terms of conditional expectations. It is a good idea to review what you have learned about conditioning in Mods and Part A probability courses: the conditional probability P(A|B), the conditional expectation E(X|A), and the conditional probability density function (the density used in computing E(X|Y)), and to relate all these notions to the conditional expectation E(X|\mathcal{G}) (see the definition below).
The formal definition stated below was formulated by J. L. Doob. Let X be an integrable or non-negative random variable with values in R on a probability space (\Omega, F, P), and let \mathcal{G} be a sub-\sigma-algebra of F. Then the conditional expectation E(X|\mathcal{G}) of X given \mathcal{G} is a \mathcal{G}-measurable random variable (unique up to almost sure equality) such that
E\{ E(X|\mathcal{G})\, 1_A \} = E\{ X 1_A \} \quad \text{for every } A \in \mathcal{G} .
More generally,
E\{ E(X|\mathcal{G})\, Y \} = E\{ X Y \}
for every bounded, \mathcal{G}-measurable random variable Y. If X is independent of \mathcal{G}, then
E(X|\mathcal{G}) = E X .
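A minimal worked example (a sketch for orientation, under the assumption that \mathcal{G} is generated by a single event B with 0 < P(B) < 1): the defining property above then forces
E(X \mid \mathcal{G}) = \frac{E(X 1_B)}{P(B)}\, 1_B + \frac{E(X 1_{B^c})}{P(B^c)}\, 1_{B^c} ,
so the abstract definition reduces to the elementary conditional expectations E(X|B) and E(X|B^c) on the two atoms of \mathcal{G}.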
F_t^0 = \sigma\{ X_s : s \le t \} .
Then \{F_t^0\}_{t\ge 0} is an increasing family of sub-\sigma-algebras of F, and for each t \ge 0, X_t is measurable with respect to F_t^0 (we then say that the process (X_t)_{t\ge 0} is adapted to \{F_t^0\}_{t\ge 0}). Obviously, \{F_t^0\}_{t\ge 0} is the smallest increasing family of sub-\sigma-algebras with this property. \{F_t^0\}_{t\ge 0} is called the filtration generated by the process X = (X_t)_{t\ge 0}. In general, we introduce
\{ \omega : X_t(\omega) \in B \} \in F_t
p(t, x) = \frac{1}{(2\pi t)^{d/2}}\, e^{-\frac{|x|^2}{2t}}
E f(B_s, B_t) = E f(B_s, (B_t - B_s) + B_s) = \iint f(x_1, x_2 + x_1)\, p(s, x_1)\, p(t-s, x_2)\, dx_1\, dx_2 .
Theorem 1.2.13 For any t > s and any bounded Borel measurable function f we have
E\{ f(B_t) \mid F_s \} = P_{t-s} f(B_s) \quad \text{a.s.} \qquad (1.2)
where (P_t)_{t>0} is the heat semigroup. In particular E\{f(B_t)|F_s\} = E\{f(B_t)|B_s\}, which is called the Markov property, and E\{f(B_t)|F_s\} equals F(B_s), where
F(x) = P_{t-s} f(x) \equiv \frac{1}{(2\pi(t-s))^{d/2}} \int_{\mathbb{R}^d} f(y)\, e^{-\frac{|x-y|^2}{2(t-s)}}\, dy ,
which is called the Markov property of (B_t)_{t\ge 0}. Clearly we only need to prove this for bounded continuous (and smooth) functions f. For such a function, we can show that
f(x + y) = \lim_{n\to\infty} \sum_{k=1}^{N_n} f_{nk}(x)\, g_{nk}(y)
for some functions f_{nk}, g_{nk} (for example, by taking the Taylor expansion of f(x+y)). Hence
To compute the conditional expectation E\{f(B_t)|B_s\}, we use the fact that the pdf of (B_s, B_t) is
p(s, x)\, p(t-s, y-x) ,
so that
E\{ 1_A(B_s) f(B_t) \} = \iint 1_A(x) f(y)\, p(s, x)\, p(t-s, y-x)\, dx\, dy
= \int 1_A(x)\, P_{t-s} f(x)\, p(s, x)\, dx
= E\{ 1_A(B_s)\, P_{t-s} f(B_s) \} ,
as
P_{t-s} f(x) = \int f(y)\, p(t-s, y-x)\, dy .
Proposition 1.2.14 For any 0 < t_1 < t_2 < \cdots < t_n, the (\mathbb{R}^{nd}-valued) random variable (B_{t_1}, \ldots, B_{t_n}) has a pdf
p(t_1, x_1)\, p(t_2 - t_1, x_2 - x_1) \cdots p(t_n - t_{n-1}, x_n - x_{n-1}) ,
where
p(t, x) = \frac{1}{(2\pi t)^{d/2}}\, e^{-|x|^2/(2t)}
is a standard Gaussian pdf in \mathbb{R}^d. That is, the joint distribution of (B_{t_1}, \ldots, B_{t_n}) is given by
P\{ B_{t_1} \in dx_1, \ldots, B_{t_n} \in dx_n \}
= p(t_1, x_1)\, p(t_2 - t_1, x_2 - x_1) \cdots p(t_n - t_{n-1}, x_n - x_{n-1})\, dx_1 \cdots dx_n . \qquad (1.3)
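As a quick illustration, here is a minimal Python sketch (the function name and parameters are chosen for the example only): by the proposition, a sample of (B_{t_1}, \ldots, B_{t_n}) can be generated from independent Gaussian increments with variances t_1, t_2 - t_1, \ldots, t_n - t_{n-1}.

import numpy as np

# Sample (B_{t_1}, ..., B_{t_n}) in d dimensions by summing independent
# Gaussian increments, as prescribed by the finite-dimensional distributions.
def sample_bm(times, d=1, rng=np.random.default_rng()):
    times = np.asarray(times, dtype=float)
    dt = np.diff(np.concatenate(([0.0], times)))              # t_1, t_2 - t_1, ...
    increments = rng.standard_normal((len(times), d)) * np.sqrt(dt)[:, None]
    return np.cumsum(increments, axis=0)                      # B_{t_1}, ..., B_{t_n}

print(sample_bm([0.5, 1.0, 2.5]))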
where the first equality follows from the fact that the Brownian motion starting at T_b (in position b: B_{T_b} = b) runs afresh like a Brownian motion, so that it moves with equal probability about the line y = b. The second equality follows from 2b - a = b + (b - a) \ge b.
The above equation may be written as
P\{ T_b \le t,\; B_t \le a \} = P\{ T_b \le t,\; B_t \ge 2b - a \} = P\{ B_t \ge 2b - a \} ,
which can be justified by the strong Markov property of Brownian motion, a topic that we will not pursue here. Therefore
P\left\{ \sup_{s\in[0,t]} B_s \ge b,\; B_t \le a \right\} = \frac{1}{\sqrt{2\pi t}} \int_{2b-a}^{+\infty} e^{-\frac{x^2}{2t}}\, dx ,
which gives us the joint distribution of a Brownian motion and its maximum
at a fixed time t. By differentiating in a and in b we conclude the following
Theorem 1.2.16 Let B = (B_t)_{t\ge 0} be a standard BM in R, and let t > 0. Then the pdf of the joint distribution of the random variables (\sup_{s\in[0,t]} B_s, B_t) is given by
P\left\{ \sup_{s\in[0,t]} B_s \in db,\; B_t \in da \right\} = \frac{2(2b-a)}{\sqrt{2\pi t^3}} \exp\left( -\frac{(2b-a)^2}{2t} \right) da\, db .
In particular,
P\left\{ \sup_{s\in[0,t]} B_s \ge b \right\}
= \iint_{\{a\le c,\; c\ge b\}} \frac{2(2c-a)}{\sqrt{2\pi t^3}} \exp\left( -\frac{(2c-a)^2}{2t} \right) da\, dc
= \int_b^{+\infty} \int_{-\infty}^{c} \frac{2(2c-a)}{\sqrt{2\pi t^3}} \exp\left( -\frac{(2c-a)^2}{2t} \right) da\, dc
= \int_b^{+\infty} \int_{c}^{+\infty} \frac{2x}{\sqrt{2\pi t^3}} \exp\left( -\frac{x^2}{2t} \right) dx\, dc
= \sqrt{\frac{2}{\pi t}} \int_b^{+\infty} \exp\left( -\frac{x^2}{2t} \right) dx ,
which is the exact distribution function of \sup_{s\in[0,t]} B_s (equivalently, of the stopping time T_b) and leads to an exact formula for the tail probability of Brownian motion.
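As a by-product (a short computation, with b > 0 assumed), differentiating the last expression in t gives the density of the hitting time T_b:
P\{T_b \le t\} = \sqrt{\frac{2}{\pi t}} \int_b^{+\infty} e^{-x^2/(2t)}\, dx = 2\, P\{B_t \ge b\} ,
\qquad
\frac{d}{dt}\, P\{T_b \le t\} = \frac{b}{\sqrt{2\pi t^3}}\, e^{-b^2/(2t)} .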
resp.
E(X_t \mid F_s) \le X_s \quad \text{(super-martingale)},
resp.
E(X_t \mid F_s) \ge X_s \quad \text{(sub-martingale)}.
Proposition 1.2.19 1) Each B_t is p-th power integrable for any p > 0, and for t > s
E\left( |B_t - B_s|^p \right) = c_{p,d}\, |t-s|^{p/2} . \qquad (1.4)
so that
E(B_t \mid F_s) = E(B_s \mid F_s) = B_s
so that
M_t \equiv \exp\left( i\langle \theta, B_t \rangle + \frac{|\theta|^2}{2}\, t \right)
is a martingale.
Remark 1.2.22 Note that both sides of (1.5) are analytic in \theta, so that the identity continues to hold for any complex vector \theta. In particular, by replacing \theta by i\theta we obtain that
E\left\{ \exp\left( \langle \theta, B_t - B_s \rangle \right) \mid F_s \right\} = \exp\left( \frac{(t-s)|\theta|^2}{2} \right) ,
so that for any vector \theta
\exp\left( \langle \theta, B_t \rangle - \frac{|\theta|^2}{2}\, t \right)
is a continuous martingale. This statement will be extended to vector fields in R. The resulting identity is called the Cameron-Martin formula.
for some d \times r matrix A, vector b and the Lévy measure \nu(dx) of (X_t), which is a \sigma-finite measure on \mathbb{R}^d \setminus \{0\} satisfying the following integrability condition:
\int_{\mathbb{R}^d \setminus \{0\}} \frac{|x|^2}{1 + |x|^2}\, \nu(dx) < +\infty . \qquad (1.6)
Proof. Indeed
E V_D = \sum_{l=1}^{n} E|B_{t_l} - B_{t_{l-1}}|^2 = \sum_{l=1}^{n} (t_l - t_{l-1}) = t .
To prove the second formula we proceed as follows:
E\left\{ (V_D - E V_D)^2 \right\}
= E\left( \sum_{l=1}^{n} |B_{t_l} - B_{t_{l-1}}|^2 - t \right)^2
= E\left( \sum_{l=1}^{n} \left[ |B_{t_l} - B_{t_{l-1}}|^2 - (t_l - t_{l-1}) \right] \right)^2
= \sum_{k,l=1}^{n} E\left[ \left( |B_{t_k} - B_{t_{k-1}}|^2 - (t_k - t_{k-1}) \right)\left( |B_{t_l} - B_{t_{l-1}}|^2 - (t_l - t_{l-1}) \right) \right]
= \sum_{l=1}^{n} E\left\{ \left( |B_{t_l} - B_{t_{l-1}}|^2 - (t_l - t_{l-1}) \right)^2 \right\}
+ \sum_{k\ne l} E\left[ \left( |B_{t_k} - B_{t_{k-1}}|^2 - (t_k - t_{k-1}) \right)\left( |B_{t_l} - B_{t_{l-1}}|^2 - (t_l - t_{l-1}) \right) \right] .
Since the increments over different intervals are independent, the expectation of each product in the last sum on the right-hand side equals the product of the expectations, which is zero. Therefore
E\left\{ (V_D - E V_D)^2 \right\}
= \sum_{l=1}^{n} E\left\{ \left( |B_{t_l} - B_{t_{l-1}}|^2 - (t_l - t_{l-1}) \right)^2 \right\}
= \sum_{l=1}^{n} E\left[ |B_{t_l} - B_{t_{l-1}}|^4 - 2(t_l - t_{l-1})|B_{t_l} - B_{t_{l-1}}|^2 + (t_l - t_{l-1})^2 \right]
= \sum_{l=1}^{n} \left[ E|B_{t_l} - B_{t_{l-1}}|^4 - 2(t_l - t_{l-1})\, E|B_{t_l} - B_{t_{l-1}}|^2 + (t_l - t_{l-1})^2 \right]
= 2\sum_{l=1}^{n} (t_l - t_{l-1})^2
\lim_{m(D)\to 0} \sum_{l} |B_{t_l} - B_{t_{l-1}}|^2 = t \quad \text{in } L^2(\Omega, P)
for any t, where D runs over all finite partitions of the interval [0, t]. Since convergence in L^2 implies convergence in probability, therefore
\lim_{m(D)\to 0} \sum_{l} |B_{t_l} - B_{t_{l-1}}|^2 = t \quad \text{in probability.}
and therefore
\lim_{m(D)\to 0} E\left( \sum_{l} |B_{t_l} - B_{t_{l-1}}|^2 - t \right)^2 = 0 .
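For illustration, a small Python sketch (the partition sizes and seed are arbitrary choices): the quadratic variation sums over partitions of [0, t] approach t as the mesh shrinks.

import numpy as np

# Quadratic variation of a simulated Brownian path over finer and finer
# uniform partitions of [0, t]; the sums should tend to t.
rng = np.random.default_rng(2)
t = 2.0
for n_steps in (10, 100, 1000, 10000):
    dt = t / n_steps
    increments = np.sqrt(dt) * rng.standard_normal(n_steps)   # B_{t_l} - B_{t_{l-1}}
    print(n_steps, np.sum(increments**2))                     # tends to t = 2.0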
P\left( \limsup_{n\to\infty} A_n \right) = 1 .
Proposition 1.3.3 Let (B_t)_{t\ge 0} be a standard BM in R. Then for any t > 0 we have
\sum_{j=1}^{2^n} \left( B_{\frac{j}{2^n} t} - B_{\frac{j-1}{2^n} t} \right)^2 \to t \quad \text{a.s.} \qquad (1.7)
as n \to \infty.
D_n = \left\{ 0 = \frac{0}{2^n} t < \frac{1}{2^n} t < \cdots < \frac{2^n}{2^n} t = t \right\} .
Remark 1.3.4 Indeed the conclusion is true for monotone (refining) partitions. More precisely, for each n let D_n = \{0 = t_{0,n} < t_{1,n} < \cdots < t_{n_k,n} = t\} be a finite partition of [0, t], with D_n \subset D_{n+1}. Then
\sum_{i=1}^{n_k} \left( B_{t_{i,n}} - B_{t_{i-1,n}} \right)^2 \to t \quad \text{a.s.} \qquad (1.8)
as n \to \infty. Indeed, in this case, if we denote by M_n the left-hand side of (1.8), then (M_n)_{n\ge 1} is a non-negative martingale with respect to the filtration G_n = \sigma\{ B_{t_{i,n}} : i = 0, \ldots, n_k \}, so (1.8) follows from the martingale convergence theorem.
if p > 2, where the sup is taken over all finite partitions of [0, 1], and
\sup_D \sum_l \left| B_{t_l} - B_{t_{l-1}} \right|^2 = \infty \quad \text{a.s.}
That is to say, Brownian motion has finite p-variation for any p > 2. Indeed, almost all Brownian motion sample paths are \alpha-Hölder continuous for any \alpha < 1/2 but not for \alpha = 1/2. It thus follows that almost all Brownian motion paths are nowhere differentiable. We will not go into a deep study of the sample paths of BM, which is not needed in order to develop Itô's calculus for Brownian motion.
In view of these results, let us introduce the following
where D runs over all finite partitions of [0, T]. A function f(t) in \mathbb{R}^d has finite (total) variation if it has finite 1-variation.
\langle M \rangle_t = \lim_{m(D)\to 0} \sum_l \left| M_{t_l} - M_{t_{l-1}} \right|^2
exists both in probability and in the L^2-norm, where the limit is taken over all finite partitions D of the interval [0, t]. By definition, t \mapsto \langle M \rangle_t is an adapted, continuous, increasing stochastic process (and therefore has finite variation). The following theorem demonstrates the importance of the adapted increasing process \langle M \rangle_t.
A_0 = 0 ;
A_n = A_{n-1} + E(X_n - X_{n-1} \mid F_{n-1}), \quad n = 1, 2, \ldots .
Then
3. By definition
therefore M_n = X_n - A_n is a martingale.
Theorem 1.3.8 Let (M_t)_{t\ge 0} and (N_t)_{t\ge 0} be two continuous, square-integrable martingales, and let
\langle M, N \rangle_t = \frac{1}{4}\left( \langle M + N \rangle_t - \langle M - N \rangle_t \right) ,
called the bracket process of M and N. Then \langle M, N \rangle_t is the unique adapted, continuous process of finite variation, with initial value zero, such that M_t N_t - \langle M, N \rangle_t is a martingale. Moreover
\lim_{m(D)\to 0} \sum_{l=1}^{n} (M_{t_l} - M_{t_{l-1}})(N_{t_l} - N_{t_{l-1}}) = \langle M, N \rangle_t \quad \text{in prob.} \qquad (1.9)
These claims will be proven below, after we have established Itô's lemma for Brownian motion.
Chapter 2

Itô's calculus
2.1 Introduction
Let B = (B_t)_{t\ge 0} be a standard Brownian motion in R on a complete probability space (\Omega, F, P) and let (F_t)_{t\ge 0} be its natural filtration, that is, for each t \ge 0,
F_t = \sigma\{ B_s \text{ for } s \le t ;\ \text{null events} \} ,
the history of the Brownian motion B = (B_t)_{t\ge 0} up to time t, in other words the information accumulated by the Brownian motion B up to time t.
We are going to define Itô's integrals of the following form:
\int_0^t F_s\, dB_s \quad \text{for } t \ge 0
does not work: the limit of the Riemann sums does not exist. The limit exists, however, in a probability sense, if for any finite partition we properly choose t_i^* \in [t_{i-1}, t_i] and if the integrand process (F_t)_{t\ge 0} is adapted to the Brownian filtration (F_t)_{t\ge 0}, that is, for each t \ge 0, F_t is measurable with respect to F_t. This approach works because both (B_t)_{t\ge 0} and (B_t^2 - t)_{t\ge 0} are continuous martingales.
In summary, Itô's integral \int_0^t F_s\, dB_s of an adapted process F = (F_t)_{t\ge 0} (such that F is measurable with respect to the product \sigma-algebra B([0,\infty)) \otimes F, a condition you are advised to forget at a first reading) with respect to the Brownian motion B = (B_t)_{t\ge 0} is simply defined to be the limit of a special sort of Riemann sums:
\int_0^t F_s\, dB_s = \lim_{m(D)\to 0} \sum_i F_{t_{i-1}} (B_{t_i} - B_{t_{i-1}}) ,
where the limit takes place in the L^2-sense (with respect to the product measure P(d\omega)dt: you are forgiven for ignoring its meaning), over finite partitions
D = \{ 0 = t_0 < t_1 < \cdots < t_n = t \}
of the interval [0, t] such that m(D) = \max_i (t_i - t_{i-1}) \to 0, and through the special Riemann sums
\sum_i F_{t_{i-1}} (B_{t_i} - B_{t_{i-1}}) ,
i.e. sums of increments of Brownian motion B over [t_{i-1}, t_i] multiplied by the value of F at the left end-point t_{i-1}. The reason to choose F_{t_{i-1}}(B_{t_i} - B_{t_{i-1}}) rather than other forms is the following: only with this choice do we have
E\left\{ F_{t_{i-1}} (B_{t_i} - B_{t_{i-1}}) \right\} = 0 \qquad (2.1)
and
E\left\{ F_{t_{i-1}}^2 (B_{t_i} - B_{t_{i-1}})^2 - F_{t_{i-1}}^2 (t_i - t_{i-1}) \right\} = 0 . \qquad (2.2)
It will become clear that it is because of these important features that this sort of Riemann sum converges to a martingale! These equations also imply that not only the resulting Itô integral
\int_0^t F_s\, dB_s
Exercise 2.1.1 Prove equations (2.1) and (2.2). Indeed, since F_{t_{i-1}} is F_{t_{i-1}}-measurable,
E\left\{ F_{t_{i-1}} (B_{t_i} - B_{t_{i-1}}) \right\} = E\left\{ E\left( F_{t_{i-1}} (B_{t_i} - B_{t_{i-1}}) \mid F_{t_{i-1}} \right) \right\}
= E\left\{ F_{t_{i-1}}\, E\left( B_{t_i} - B_{t_{i-1}} \mid F_{t_{i-1}} \right) \right\}
= E\left\{ F_{t_{i-1}} \right\} E\left( B_{t_i} - B_{t_{i-1}} \right)
= 0 .
Similarly, using the fact that (B_t^2 - t)_{t\ge 0} is a martingale,
E\left\{ (B_{t_i} - B_{t_{i-1}})^2 \mid F_{t_{i-1}} \right\} - (t_i - t_{i-1})
= E\left\{ B_{t_i}^2 \mid F_{t_{i-1}} \right\} - 2 B_{t_{i-1}} E\left\{ B_{t_i} \mid F_{t_{i-1}} \right\} + B_{t_{i-1}}^2 - (t_i - t_{i-1})
= \left( B_{t_{i-1}}^2 + t_i - t_{i-1} \right) - 2 B_{t_{i-1}}^2 + B_{t_{i-1}}^2 - (t_i - t_{i-1})
= 0 ,
and therefore
E\left\{ F_{t_{i-1}}^2 (B_{t_i} - B_{t_{i-1}})^2 - F_{t_{i-1}}^2 (t_i - t_{i-1}) \right\}
= E\left\{ E\left( F_{t_{i-1}}^2 (B_{t_i} - B_{t_{i-1}})^2 - F_{t_{i-1}}^2 (t_i - t_{i-1}) \mid F_{t_{i-1}} \right) \right\}
= E\left\{ F_{t_{i-1}}^2 \left[ E\left( (B_{t_i} - B_{t_{i-1}})^2 \mid F_{t_{i-1}} \right) - (t_i - t_{i-1}) \right] \right\}
= 0 .
so that
M_t^2 - M_0^2 = 2\int_0^t M_s\, dM_s ,
which coincides with the fundamental theorem of calculus.
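For orientation, here is a small numerical sketch (in Python; the step size and seed are arbitrary choices): left-endpoint Riemann sums approximate the Itô integral of B against dB, namely (B_t^2 - t)/2, whereas midpoint sums give the Stratonovich value B_t^2/2.

import numpy as np

# Left-endpoint sums -> Ito integral int_0^t B_s dB_s = (B_t^2 - t)/2,
# midpoint sums      -> Stratonovich value B_t^2 / 2.
rng = np.random.default_rng(3)
t, n = 1.0, 200000
dt = t / n
dB = np.sqrt(dt) * rng.standard_normal(n)
B = np.concatenate(([0.0], np.cumsum(dB)))       # B_0, B_{dt}, ..., B_t
ito = np.sum(B[:-1] * dB)                        # left endpoints
strat = np.sum(0.5 * (B[:-1] + B[1:]) * dB)      # midpoints
print(ito, (B[-1]**2 - t) / 2)                   # close to each other
print(strat, B[-1]**2 / 2)                       # equal up to rounding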
Then
\int_0^t F_s \circ dM_s = \int_0^t F_s\, dM_s + \frac{1}{2}\, \langle N, M \rangle_t .
Indeed,
\int_0^t F_s \circ dM_s = \lim_{m(D)\to 0} \sum_{l=1}^{n} \frac{N_{t_l} + N_{t_{l-1}}}{2}\, (M_{t_l} - M_{t_{l-1}})
+ \lim_{m(D)\to 0} \sum_{l=1}^{n} \frac{A_{t_l} + A_{t_{l-1}}}{2}\, (M_{t_l} - M_{t_{l-1}})
= \frac{1}{2} \lim_{m(D)\to 0} \sum_{l=1}^{n} \left( N_{t_l} - N_{t_{l-1}} \right)(M_{t_l} - M_{t_{l-1}})
+ \lim_{m(D)\to 0} \sum_{l=1}^{n} F_{t_{l-1}} (M_{t_l} - M_{t_{l-1}})
+ \frac{1}{2} \lim_{m(D)\to 0} \sum_{l=1}^{n} \left( A_{t_l} - A_{t_{l-1}} \right)(M_{t_l} - M_{t_{l-1}})
= \frac{1}{2}\, \langle M, N \rangle_t + \int_0^t F_s\, dM_s .
where 0 = t_0 < t_1 < \cdots < t_i < \cdots, such that for any finite time T \ge 0 there are only finitely many t_i \in [0, T]; each f_i \in F_{t_i}, i.e. f_i is measurable with respect to F_{t_i}, and F is a bounded process. The space of all simple (adapted) stochastic processes will be denoted by L_0. If F = (F_t)_{t\ge 0} \in L_0, then Itô's integral of F against Brownian motion B = (B_t)_{t\ge 0} is defined as
I(F)_t \equiv \sum_{i=0}^{\infty} f_i \left( B_{t\wedge t_{i+1}} - B_{t\wedge t_i} \right) ,
where the sum makes sense because only finitely many terms may not vanish. It is obvious that I(F) = (I(F)_t)_{t\ge 0} is continuous, square-integrable, and adapted to (F_t)_{t\ge 0}.
and
E\left( f\, (M_t - M_s)^2 \mid F_s \right) = E\left( f\, (\langle M \rangle_t - \langle M \rangle_s) \mid F_s \right) .
The second equality is trivial, as f \in F_s and can be moved out of the conditional expectation.
If k < j - 1, then
I(F)_t - I(F)_s = \sum_{i=k+1}^{j-1} f_i (B_{t_{i+1}} - B_{t_i}) + f_j (B_t - B_{t_j}) + f_k (B_{t_{k+1}} - B_s) . \qquad (2.4)
In the first equality we have used the fact that f_i \in F_{t_i}, and in the second equality the martingale property of (B_t). Similarly
E\left\{ f_j (B_t - B_{t_j}) \mid F_s \right\} = 0, \quad t > t_j \ge s,\ f_j \in F_{t_j} ,
E\left\{ f_k (B_{t_{k+1}} - B_s) \mid F_s \right\} = 0, \quad t_{k+1} \ge s > t_k,\ f_k \in F_{t_k} \subset F_s .
Using Lemma 2.2.1 below and the fact that both (B_t)_{t\ge 0} and (B_t^2 - t)_{t\ge 0} are martingales, we get
E\left\{ (I(F)_t - I(F)_s)^2 \mid F_s \right\}
= E\left\{ \sum_{i=k+1}^{j-1} f_i^2 (t_{i+1} - t_i) + f_j^2 (t - t_j) + f_k^2 (t_{k+1} - s) \,\middle|\, F_s \right\} ,
so that
E\left\{ (I(F)_t - I(F)_s)^2 \mid F_s \right\} = E\left\{ \int_s^t F_u^2\, du \,\middle|\, F_s \right\} .
E\left\{ M_T^2 - \langle M \rangle_T \right\} = E\left\{ M_0^2 - \langle M \rangle_0 \right\} = 0 ,
that is,
E M_T^2 = E \langle M \rangle_T . \qquad (2.6)
M_t N_t - \langle M, N \rangle_t
is a martingale, where
\langle M, N \rangle_t = \frac{1}{4}\left( \langle M + N \rangle_t - \langle M - N \rangle_t \right)
is called the bracket process of M and N. According to the definition,
\langle M, N \rangle_t = \lim_{m(D)\to 0} \sum_{l=1}^{k} \left( M_{t_l} - M_{t_{l-1}} \right)\left( N_{t_l} - N_{t_{l-1}} \right) \quad \text{in probability,}
where the limit is taken over all finite partitions \{0 = t_0 < t_1 < \cdots < t_k = t\} of [0, t].
The space of square-integrable martingales M_c^2 is endowed with the following distance:
d(M, N) = \sum_{n=1}^{\infty} \frac{1}{2^n} \left\{ \sqrt{E|M_n - N_n|^2} \wedge 1 \right\} \quad \text{for } M, N \in M_c^2 .
M(k)_T \to M_T \quad \text{in } L^2(\Omega, F, P) ,
\langle M(k) \rangle_T \to \langle M \rangle_T \quad \text{in } L^1(\Omega, F, P) .
for any countable dense subset D of [0, T], so that \sup_{0\le t\le T} |M_t| is a random variable. For each n \in \mathbb{N}, we may apply the Kolmogorov inequality for discrete-time martingales (see the Appendix) to \{ M_{Tk/2^n};\ F_{Tk/2^n} \}_{k\ge 0} to obtain
P\left\{ \sup_{0\le k\le 2^n} |M_{Tk/2^n}| \ge \lambda \right\} \le \frac{1}{\lambda^2}\, E M_T^2 .
Letting n \to \infty we therefore have
P\left\{ \sup_{0\le t\le T} |M_t| \ge \lambda \right\} = \lim_{n\to\infty} P\left\{ \sup_{0\le k\le 2^n} |M_{Tk/2^n}| \ge \lambda \right\} \le \frac{1}{\lambda^2}\, E M_T^2 .
By Kolmogorov's inequality
P\left\{ \sup_{0\le t\le T} |M(k)_t - M(l)_t| \ge \varepsilon \right\} \le \frac{1}{\varepsilon^2}\, E|M(k)_T - M(l)_T|^2 ,
so that, for any fixed T > 0, M(k) converges uniformly on [0, T] in probability, and therefore there exists a stochastic process M \equiv (M_t)_{t\ge 0} such that
\sup_{0\le t\le T} |M(k)_t - M_t| \to 0 \quad \text{in prob.}
The space of all such adapted stochastic processes (F_t)_{t\ge 0} will be denoted by L^2. If F \in L^2 and F(n) \in L_0 are such that (2.7) holds, then \lim_{n\to\infty} I(F(n)) exists in M_c^2; it is denoted again by I(F) and called the Itô integral of F = (F_t)_{t\ge 0} along the Brownian motion (B_t)_{t\ge 0}.
The linearity of the Itô integral F \mapsto I(F) from L^2 into M_c^2, and the martingale property of both processes (I(F)_t)_{t\ge 0} and
I(F)_t^2 - \int_0^t F_s^2\, ds ,
are preserved for any F \in L^2. Thus \langle I(F) \rangle_t = \int_0^t F_s^2\, ds.
The stochastic integral I(F)_t for F \in L^2 is also denoted by \int_0^t F_s\, dB_s. We may also use F.B to denote the Itô integral (I(F)_t)_{t\ge 0}. Thus
I(F)_t = (F.B)_t = \int_0^t F_s\, dB_s
and
\langle I(F) \rangle_t = \langle F.B \rangle_t = \int_0^t F_s^2\, ds .
We next describe a class of stochastic processes F = (F_t)_{t\ge 0} in L^2. Let L denote the vector space of all adapted, left-continuous stochastic processes F = (F_t)_{t\ge 0} such that for all T > 0
E \int_0^T F_s^2\, ds < +\infty . \qquad (2.8)
F_t = \lim_{s\uparrow t} F_s .
Remark 2.3.5 We should point out that some kind of measurability of the random function (t, \omega) \mapsto F_t(\omega) is necessary in order to ensure that (2.8) makes sense. Note that (2.8) may be written as
\int_\Omega \int_0^T F_s(\omega)^2\, ds\, P(d\omega) < +\infty
F(t, \omega) \equiv F_t(\omega)
and let
F(n)_t = F_0\, 1_{\{0\}}(t) + \sum_{l=1}^{n_k} F_{t_{l-1}^n}\, 1_{(t_{l-1}^n, t_l^n]}(t) , \quad \text{for } t \ge 0 . \qquad (2.9)
Therefore L \subset L^2 .
F(n)_t \to F_t ,
where the limit takes place over all finite partitions D of the interval [0, t].
One thing you may keep in mind is that the class L^2 of integrands F = (F_t)_{t\ge 0} which we may integrate against Brownian motion is large enough to include many interesting stochastic processes. On the other hand, there is a serious restriction on these processes: we require our integrands to be adapted to the Brownian filtration, that is, for each t \ge 0, F_t is measurable with respect to F_t.
Let us give some examples.
If X = (X_t)_{t\ge 0} is a continuous stochastic process which is adapted to (F_t)_{t\ge 0}, if f is a Borel function, and if for every T > 0
E \int_0^T f(X_t)^2\, dt < \infty ,
then the stochastic process (f(X_t))_{t\ge 0} belongs to L^2. In particular, for any Borel measurable function f such that
E \int_0^T f(B_t)^2\, dt < \infty \qquad (2.10)
where
P_t(f^2)(0) = \frac{1}{(2\pi t)^{d/2}} \int_{\mathbb{R}^d} f(x)^2\, e^{-|x|^2/2t}\, dx = \frac{1}{(2\pi)^{d/2}} \int_{\mathbb{R}^d} f(\sqrt{t}\, x)^2\, e^{-|x|^2/2}\, dx .
and therefore
E \int_0^T F_t^2\, dt < \infty \quad \text{if } \alpha \le 0 .
In the case \alpha > 0,
E \int_0^T F_t^2\, dt < \infty \quad \text{iff } T < \frac{1}{4\alpha} .
If F \in L^2, then I(F) is called the Itô integral of F = (F_t)_{t\ge 0} against Brownian motion B = (B_t)_{t\ge 0}, and we will denote it by
\int_0^t F_s\, dB_s , \quad t \ge 0 .
then define
I^M(F)_t = \sum_{i=0}^{\infty} f_i \left( M_{t\wedge t_{i+1}} - M_{t\wedge t_i} \right) .
1. I^M(F) \in M_c^2 .
is a martingale.
and
E \int_0^T |F(n)_t - F_t|^2\, d\langle M \rangle_t \to 0 \quad \text{as } n \to \infty .
If F \in L^2(M), then we define
We use either F.M or \int_0^t F_s\, dM_s to denote I^M(F).
If M = (M_t)_{t\ge 0} is a continuous, square-integrable martingale, and F = (F_t)_{t\ge 0} belongs to L^2(M), then both the stochastic integral
\int_0^t F_s\, dM_s
and
\left( \int_0^t F_s\, dM_s \right)^2 - \int_0^t F_s^2\, d\langle M \rangle_s
are martingales.
If M and N are two continuous, square-integrable martingales, and if F \in L^2(M) and G \in L^2(N), then
\langle F.M,\; G.N \rangle_t = \int_0^t F_s G_s\, d\langle M, N \rangle_s .
F_T = \{ A \in F : A \cap \{T \le t\} \in F_t \text{ for all } t \ge 0 \}
which belongs to F_n.
T = \inf\{ t \ge 0 : X_t \in D \}
X_T\, 1_{\{T < +\infty\}} \in D .
T_b = \inf\{ t \ge 0 : X_t = b \}
and
\left\{ \sup_{t\in[0,N]} X_t \ge b \right\} = \{ T_b \le N \} .
It is obvious that
X_{t\wedge T} = X_t\, 1_{\{t\le T\}} + X_T\, 1_{\{t>T\}} .
If (X_t)_{t\ge 0} is adapted to the filtration (F_t)_{t\ge 0}, so are the process (X_{t\wedge T})_{t\ge 0} stopped at the stopping time T and X_t\, 1_{\{t\le T\}}.
T_n \uparrow +\infty \quad \text{as } n \to +\infty
\langle M \rangle_t = \langle M^{T_n} \rangle_t \quad \text{if } t \le T_n
is a local martingale.
Let F = (F_t)_{t\ge 0} be a left-continuous, adapted process such that for each T > 0
\int_0^T F_s^2\, d\langle M \rangle_s < \infty \quad \text{a.s.} \qquad (2.11)
and define
S_n = \inf\left\{ t \ge 0 : \int_0^t F_s^2\, d\langle M \rangle_s \ge n \right\} \wedge n .
X_t = M_t + V_t
where (M_t)_{t\ge 0} is a continuous local martingale, and (V_t)_{t\ge 0} is a stochastic process with finite variation on any finite interval.
where D runs over all finite partitions of [0, t] (for any fixed t), then
\int_0^t g(s)\, df(s)
where the first term on the right-hand side is the Itô integral with respect to the local martingale M, defined in the probability sense, which is again a local martingale, and the second term is the usual Lebesgue-Stieltjes integral, which is defined path-wise. Moreover
\int_0^t F_s\, dX_s = \lim_{m(D)\to 0} \sum_l F_{t_{l-1}} \left( X_{t_l} - X_{t_{l-1}} \right) \quad \text{in probability.}
a_n b_n = a_0 b_0 + \sum_{i=0}^{n-1} \left( a_{i+1} b_{i+1} - a_i b_i \right)
= a_0 b_0 + \sum_{i=0}^{n-1} b_{i+1} (a_{i+1} - a_i) + \sum_{i=0}^{n-1} a_i (b_{i+1} - b_i)
The story for local martingales will be different, basically due to the fact that, if (M_t)_{t\ge 0} is a continuous local martingale, then
\lim_{m(D)\to 0} \sum_l \left| M_{t_l} - M_{t_{l-1}} \right|^2 = \langle M \rangle_t \quad \text{in prob.,}
so that
\langle M \rangle_t = M_t^2 - M_0^2 - 2 \int_0^t M_s\, dM_s .
In particular
X_t^2 - X_0^2 = 2 \int_0^t X_s\, dX_s + \langle M \rangle_t .
f(X_t) - f(X_0) = \sum_{i=1}^{d} \int_0^t \frac{\partial f}{\partial x_i}(X_s)\, dX_s^i + \frac{1}{2} \sum_{i,j=1}^{d} \int_0^t \frac{\partial^2 f}{\partial x_i \partial x_j}(X_s)\, d\langle M^i, M^j \rangle_s . \qquad (2.13)
f(X_t) - f(X_0) = \int_0^t \nabla f(X_s) \cdot dX_s + \frac{1}{2} \sum_{i,j=1}^{d} \int_0^t \frac{\partial^2 f}{\partial x_i \partial x_j}(X_s)\, d\langle M^i, M^j \rangle_s
where \nabla f = \left( \frac{\partial f}{\partial x_1}, \ldots, \frac{\partial f}{\partial x_d} \right) is the gradient vector field of f. As a consequence, f(X_t) - f(X_0) is again a continuous semimartingale, with martingale part
\int_0^t \nabla f(X_s) \cdot dM_s = \sum_{i=1}^{d} \int_0^t \frac{\partial f}{\partial x_i}(X_s)\, dM_s^i
and
\langle f(X_\cdot) \rangle_t = \int_0^t \sum_{i,j=1}^{d} \frac{\partial f}{\partial x_i}(X_s)\, \frac{\partial f}{\partial x_j}(X_s)\, d\langle M^i, M^j \rangle_s .
2.6.1 Itô's formula for BM
If B = (B_t^1, \ldots, B_t^d)_{t\ge 0} is a Brownian motion in \mathbb{R}^d, then, for f \in C^2(\mathbb{R}^d, \mathbb{R}),
f(B_t) - f(B_0) = \int_0^t \nabla f(B_s) \cdot dB_s + \frac{1}{2} \int_0^t \Delta f(B_s)\, ds .
Let
M_t^f = f(B_t) - f(B_0) - \frac{1}{2} \int_0^t \Delta f(B_s)\, ds .
Then M^f is a local martingale for every f \in C^2(\mathbb{R}^d, \mathbb{R}) and
\langle M^f, M^g \rangle_t = \int_0^t \langle \nabla f, \nabla g \rangle (B_s)\, ds .
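A quick sanity check (a short worked case, taking d = 1 and f(x) = x^2 in the formula above):
B_t^2 - B_0^2 = 2\int_0^t B_s\, dB_s + \frac{1}{2}\int_0^t 2\, ds = 2\int_0^t B_s\, dB_s + t ,
\qquad\text{i.e.}\qquad
\int_0^t B_s\, dB_s = \frac{1}{2}\left( B_t^2 - t \right) ,
which recovers the formula obtained earlier from the left-endpoint Riemann sums.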
2.6.2 Proof of Itô's formula
Let us prove the Itô formula in the one-dimensional case. For simplicity let us just do it for a continuous, square-integrable martingale M = (M_t)_{t\ge 0}. In this case we need to show
f(M_t) - f(M_0) = \int_0^t f'(M_s)\, dM_s + \frac{1}{2} \int_0^t f''(M_s)\, d\langle M \rangle_s . \qquad (2.14)
M_t^n - M_0^n = n \int_0^t M_s^{n-1}\, dM_s + \frac{n(n-1)}{2} \int_0^t M_s^{n-2}\, d\langle M \rangle_s ,
which implies (2.14) for the power function x^{n+1}. Therefore Itô's formula is true for any polynomial, and hence for any C^2 function f, by Taylor expansion.
2.7.1 Lévy's characterization of Brownian motion
Our first application is Lévy's martingale characterization of Brownian motion. Let (\Omega, F, F_t, P) be a filtered probability space satisfying the usual conditions.
2. For any i and j, the process M_t^i M_t^j - \delta_{ij} t is a martingale, that is, \langle M^i, M^j \rangle_t = \delta_{ij} t.
and obtain
Z_t = Z_0 + \int_0^t Z_s\, d\left( \sqrt{-1} \sum_{i=1}^{d} \theta_i M_s^i + \frac{|\theta|^2}{2}\, s \right) + \frac{1}{2} \int_0^t Z_s\, d\left\langle \sqrt{-1} \sum_{i=1}^{d} \theta_i M^i \right\rangle_s
= 1 + \sqrt{-1} \sum_{i=1}^{d} \theta_i \int_0^t Z_s\, dM_s^i + \frac{|\theta|^2}{2} \int_0^t Z_s\, ds
- \frac{1}{2} \sum_{i,j=1}^{d} \theta_i \theta_j \int_0^t Z_s\, d\langle M^i, M^j \rangle_s
= 1 + \sqrt{-1} \sum_{i=1}^{d} \theta_i \int_0^t Z_s\, dM_s^i .
Moreover, since |Z_s| = e^{|\theta|^2 s/2}, for any T > 0
E \int_0^T |Z_s|^2\, ds = \int_0^T e^{|\theta|^2 s}\, ds < +\infty
where we use the Itô integral. To find the solution to (2.15) we may try
Y_t = \exp(X_t + V_t) ,
where (V_t)_{t\ge 0}, to be determined later, is a correction term (which has finite variation) due to the quadratic variation of X. Applying Itô's formula we obtain
Y_t = 1 + \int_0^t Y_s\, d(X_s + V_s) + \frac{1}{2} \int_0^t Y_s\, d\langle M \rangle_s ,
and therefore, in order to match the equation (2.15), we must choose V_t = -\frac{1}{2}\langle M \rangle_t .
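Written out (a brief summary, assuming X_0 = 0 so that Y_0 = 1), the resulting solution is the stochastic exponential
Y_t = \mathcal{E}(X)_t = \exp\left( X_t - \tfrac{1}{2}\langle M \rangle_t \right) ;
for example, for X_t = \theta B_t one gets \mathcal{E}(\theta B)_t = \exp\left( \theta B_t - \tfrac{1}{2}\theta^2 t \right), the exponential martingale already encountered in Chapter 1.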
Uniform integrability
To prove Novikov's theorem, we need the concept of uniform integrability of a family of integrable random variables, and a theorem about local martingales. The concept of uniform integrability was introduced to handle the convergence of random variables in L^1(\Omega, F, P); in spirit it is very close to what you have learned in analysis: uniform convergence, uniform continuity.
If X is integrable, E|X| < +\infty, then
\lim_{N\to\infty} \int_{\{|X|\ge N\}} |X|\, dP = 0 ;
that is, E\{ 1_{\{|X|\ge N\}} |X| \} tends to zero uniformly on \mathcal{A} as N \to \infty,
for all X \in \mathcal{A}.
According to the definition, we have the following.
as N \to \infty.
\mathcal{A} = \left\{ X_{\mathcal{G}} \equiv E(X \mid \mathcal{G}) : \mathcal{G} \text{ a sub-}\sigma\text{-algebra of } F \right\}
as long as P(A) \le \delta .
Sufficiency. Let K = \sup_{X\in\mathcal{A}} E|X|. By the Markov inequality
P(|X| \ge N) \le \frac{K}{N}
for any N > 0. For any \varepsilon > 0, there is a \delta > 0 such that the inequality in 2 holds. Choose N = K/\delta. Then P(|X| \ge N) \le \delta, so that
\int_{\{|X|\ge N\}} |X|\, dP \le \varepsilon
for any X \in \mathcal{A}.
Proof. Necessity. For any \varepsilon > 0 there is a natural number m such that
\int |X_n - X|\, dP \le \frac{\varepsilon}{2} \quad \text{for all } n > m ,
so that
\sup_n \int_A |X_n|\, dP \le \int_A |X|\, dP + \sup_{k\le m} \int_A |X_k|\, dP + \frac{\varepsilon}{2} .
In particular
\sup_n E|X_n| \le E|X| + \sup_{k\le m} E|X_k| + \frac{\varepsilon}{2} ,
i.e. \{X_n : n \ge 1\} is bounded in L^1(\Omega, F, P). Moreover, since X, X_1, \ldots, X_m belong to L^1, there is \delta > 0 such that, if P(A) \le \delta, then
\int_A |X|\, dP + \sum_{k=1}^{m} \int_A |X_k|\, dP \le \frac{\varepsilon}{2}
and therefore
\sup_n \int_A |X_n|\, dP \le \varepsilon
as long as P(A) \le \delta .
Sufficiency. By Fatou's lemma
\int |X|\, dP \le \sup_n \int |X_n|\, dP < +\infty
P\left( |X_n - X| \ge \varepsilon \right) \le \varepsilon \quad \text{for all } n \ge N .
Hence
\lim_{n\to\infty} E|X_n - X| = 0 .
and \lim_{n\to\infty} X_{T_n\wedge t} = X_t. Since \{ X_{T_n\wedge t} \}_{n\in\mathbb{Z}_+} (for any fixed t \le T) is uniformly integrable, it follows that (Theorem 2.7.12)
\lim_{n\to\infty} X_{T_n\wedge t} = X_t \quad \text{in } L^1(\Omega, F, P) .
Therefore
E\{ X_t \mid F_s \} = E\left\{ \lim_{n\to\infty} X_{T_n\wedge t} \,\middle|\, F_s \right\} = \lim_{n\to\infty} E\{ X_{T_n\wedge t} \mid F_s \} = \lim_{n\to\infty} X_{T_n\wedge s} = X_s .
Novikov's theorem
Now we are in a position to prove Novikov's theorem.
is a martingale up to time T.
1 = E\{ \mathcal{E}(\lambda M)_t \}
\le \left( E\, \mathcal{E}(M)_t \right)^{\lambda} \left( E \exp\left( \tfrac{1}{2} \langle M \rangle_T \right) \right)^{1-\lambda}
for every \lambda \in (0, 1). Letting \lambda \uparrow 1 we thus obtain
E\left( \mathcal{E}(M)_t \right) \ge 1 .
then
X_t = \exp\left( \int_0^t F_s\, dB_s - \frac{1}{2} \int_0^t F_s^2\, ds \right) \qquad (2.23)
\lambda M_t - \frac{\lambda^2}{2}\, \langle M \rangle_t \ge \lambda M_t - \frac{\lambda^2}{2}\, \langle M \rangle_T \ge \lambda M_t - \frac{\lambda^2}{2}\, a(T) ,
so that
\mathcal{E}(\lambda M)_t \ge e^{\lambda M_t - \frac{\lambda^2}{2} a(T)} \quad \text{for } \lambda > 0 .
that is,
By definition
so that
\langle M, Z \rangle_t = \left\langle \int_0^\cdot dM_s,\; \int_0^\cdot Z_s\, dN_s \right\rangle_t = \int_0^t Z_s\, d\langle N, M \rangle_s .
Therefore
\int_0^t \frac{1}{Z_s}\, d\langle M, Z \rangle_s = \langle N, M \rangle_t .
The proof of this theorem relies on several lemmata. Let T > 0 be any fixed time.
Lemma 2.8.3 With any h \in L^2([0, T]) (functions on the interval [0, T] which are square-integrable: \int_0^T h(s)^2\, ds < +\infty) we associate an exponential martingale up to time T:
M(h)_t = \exp\left( \int_0^t h(s)\, dB_s - \frac{1}{2} \int_0^t h(s)^2\, ds \right) ; \quad t \in [0, T] . \qquad (2.30)
Then the vector space spanned by all these M(h)_T is dense in L^2(\Omega, F_T, P).
and therefore
\int_\Omega H \exp\left\{ \sum_i c_i (B_{t_{i+1}} - B_{t_i}) - \frac{1}{2} \sum_i c_i^2 (t_{i+1} - t_i) \right\} dP = 0 ,
so that
\int_\Omega H \exp\left\{ \sum_i c_i (B_{t_{i+1}} - B_{t_i}) \right\} dP = 0 .
Since the c_i are arbitrary numbers, it thus follows that
\int_\Omega H \exp\left\{ \sum_i c_i B_{t_i} \right\} dP = 0
for any c_i and t_i \in [0, T]. Since the left-hand side is analytic in the c_i, the last equality is true for any complex numbers c_i. We may conclude that, for any \varphi \in C_0^\infty(\mathbb{R}^n),
\int_\Omega H\, \varphi(B_{t_1}, \ldots, B_{t_n})\, dP = 0 . \qquad (2.31)
Indeed (2.31) can be shown via the Fourier integral theorem: if \varphi \in C_0^\infty(\mathbb{R}^n) then
\varphi(x) = \frac{1}{(2\pi)^{n/2}} \int_{\mathbb{R}^n} \hat{\varphi}(z)\, e^{i\langle z, x\rangle}\, dz ,
where
\hat{\varphi}(z) = \frac{1}{(2\pi)^{n/2}} \int_{\mathbb{R}^n} \varphi(x)\, e^{-i\langle z, x\rangle}\, dx
is the Fourier transform of \varphi. We thus may rewrite the left-hand side of (2.31) via the Fourier integral formula:
\int_\Omega H\, \varphi(B_{t_1}, \ldots, B_{t_n})\, dP = \frac{1}{(2\pi)^{n/2}} \int_\Omega \int_{\mathbb{R}^n} H\, \hat{\varphi}(z) \exp\left( i \sum_j z_j B_{t_j} \right) dz\, dP
= \frac{1}{(2\pi)^{n/2}} \int_{\mathbb{R}^n} \hat{\varphi}(z) \left\{ \int_\Omega H \exp\left( i \sum_i z_i B_{t_i} \right) dP \right\} dz
= 0 .
By Lemma 2.8.2, the collection of all functions of the form \varphi(B_{t_1}, \ldots, B_{t_n}) is dense in L^2(\Omega, F_T, P), so that
\int_\Omega H G\, dP = 0 \quad \text{for any } G \in L^2(\Omega, F_T, P) .
In particular, \int_\Omega H^2\, dP = 0, so that H = 0 almost surely.
Chapter 3

Stochastic differential equations
3.1 Introduction
Stochastic differential equations (SDEs) are ordinary differential equations perturbed by noise. In this course, we will only consider noise which can be modelled by Brownian motion.
The stochastic differential equations we will consider thus have the following form:
dX_t^j = \sum_{i=1}^{n} f_i^j(t, X_t)\, dB_t^i + f_0^j(t, X_t)\, dt , \quad j = 1, \ldots, N , \qquad (3.1)
f_i^j : [0, +\infty) \times \mathbb{R}^N \to \mathbb{R}^N
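As a practical aside, here is a minimal Python sketch of the Euler-Maruyama scheme, a standard numerical method for approximating such equations (the function name, parameters, and the geometric Brownian motion example are illustrative choices, not part of the theory developed below).

import numpy as np

# Euler-Maruyama scheme for a one-dimensional SDE
#   dX_t = f1(t, X_t) dB_t + f0(t, X_t) dt.
def euler_maruyama(f1, f0, x0, t_max, n_steps, rng=np.random.default_rng()):
    dt = t_max / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        t = k * dt
        dB = np.sqrt(dt) * rng.standard_normal()        # Brownian increment
        x[k + 1] = x[k] + f0(t, x[k]) * dt + f1(t, x[k]) * dB
    return x

# Example: geometric Brownian motion dX_t = 0.2 X_t dB_t + 0.05 X_t dt.
path = euler_maruyama(lambda t, x: 0.2 * x, lambda t, x: 0.05 * x, 1.0, 1.0, 1000)
print(path[-1])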
The following is a simple example of an SDE which has no strong solution, but possesses weak solutions, and for which uniqueness in law holds.
so that
X_t = e^{\alpha t} X_0 + \sigma \int_0^t e^{\alpha(t-s)}\, dB_s .
In particular, if n = N = 1 and X_0 = x, then X_t has a normal distribution with mean e^{\alpha t} x and variance
E\left( X_t - e^{\alpha t} x \right)^2 = E\left( \sigma \int_0^t e^{\alpha(t-s)}\, dB_s \right)^2
= \sigma^2 e^{2\alpha t}\, E\left( \int_0^t e^{-\alpha s}\, dB_s \right)^2
= \sigma^2 \int_0^t e^{2\alpha(t-s)}\, ds
= \frac{\sigma^2}{2\alpha}\left( e^{2\alpha t} - 1 \right) .
If B = (B_t^1, \ldots, B_t^n)_{t\ge 0} is a Brownian motion in \mathbb{R}^n, then the solution X_t of the SDE
dX_t = dB_t - A X_t\, dt
is called the Ornstein-Uhlenbeck process, where A \ge 0 is an n \times n matrix called the drift matrix. Hence we have
X_t = e^{-At} X_0 + \int_0^t e^{-(t-s)A}\, dB_s .
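One way to check this formula (a short verification, sketched here): apply Itô's product rule to e^{At} X_t,
d\left( e^{At} X_t \right) = A e^{At} X_t\, dt + e^{At}\, dX_t = A e^{At} X_t\, dt + e^{At}\left( dB_t - A X_t\, dt \right) = e^{At}\, dB_t ,
and integrate from 0 to t to recover X_t = e^{-At} X_0 + \int_0^t e^{-(t-s)A}\, dB_s.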
Hence
S_t = S_0 \exp\left( \int_0^t \sigma_s\, dB_s + \int_0^t \left( \mu_s - \frac{1}{2}\sigma_s^2 \right) ds \right) .
In the case where \mu and \sigma are constants,
S_t = S_0 \exp\left( \sigma B_t + \left( \mu - \frac{1}{2}\sigma^2 \right) t \right) ,
which is called geometric Brownian motion.
B_t \equiv W_t - W_0 - \langle W, N \rangle_t
and therefore
W_t - W_0 - \int_0^t b(s, W_s)\, ds = B_t
is a standard Brownian motion on (\Omega, F, Q). Thus
W_t = W_0 + B_t + \int_0^t b(s, W_s)\, ds \qquad (3.6)
Theorem 3.2.1 (Cameron-Martin formula) Let b(t, x) = (b^1(t, x), \ldots, b^n(t, x)) be bounded, Borel measurable functions on [0, +\infty) \times \mathbb{R}^n. Let W_t = (W_t^1, \ldots, W_t^n) be a standard Brownian motion on a filtered probability space (\Omega, F, F_t, P), and let F_\infty = \sigma\{ F_t, t \ge 0 \}. Define the probability measure Q on (\Omega, F_\infty) by
\left. \frac{dQ}{dP} \right|_{F_t} = e^{\sum_{k=1}^{n} \int_0^t b^k(s, W_s)\, dB_s^k - \frac{1}{2} \sum_{k=1}^{n} \int_0^t |b^k(s, W_s)|^2\, ds} \quad \text{for } t \ge 0 .
Then (W_t)_{t\ge 0} under the probability measure Q is a solution to the SDE
we may show that (X_t)_{t\ge 0} under the probability measure P is a Brownian motion. Therefore the solution to SDE (3.7) is unique in law: all solutions have the same distribution.
1. Let
M_t = \int_0^t f(s, Z_s)\, dB_s - \int_0^t f(s, Z'_s)\, dB_s , \quad t \ge 0 .
Then
E \sup_{s\le t} |M_s|^2 \le 4 C^2 \int_0^t E\left| Z_s - Z'_s \right|^2 ds
for all t \ge 0.
2. If
N_t = \int_0^t f(s, Z_s)\, ds - \int_0^t f(s, Z'_s)\, ds , \quad t \ge 0 ,
then
E \sup_{s\le t} |N_s|^2 \le C^2 t \int_0^t E\left| Z_s - Z'_s \right|^2 ds , \quad t \ge 0 .
Theorem 3.2.4 Consider SDE (3.8). Suppose that the f_i^j satisfy the Lipschitz condition
\left| f_i^j(t, x) - f_i^j(t, y) \right| \le C|x - y| \qquad (3.10)
Proof. Let us only prove the one-dimensional case; the proof of the multi-dimensional case is similar. We are thus going to construct a strong solution to the SDE
dX_t = f_1(t, X_t)\, dB_t + f_0(t, X_t)\, dt .
We will apply Picard's iteration to the corresponding integral equation
X_t = \xi + \int_0^t f_1(s, X_s)\, dB_s + \int_0^t f_0(s, X_s)\, ds :
Y_0(t) = \xi ;
Y_{n+1}(t) = \xi + \int_0^t f_1(s, Y_n(s))\, dB_s + \int_0^t f_0(s, Y_n(s))\, ds
for n = 0, 1, 2, \ldots. We will show that, for every T > 0, the sequence \{Y_n(t)\} converges to a solution Y(t) uniformly on [0, T] almost surely. Introduce the notations
D(n)_t = Y_n(t) - Y_{n-1}(t) , \quad n = 1, 2, \ldots
and
Then
D(1)_t = \int_0^t f_1(s, \xi)\, dB_s + \int_0^t f_0(s, \xi)\, ds
(x + y)^2 \le 2x^2 + 2y^2
d(n+1)_t \le 2 C^2 \int_0^t (4 + s)\, d(n)_s\, ds \le 2 C^2 (4 + T) \int_0^t d(n)_s\, ds .
We therefore have
d(n+1)_t \le \frac{2^n C^{2n} (4 + T)^n}{n!}\, d(1)_T\, t^n .
Moreover
d(1)_t = E \sup_{0\le s\le t} |Y_1(s) - Y_0(s)|^2
\le 2 E\left\{ \sup_{0\le s\le t} \left| \int_0^s f_1(\tau, \xi)\, dB_\tau \right|^2 \right\} + 2 E\left\{ \sup_{0\le s\le t} \left| \int_0^s f_0(\tau, \xi)\, d\tau \right|^2 \right\}
\le 8 E \int_0^t f_1(\tau, \xi)^2\, d\tau + 2 t\, E \int_0^t f_0(\tau, \xi)^2\, d\tau
\le \left( 16 t + 4 t^2 \right)\left( 1 + E\xi^2 \right) .
Therefore,
E\left\{ \sup_{0\le t\le T} |Y_{n+1}(t) - Y_n(t)|^2 \right\} \le C_1 \frac{(C_2 T)^n}{n!} .
By the Markov inequality
P\left\{ \sup_{0\le t\le T} |Y_{n+1}(t) - Y_n(t)| \ge \frac{1}{2^n} \right\} \le C_1 \frac{(4 C_2 T)^n}{n!} ,
and
Z_t = \xi + \int_0^t f_1(s, Z_s)\, dB_s + \int_0^t f_0(s, Z_s)\, ds .
Again, by Lemma 3.2.3,
E\left| Y_t - Z_t \right|^2 \le 2 C^2 (4 + T) \int_0^t E|Y_s - Z_s|^2\, ds ,
and therefore, by Gronwall's inequality,
E\left| Y_t - Z_t \right|^2 = 0 .
\lim_{\delta\to 0} \sup_{|x-y|<\delta} E\left\{ \sup_{0\le t\le T} |X^x(t) - X^y(t)|^2 \right\} = 0 . \qquad (3.12)
X^x(t) = x + \int_0^t f_1(s, X^x(s))\, dB_s + \int_0^t f_0(s, X^x(s))\, ds
and
X^y(t) = y + \int_0^t f_1(s, X^y(s))\, dB_s + \int_0^t f_0(s, X^y(s))\, ds .
Setting
\varphi(t) = E \sup_{0\le s\le t} |X^x(s) - X^y(s)|^2 ,
we then have
\varphi(T) \le 3|x-y|^2 + 3 C^2 (4 + T) \int_0^T \varphi(t)\, dt .
Let us introduce
L = \frac{1}{2}\, \sigma(x)^2 \frac{d^2}{dx^2} + b(x) \frac{d}{dx} , \qquad (3.14)
which is an elliptic differential operator of second order. Then the previous formula may be written as
f(X_t) - f(X_0) = \int_0^t \sigma(X_s) f'(X_s)\, dB_s + \int_0^t (Lf)(X_s)\, ds .
If we set
M_t^f = f(X_t) - f(X_0) - \int_0^t (Lf)(X_s)\, ds ,
then
M_t^f = \int_0^t \sigma(X_s) f'(X_s)\, dB_s
is a martingale on (\Omega, F, F_t, P), and
\langle M^f, M^g \rangle_t = \int_0^t \left( \sigma^2 f' g' \right)(X_s)\, ds .
For example, if \sigma = 1 and b = 0 (in this case L = \frac{1}{2}\frac{d^2}{dx^2} = \frac{1}{2}\Delta), then (B_t)_{t\ge 0} itself is a strong solution to
dX_t = dB_t ,
so that
M_t^f = f(B_t) - f(B_0) - \frac{1}{2} \int_0^t (\Delta f)(B_s)\, ds
is a martingale under P. On the other hand, Lévy's martingale characterization shows that the previous property, namely that
f(B_t) - f(B_0) - \frac{1}{2} \int_0^t (\Delta f)(B_s)\, ds
is a martingale, completely characterizes Brownian motion. Therefore we may believe that the martingale property of all the M^f should completely determine the distribution of a solution (X_t)_{t\ge 0} to SDE (3.13), and hence that of a weak solution of (3.13). Thus we give the following definition.
Definition 3.3.2 Let L be a linear operator on C^\infty(\mathbb{R}). Let (X_t)_{t\ge 0} be a stochastic process on a filtered space (\Omega, F, F_t, P). Then we say that (X_t)_{t\ge 0}, together with the probability P, is a solution to the L-martingale problem if for every f \in C_b^\infty(\mathbb{R})
M_t^f \equiv f(X_t) - f(X_0) - \int_0^t Lf(X_s)\, ds
we thus have
\langle M^f, M^g \rangle_t = \int_0^t \left\{ L(fg) - f\,(Lg) - g\,(Lf) \right\}(X_s)\, ds .
L = \frac{1}{2}\, \sigma(x)^2 \frac{d^2}{dx^2} + b(x) \frac{d}{dx} .
If (X_t)_{t\ge 0} on (\Omega, F, P) is a continuous process solving the L-martingale problem: for any f \in C_b^2(\mathbb{R})
M_t^f = f(X_t) - f(X_0) - \int_0^t Lf(X_s)\, ds
Let us only outline the proof; a detailed proof will be given in the next section, which handles the multi-dimensional case. To show that (X_t)_{t\ge 0} on (\Omega, F, P) is a weak solution, we need to construct a Brownian motion B = (B_t)_{t\ge 0} such that
X_t = X_0 + \int_0^t \sigma(X_s)\, dB_s + \int_0^t b(X_s)\, ds . \qquad (3.16)
The key point of the proof is to compute \langle X \rangle_t, and the result is
\langle M^f, M^g \rangle_t = \int_0^t \left( L(fg) - f\,Lg - g\,Lf \right)(X_s)\, ds = \int_0^t \left( \sigma^2\, \frac{\partial f}{\partial x}\, \frac{\partial g}{\partial x} \right)(X_s)\, ds ,
so that
B_t = \int_0^t \frac{1}{\sigma(X_s)}\, dM_s
is a Brownian motion (by Lévy's martingale characterization of Brownian motion). It is then obvious that (X_t, B_t) satisfies the stochastic integral equation (3.16), so that (X_t)_{t\ge 0} is a weak solution to (3.15).
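The bracket identity used above can be checked directly (a one-line computation, for the operator L just defined):
L(fg) - f\,Lg - g\,Lf = \tfrac{1}{2}\sigma^2\left( f''g + 2f'g' + fg'' \right) + b\left( f'g + fg' \right) - f\left( \tfrac{1}{2}\sigma^2 g'' + b g' \right) - g\left( \tfrac{1}{2}\sigma^2 f'' + b f' \right) = \sigma^2 f' g' .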
Chapter 4

Appendix: martingales in discrete-time
\{ \omega : T(\omega) \le n \} = \bigcup_{k=0}^{n} \{ \omega : T(\omega) = k \} .
T(\omega) = \inf\{ n \ge 0 : X_n(\omega) \in B \}
(with the convention that \inf \emptyset = +\infty) is a stopping time with respect to the filtration \{F_n\}. T is called a hitting time.
To see why such a hitting time is really a stopping time, we observe that
\{ T = n \} = \bigcap_{k=0}^{n-1} \{ X_k \in B^c \} \cap \{ X_n \in B \} .
Proof. We prove this theorem for the case that \{X_n\} is a supermartingale. Let T \le n (as T is bounded by our assumption). Then
E|X_T| = \sum_{j=0}^{n} E\left( |X_T| : T = j \right) = \sum_{j=0}^{n} E\left( |X_j| : T = j \right) \le \sum_{j=0}^{n} E|X_j| ,
and
R_1 - S \le 1, \quad R_{j+1} - R_j \le 1, \quad \text{for } 1 \le j \le n-1 .
Let A \in F_S. Then A \in F_{R_j} (as S \le R_j). Therefore, applying the first case to the R_j, we have
\int_A X_S\, dP \ge \int_A X_{R_1}\, dP \ge \cdots \ge \int_A X_T\, dP ,
so that
E(1_A X_S) \ge E(1_A X_T) \quad \text{for any } A \in F_S .
Since X_S \in F_S we may thus conclude that
X_S \ge E(X_T \mid F_S) .
\le E X_0 + 2 E X_n^- \le 3 \sup_n E(|X_n|) ,
and the second inequality thus follows from Fatou's lemma.
and
P\left\{ \sup_{k\le n} |X_k| \ge \lambda \right\} \le \frac{E X_0 + 2 E X_n^-}{\lambda} .
X_R \ge \lambda, \quad \text{on } \{R < \infty\},
so that
\left\{ \sup_{k\le n} X_k \ge \lambda \right\} \subset \{ X_T \ge \lambda \},
\left\{ \sup_{k\le n} X_k < \lambda \right\} \subset \{ T = n \}.
E X_0 \ge E X_T
= \int_{\{\sup_{k\le n} X_k \ge \lambda\}} X_T\, dP + \int_{\{\sup_{k\le n} X_k < \lambda\}} X_T\, dP
\ge \lambda\, P\left\{ \sup_{k\le n} X_k \ge \lambda \right\} + \int_{\{\sup_{k\le n} X_k < \lambda\}} X_n\, dP
= \lambda\, P\left\{ \sup_{k\le n} X_k \ge \lambda \right\} + E\left\{ X_n ;\ \sup_{k\le n} X_k < \lambda \right\} .
R = \inf\{ k \ge 0 : Y_k \ge \lambda \}, \quad T = R \wedge n .
\left\{ \sup_{k\le n} Y_k \ge \lambda \right\} \subset \{ Y_T \ge \lambda \} ;
\left\{ \sup_{k\le n} Y_k < \lambda \right\} \subset \{ T = n \} .
E Y_n \ge E Y_T
= \int_{\{\sup_{k\le n} Y_k \ge \lambda\}} Y_T\, dP + \int_{\{\sup_{k\le n} Y_k < \lambda\}} Y_T\, dP
\ge \lambda\, P\left\{ \sup_{k\le n} Y_k \ge \lambda \right\} + \int_{\{\sup_{k\le n} Y_k < \lambda\}} Y_n\, dP
= \lambda\, P\left\{ \sup_{k\le n} (-X_k) \ge \lambda \right\} + \int_{\{\sup_{k\le n} Y_k < \lambda\}} Y_n\, dP .
Therefore
\lambda\, P\left\{ \sup_{k\le n} (-X_k) \ge \lambda \right\} = \lambda\, P\left\{ \inf_{k\le n} X_k \le -\lambda \right\}
\le E Y_n - \int_{\{\sup_{k\le n} Y_k < \lambda\}} Y_n\, dP
= \int_{\{\sup_{k\le n} Y_k \ge \lambda\}} Y_n\, dP
= \int_{\{\inf_{k\le n} X_k \le -\lambda\}} (-X_n)\, dP .
and therefore
\lambda^2\, P\left\{ \sup_{k\le n} X_k^2 \ge \lambda^2 \right\} \le \int_{\{\sup_{k\le n} X_k^2 \ge \lambda^2\}} X_n^2\, dP \le \int_\Omega X_n^2\, dP = E X_n^2 .
P\left( X_n^* \ge \lambda \right) \le \frac{1}{\lambda}\, E\{ X_n ;\ X_n^* \ge \lambda \}
we thus obtain
E\{ \Phi(X_n^*) \} \le \int_0^\infty \frac{1}{\lambda}\, E\{ X_n ;\ X_n^* \ge \lambda \}\, d\Phi(\lambda)
= \int_0^\infty \frac{1}{\lambda} \int_{\{X_n^* \ge \lambda\}} X_n\, dP\, d\Phi(\lambda)
= E\left\{ X_n \int_0^{X_n^*} \frac{1}{\lambda}\, d\Phi(\lambda) \right\} , \qquad (4.1)
where X_n^* = \max_{k\le n} X_k.
Exercise 4.2.6 1. Prove that \log x \le x/e for all x > 0, and hence prove that
a \log^+ b \le a \log^+ a + \frac{b}{e} .
Theorem 4.2.7 (Doob's inequality) Let (X_n) be a non-negative sub-martingale. Then
E\left( \max_{k\le n} X_k \right) \le \frac{e}{e-1}\left( 1 + \max_{k\le n} E\left( X_k \log^+ X_k \right) \right) .
Proof. We may use the same argument as in the proof of the previous theorem, but with the choice \Phi(\lambda) = (\lambda - 1)^+. We thus obtain (by (4.1))
E\{ \Phi(X_n^*) \} \le E\left\{ X_n \int_0^{X_n^*} \frac{1}{\lambda}\, d\Phi(\lambda) \right\}
= E\left\{ X_n \int_1^{X_n^*} \frac{1}{\lambda}\, d\lambda\; 1_{\{X_n^* \ge 1\}} \right\}
= E\left\{ X_n \log^+ X_n^* \right\} ,
\le E\left\{ X_n \log^+ X_n \right\} + \frac{1}{e}\, E X_n^* ,
so that
E X_n^* \le \frac{1}{1 - 1/e}\left( 1 + E\left\{ X_n \log^+ X_n \right\} \right) .
upcrosses the interval [a, b] j times. Denote by U_{ab}(X; n) the number of upcrossings of [a, b] by \{X_k\} up to time n. Then we have
\{ U_{ab}(X; n) = j \} = \{ T_{2j-1} \le n < T_{2j+1} \} \in F_n .
Note that, by definition,
X_{T_{2j}} \le a, \quad \text{on } \{ T_{2j} < \infty \} ;
X_{T_{2j+1}} \ge b, \quad \text{on } \{ T_{2j+1} < \infty \} .
and
E\, U_{ab}(X; n) \le \frac{1}{b-a}\, E(X_n - a)^- .
2. Similarly, if X = \{X_n\} is a submartingale, then
P\left\{ U_{ab}(X; n) \ge k \right\} \le \frac{1}{b-a}\, E\left\{ (X_n - a)^+ : U_{ab}(X; n) = k \right\}
and
E\, U_{ab}(X; n) \le \frac{1}{b-a}\, E(X_n - a)^+ .
However,
\{ U_{ab}(X; n) \ge k \} \subset \{ T_{2k-1} \le n \} ,
so that
Thus we get the first inequality. By adding up over all k \ge 0 we get the second inequality.
so that
P\left( W_{(a,b)} \right) = 0 .
Hence P(W) = 0. However, if \omega \notin W, then \lim_{n\to\infty} X_n(\omega) exists; we denote it by X_\infty(\omega), and on W we let X_\infty(\omega) = 0. Then we have X_n \to X_\infty almost surely. Moreover, by Fatou's lemma,
i.e. X_\infty \in L^1(\Omega, F, P).
If in addition \{X_n\} is non-negative, then
E(X_\infty \mid F_n) \le X_n .