Ulam Quarterly | Volume 1, Number 1, 1992
A Vector Measure Approach to the Optional Stochastic Integral
James K. Brooks
Department of Mathematics, University of Florida, Gainesville, FL 32611

and

David Neal
Department of Mathematics, Western Kentucky University, Bowling Green, KY 42101

Introduction
In this paper, we shall present a method of defining the stochastic integral of a real-valued, optional process $H$ with respect to certain Banach-valued processes $X$. We shall do so by defining an $L^1$-valued measure $J_X$ on the optional $\sigma$-algebra $\mathcal{O}$. This measure will be an extension of a measure $I_X$ defined on the algebra of predictable rectangles (cf. [3], [9]). Many constructions of stochastic integrals are limited by their reliance on the predictable $\sigma$-algebra; consequently, only predictable processes may be integrated. By defining a measure directly on $\mathcal{O}$, we overcome this problem and are able to integrate optional processes. Other extensions of stochastic integrals to optional processes have relied on the geometry of Hilbert space and have been restricted to processes $X$ which reduce to either square integrable martingales or martingales of integrable variation. We again overcome these restrictions and require only that $X$ be a separably-valued, adapted, cadlag process with a certain uniform integrability condition satisfied by both types of martingales above. For convenience, we also assume $X_\infty = \lim_{t\to\infty} X_t$ exists a.s.

In a previous paper [4], we constructed an optional stochastic integral for Hilbert- and/or real-valued processes $H$ and $X$. For this construction, we sought a natural definition of the integral $H \cdot X$, for bounded, optional $H$ and square integrable $X$, in terms of a predictable stochastic integral. To this end, we let $H'$ be a bounded, predictable process such that $H - H'$ is thin and $H'_0 = H_0$. Then $H \cdot X = H' \cdot X + Z$, where $Z$ is a certain limit in $\mathcal{M}^2$ of square integrable martingales. If $H = 1_{[S,T[}$, where $S, T$ are stopping times with $S \le T$, then we may take $H' = 1_{]S,T]} + 1_{(S=0)} 1_{[0]}$.

For a stopping time $T$, let ${}^T\!A$ denote the process given by $({}^T\!A)_r = \Delta X_T \, 1_{(T>0)} 1_{(T \le r)}$. We denote by $({}^T\!A)^P$ the predictable compensator of ${}^T\!A$, which exists by [7]. Let $\mathcal{S}$ denote the family of stochastic intervals $[S,T[$ with $S \le T$. For $[S,T[ \, \in \mathcal{S}$, we define $J_X([S,T[)$ by
$$J_X([S,T[) = X_T 1_{(T>0)} - X_S 1_{(S>0)} - \big[({}^T\!A)_\infty - ({}^T\!A)^P_\infty\big] + \big[({}^S\!A)_\infty - ({}^S\!A)^P_\infty\big]. \tag{1}$$
We then obtain a finitely additive measure $J_X : \mathcal{T} \to L^1(\Omega, \mathcal{F}, P; G)$ given by $J_X(B) = \sum_{i=1}^n J_X([S_i,T_i[)$, where $B = [S_1,T_1[ \, \cup \cdots \cup \, [S_n,T_n[$ is a finite disjoint union of members of $\mathcal{S}$, and $\mathcal{T}$ is the ring of all such unions. To see that $J_X$ is finitely additive, we first note that if $[S,T[, [U,V[ \, \in \mathcal{S}$ and $[S,T[ \, \cap \, [U,V[ \, = \emptyset$, then
$$J_X([S,T[) = J_X([S \wedge U, T \wedge U[) + J_X([S \vee V, T \vee V[). \tag{2}$$
This is immediate since, by disjointness of the stochastic intervals,
$$X_T 1_{(T>0)} - X_S 1_{(S>0)} = X_{T \wedge U} 1_{(T \wedge U > 0)} - X_{S \wedge U} 1_{(S \wedge U > 0)} + X_{T \vee V} 1_{(T \vee V > 0)} - X_{S \vee V} 1_{(S \vee V > 0)}. \tag{3}$$
Likewise,
$${}^T\!A - {}^S\!A = {}^{T \wedge U}\!A - {}^{S \wedge U}\!A + {}^{T \vee V}\!A - {}^{S \vee V}\!A. \tag{4}$$
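In detail, write $({}^T\!A)^c = ({}^T\!A)_\infty - ({}^T\!A)^P_\infty$, a shorthand used only in the following display. Evaluating (4) at infinity, subtracting the corresponding identity for the predictable compensators, and adding (3) gives
$$J_X([S,T[) = \big[X_{T\wedge U}1_{(T\wedge U>0)} - X_{S\wedge U}1_{(S\wedge U>0)} - ({}^{T\wedge U}\!A)^c + ({}^{S\wedge U}\!A)^c\big] + \big[X_{T\vee V}1_{(T\vee V>0)} - X_{S\vee V}1_{(S\vee V>0)} - ({}^{T\vee V}\!A)^c + ({}^{S\vee V}\!A)^c\big],$$
and, by (1), the two bracketed terms are $J_X([S \wedge U, T \wedge U[)$ and $J_X([S \vee V, T \vee V[)$.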
Thus, after subtracting predictable compensators and adding (3), equation (2) follows. Next, if $B = [S_1,T_1[ \, \cup \cdots \cup \, [S_n,T_n[ \, \in \mathcal{T}$ and $C = [U,V[ \, \in \mathcal{S}$ with $B \cap C = \emptyset$, then $B \cup C \in \mathcal{T}$ and
$$B \cup C = [S_1 \wedge U, T_1 \wedge U[ \, \cup \cdots \cup \, [S_n \wedge U, T_n \wedge U[ \, \cup \, [U,V[ \, \cup \, [S_1 \vee V, T_1 \vee V[ \, \cup \cdots \cup \, [S_n \vee V, T_n \vee V[. \tag{5}$$
Hence, by (2),
$$J_X(B \cup C) = \sum_{i=1}^n J_X([S_i \wedge U, T_i \wedge U[) + J_X([U,V[) + \sum_{i=1}^n J_X([S_i \vee V, T_i \vee V[) = \sum_{i=1}^n J_X([S_i,T_i[) + J_X([U,V[) = J_X(B) + J_X(C). \tag{6}$$
Finally, we assume inductively that $J_X(B \cup C) = J_X(B) + J_X(C)$, where $B = [S_1,T_1[ \, \cup \cdots \cup \, [S_n,T_n[$, $n$ arbitrary, and $C = [U_1,V_1[ \, \cup \cdots \cup \, [U_{m-1},V_{m-1}[$ with $B \cap C = \emptyset$. Then let $D = C \cup [U_m,V_m[ \, \in \mathcal{T}$ with $B \cap D = \emptyset$. As in (5), $B \cup [U_m,V_m[$ can be written as an element of $\mathcal{T}$; hence, by the induction hypothesis on $C$ and (6),
$$J_X(B \cup D) = J_X(B \cup [U_m,V_m[) + J_X(C) = J_X(B) + J_X([U_m,V_m[) + J_X(C) = J_X(B) + J_X(D).$$

We say the process $X$ is $\mathcal{O}$-summable if $J_X$ has a countably additive extension to the optional $\sigma$-algebra $\mathcal{O} = \sigma(\mathcal{T})$. We shall always denote the extension again by $J_X$. The corresponding space of real-valued, $\mathcal{O}$-measurable functions defined on $\mathbb{R}_+ \times \Omega$, which are integrable with respect
to $J_X$ in the sense of Bartle–Dunford–Schwartz, will be denoted by $\mathcal{L}^1(J_X)$. Then all bounded, optional processes are elements of $\mathcal{L}^1(J_X)$. We note that we can in fact define a Banach space $L^1(J_X)$ consisting of equivalence classes of integrable functions in $\mathcal{L}^1(J_X)$ (cf. [1]). The norm on this space is defined as follows. To simplify notation, we let $m = J_X$ and $B = L^1(\Omega, \mathcal{F}, P; G)$. Then $m$ is a $B$-valued countably additive measure. We denote the unit sphere in the dual of $B$ by $B^*_1$. For every $z \in B^*_1$, we define $m_z : \mathcal{O} \to \mathbb{R}$ by $m_z(A) = z(m(A))$ and we denote the variation of $m_z$ by $|m_z|$. Then for $H \in \mathcal{L}^1(m)$,
$$\|H\|_{L^1(m)} = \sup\Big\{ \int |H| \, d|m_z| \; : \; z \in B^*_1 \Big\}.$$
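In particular, the fact noted above that every bounded, optional process belongs to $\mathcal{L}^1(J_X)$ can be read off from this formula: if $\|H\|_\infty < \infty$, then for every $z \in B^*_1$,
$$\int |H| \, d|m_z| \le \|H\|_\infty \, |m_z|(\mathbb{R}_+ \times \Omega) \le \|H\|_\infty \, \widetilde{m}(\mathbb{R}_+ \times \Omega) < \infty,$$
where $\widetilde{m}$ denotes the semivariation of $m$ (notation introduced only for this estimate), which is finite because $m$ is countably additive on a $\sigma$-algebra.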
Although we shall not need to distinguish between $H \in \mathcal{L}^1(m)$ and its equivalence class in $L^1(m)$, we shall make use of the following inequality:
$$E\Big\|\int H \, dJ_X\Big\|_G = \Big\|\int H \, dJ_X\Big\|_B \le \|H\|_{L^1(m)}.$$
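This is the standard estimate for the Bartle–Dunford–Schwartz integral; we recall the one-line verification. For $x \in B$ we have $\|x\|_B = \sup_{z \in B^*_1} |z(x)|$, and $z\big(\int H \, dm\big) = \int H \, dm_z$; hence
$$\Big\|\int H \, dm\Big\|_B = \sup_{z \in B^*_1} \Big|\int H \, dm_z\Big| \le \sup_{z \in B^*_1} \int |H| \, d|m_z| = \|H\|_{L^1(m)}.$$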
In the same manner that we defined $J_X$, we can define a measure $J_X^t$ on the optional subsets $\mathcal{O}_t$ of $[0,t] \times \Omega$, generated by the family $\mathcal{S}_t = \{[S,T[ \, : \, S \le T \le t\}$. Then $J_X^t$ takes values in $L^1(\Omega, \mathcal{F}_t, P; G)$ for all $t \ge 0$. However, the restriction of $J_X$ to $\mathcal{O}_t$ agrees with $J_X^t$ on $\mathcal{S}_t$ and thus will agree on $\mathcal{O}_t$. It follows that if $H \in \mathcal{L}^1(J_X)$, then $H1_{[0,t]} \in \mathcal{L}^1(J_X^t)$ and
$$\int H1_{[0,t]} \, dJ_X = \int H1_{[0,t]} \, dJ_X^t \in L^1(\Omega, \mathcal{F}_t, P; G).$$
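The passage from agreement on $\mathcal{S}_t$ to agreement on $\mathcal{O}_t$ is the usual uniqueness argument, which we record for completeness: the class
$$\{A \in \mathcal{O}_t : J_X(A) = J_X^t(A)\}$$
is a monotone class (by countable additivity of both measures) containing the ring of finite disjoint unions of members of $\mathcal{S}_t$; hence it contains the $\sigma$-ring generated by $\mathcal{S}_t$, which is $\mathcal{O}_t$.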
If there exists an adapted, cadlag process $Z : \mathbb{R}_+ \times \Omega \to G$ such that, for all $t \ge 0$, $Z_t$ is a member of the equivalence class $\int H1_{[0,t]} \, dJ_X$, then $Z$ is called the stochastic integral of $H$ with respect to $X$ and is denoted by $H \cdot X$ or $\int H \, dX$.

It follows that $\int H \, dX$ is unique when defined. For if $Z_t$ and $Y_t$ are both elements of the equivalence class $\int H1_{[0,t]} \, dJ_X$, then $Z_t = Y_t$ a.s. for each $t$. By right continuity, $Z$ and $Y$ are indistinguishable. Furthermore, the map $H \to \int H \, dX$ is linear.

Theorem 1.1. If $X$ is $\mathcal{O}$-summable and $H \in \mathcal{L}^1(J_X)$, then the stochastic integral $\int H \, dX$ exists.

Before we prove Theorem 1.1, we shall prove the following lemma and discuss the relation of this stochastic integral to the predictable stochastic integral.

Lemma 1.1. Assume $X$ is $\mathcal{O}$-summable.
(a) For every stopping time $T$, $J_X([T]) = 1_{(T=0)} X_0 + ({}^T\!A)_\infty - ({}^T\!A)^P_\infty$.
(b) For all stopping times $S, T$ with $S \le T$, $J_X(]S,T]) = X_T - X_S$.
(c) For all predictable rectangles $R_0 = \{0\} \times E$, with $E \in \mathcal{F}_0$, and $R = \, ]s,t] \times F$, with $F \in \mathcal{F}_s$, $J_X(R_0) = 1_E X_0$ and $J_X(R) = 1_F (X_t - X_s)$.

Proof. (a) By countable additivity, $J_X([T]) = \lim_{n\to\infty} J_X([T, T + \frac{1}{n}[)$. We then note that $X_{T+\frac{1}{n}} 1_{(T+\frac{1}{n}>0)} = X_{T+\frac{1}{n}}$ converges in $L^1$ to $X_T$, and $X_T - X_T 1_{(T>0)} = X_T 1_{(T=0)}$. Next, $({}^{T+\frac{1}{n}}\!A)_\infty = \Delta X_{T+\frac{1}{n}} = X_{T+\frac{1}{n}} - X_{(T+\frac{1}{n})-}$ converges in $L^1$ to $X_T - X_T = 0$. Lastly, if we let $B^n = {}^{T+\frac{1}{n}}\!A$, then
$$E\big[|(B^n)^P_\infty|\big] \le E\Big[\int_0^\infty |d(B^n)^P_s|\Big] \le E\Big[\int_0^\infty |dB^n_s|\Big] = E\big[|\Delta X_{T+\frac{1}{n}}|\big] \xrightarrow{\ n\ } 0.$$
Hence, $({}^{T+\frac{1}{n}}\!A)^P_\infty$ also converges to $0$ in $L^1$. The result in (a) then follows from the definition of $J_X([T, T+\frac{1}{n}[)$.
(b) Part (b) follows from (a) since $J_X(]S,T]) = J_X([S,T[) - J_X([S]) + J_X([T])$.
(c) If we let $T$ be the stopping time $0_E$, which is $0$ on $E$ and $+\infty$ on $E^c$, then $R_0 = [T]$. Then ${}^T\!A \equiv 0$ and hence $({}^T\!A)^P \equiv 0$. Thus, $J_X(R_0) = J_X([T]) = 1_{(T=0)} X_0 = 1_E X_0$. The set $R$ is equal to the stochastic interval $]s_F, t_F]$; hence, by (b), $J_X(R) = X_{t_F} - X_{s_F} = 1_F(X_t - X_s)$.

We see that when $X$ is $\mathcal{O}$-summable, $J_X$ agrees with the measure $I_X$, defined by Brooks and Dinculeanu in [3], on the algebra of predictable rectangles which generates the predictable $\sigma$-algebra $\mathcal{P}$. Thus, the extension of $J_X$ to $\mathcal{O}$ will be an extension of $I_X$ to $\mathcal{P}$; hence, $X$ is summable in the ordinary sense of the predictable stochastic integral. It is known, by [10], that if $X$ is a real-valued process, then $X$ is summable if and only if $X$ is a quasimartingale bounded in $L^1$. If $X$ takes values in a Banach space which does not contain a copy of $c_0$, then $X$ is summable if and only if $I_X$ is bounded on the algebra of predictable rectangles [3]. Of course, when $X$ is $\mathcal{O}$-summable, all other results concerning summable processes remain valid. In particular, if $X$ is a $G$-valued summable quasimartingale and $H$ is a real-valued, bounded, predictable process, then $\int H \, dX$ is a quasimartingale [3].

Proof of Theorem 1.1. We first assume that $H = 1_{[S,T[}$, where $[S,T[ \, \in \mathcal{S}$. Then, for a fixed $t \ge 0$,
$$H1_{[0,t]} = 1_{]S,T]} 1_{[0,t]} - 1_{[T]} 1_{[0,t]} + 1_{[S]} 1_{[0,t]} = 1_{]S \wedge t, T \wedge t]} - 1_{[T_F]} + 1_{[S_E]},$$
where $F = \{\omega : T(\omega) \le t\}$ and $E = \{\omega : S(\omega) \le t\}$. Thus, by Lemma 1.1,
$$\int H1_{[0,t]} \, dJ_X = X_{T \wedge t} - X_{S \wedge t} - 1_{(T_F=0)} X_0 + 1_{(S_E=0)} X_0 - \big[({}^{T_F}\!A)_\infty - ({}^{T_F}\!A)^P_\infty\big] + \big[({}^{S_E}\!A)_\infty - ({}^{S_E}\!A)^P_\infty\big].$$
However, $({}^{T_F}\!A)_r = \Delta X_{T_F} 1_{(T_F>0)} 1_{(T_F \le r)} = \Delta X_T 1_{(T>0)} 1_{(T \le t)} 1_{(T \le r)} = ({}^T\!A)_{t \wedge r}$. Thus, ${}^{T_F}\!A$ is simply the process ${}^T\!A$ stopped at $t$. Since stopping and compensation commute, $({}^{T_F}\!A)^P_r = ({}^T\!A)^P_{t \wedge r}$. Hence,
$$\int H1_{[0,t]} \, dJ_X = X_{T \wedge t} - X_{S \wedge t} - 1_{(T=0)} X_0 + 1_{(S=0)} X_0 - \big[({}^T\!A)_t - ({}^T\!A)^P_t\big] + \big[({}^S\!A)_t - ({}^S\!A)^P_t\big]. \tag{$*$}$$
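As a consistency check (not needed for the proof), letting $t \to \infty$ in $(*)$ recovers formula (1): since
$$X_{T\wedge t} - 1_{(T=0)}X_0 \to X_T - 1_{(T=0)}X_0 = X_T 1_{(T>0)} \quad\text{and}\quad ({}^T\!A)_t - ({}^T\!A)^P_t \to ({}^T\!A)_\infty - ({}^T\!A)^P_\infty,$$
with the analogous limits for the terms in $S$, the right side of $(*)$ converges to $J_X([S,T[)$ as defined in (1).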
Since $X$ is an adapted, $G$-valued, cadlag process, we can take $(\int H \, dX)_t$ to be the process given in $(*)$. Thus, $\int H \, dX$ exists for $H = 1_{[S,T[}$. Moreover, if we replace the deterministic time $t$ by an arbitrary stopping time $U$ in the above argument, we see that $(\int H \, dX)_U = \int H1_{[0,U]} \, dJ_X$ in $L^1(\Omega, \mathcal{F}, P; G)$. It follows, first, that if $H = 1_A$, where $A \in \mathcal{T}$, and, secondly, if $H$ is a simple $\mathcal{T}$-measurable function, then $\int H \, dX$ exists and, for every stopping time $U$, $(\int H \, dX)_U = \int H1_{[0,U]} \, dJ_X$ in $L^1(\Omega, \mathcal{F}, P; G)$.

We now let $H \in \mathcal{L}^1(J_X)$. Since $\mathcal{T}$ generates the optional $\sigma$-algebra $\mathcal{O}$, the set of simple $\mathcal{T}$-measurable functions is dense in $\mathcal{L}^1(J_X)$. We then choose a sequence $\{H^n\}$ of such functions which converges to $H$ in $\mathcal{L}^1(J_X)$. By choosing an appropriate subsequence, we may assume $\|H^{n+1} - H^n\|_{L^1(J_X)} < 4^{-n}$ for all $n$.

We let $Z^n = \int H^n \, dX$ for all $n$. We fix $t_0 > 0$ and define a sequence of stopping times $\{U_n\}$ by
$$U_n = \inf\{t : \|Z^{n+1}_t - Z^n_t\| > 2^{-n}\} \wedge t_0,$$
and let $G_n = \{\omega : U_n(\omega) < t_0\}$. Then for all $n$,
$$E\big[\|Z^{n+1}_{U_n} - Z^n_{U_n}\|\big] = \Big\|\int (H^{n+1} - H^n) 1_{[0,U_n]} \, dJ_X\Big\|_{L^1(G)} \le \|H^{n+1} - H^n\|_{L^1(J_X)} < 4^{-n}.$$
Since $G_n \subseteq \{\|Z^{n+1}_{U_n} - Z^n_{U_n}\| \ge 2^{-n}\}$, by Tchebyshev's inequality,
$$P(G_n) \le P\big(\{\|Z^{n+1}_{U_n} - Z^n_{U_n}\| \ge 2^{-n}\}\big) \le 2^n \, E\big[\|Z^{n+1}_{U_n} - Z^n_{U_n}\|\big] < 2^{-n}.$$
Hence, by the Borel–Cantelli lemma, $P(G_n \text{ i.o.}) = 0$. Thus, for a.e. $\omega$, there exists an integer $N_\omega$ such that $U_n(\omega) = t_0$ when $n \ge N_\omega$. That is, for all $t \in [0,t_0]$, $\|Z^{n+1}_t(\omega) - Z^n_t(\omega)\| \le 2^{-n}$ if $n \ge N_\omega$. It follows that the sequence $\{Z^n_\cdot(\omega)\}$ converges uniformly on $[0,t_0]$.
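Explicitly (a step we spell out here), the uniform convergence is the Weierstrass test: for $n \ge N_\omega$ and any $m \ge 1$,
$$\sup_{t \in [0,t_0]} \|Z^{n+m}_t(\omega) - Z^n_t(\omega)\| \le \sum_{k=n}^{n+m-1} 2^{-k} \le 2^{-n+1},$$
so $\{Z^n_\cdot(\omega)\}$ is uniformly Cauchy on $[0,t_0]$.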
We then let $Z_t(\omega) = \lim_{n\to\infty} Z^n_t(\omega)$. Then for a.e. $\omega$, $Z_\cdot(\omega)$ is the uniform limit on $[0,t_0]$ of a sequence of cadlag functions; hence, $Z$ is cadlag outside of an evanescent set. Moreover, since $Z^n$ is adapted for all $n$, $Z$ is also adapted. Finally, for each $t$, $Z^n_t = \int H^n 1_{[0,t]} \, dJ_X$ in $L^1(\Omega, \mathcal{F}_t, P; G)$ and $\{\int H^n 1_{[0,t]} \, dJ_X\}$ converges to $\int H1_{[0,t]} \, dJ_X$ in $L^1(G)$. It follows that $Z_t = \int H1_{[0,t]} \, dJ_X$ in $L^1(G)$; hence, $Z = \int H \, dX$.

Theorem 1.2. Let $X$ be $\mathcal{O}$-summable and let $H \in \mathcal{L}^1(J_X)$. If $X$ is a martingale, then $\int H \, dX$ is a martingale. Moreover, if $X$ is a uniformly integrable martingale, then so is $\int H \, dX$ and $(\int H \, dX)_t = E(\int H \, dJ_X \mid \mathcal{F}_t)$ a.s. for each $t \le \infty$.

Proof. If $H = 1_{[S,T[}$, where $[S,T[ \, \in \mathcal{S}$, then by $(*)$ in the proof of Theorem 1.1, $\int H \, dX$ is a martingale. Since the processes ${}^T\!A - ({}^T\!A)^P$ and ${}^S\!A - ({}^S\!A)^P$ are uniformly integrable martingales, $\int H \, dX$ will also be uniformly integrable when $X$ is, and then clearly $E(\int H \, dJ_X \mid \mathcal{F}_t) = (\int H \, dX)_t$ a.s.

Since the map $H \to \int H \, dX$ is linear, the theorem remains true if $H = 1_A$, where $A \in \mathcal{T}$, and then if $H$ is a simple $\mathcal{T}$-measurable process. If $H \in \mathcal{L}^1(J_X)$, we let $\{H^n\}$ be a sequence of simple $\mathcal{T}$-measurable processes which converge in $\mathcal{L}^1(J_X)$ to $H$. Then for all $t \le \infty$, $\int H^n 1_{[0,t]} \, dJ_X \to \int H1_{[0,t]} \, dJ_X$ in $L^1(\Omega, \mathcal{F}_t, P; G)$; hence,
$$\Big(\int H^n \, dX\Big)_t \to \Big(\int H \, dX\Big)_t$$
in $L^1(\Omega, \mathcal{F}_t, P; G)$. Thus, if $s \le t$ and $A \in \mathcal{F}_s$,
$$\int_A \Big(\int H \, dX\Big)_s dP = \lim_n \int_A \Big(\int H^n \, dX\Big)_s dP = \lim_n \int_A \Big(\int H^n \, dX\Big)_t dP = \int_A \Big(\int H \, dX\Big)_t dP;$$
hence, $\int H \, dX$ is a martingale. Moreover, if $X$ is uniformly integrable, then for $t = \infty$, $\int H^n \, dJ_X \to \int H \, dJ_X$ in $L^1$; thus, $E(\int H^n \, dJ_X \mid \mathcal{F}_s) \to E(\int H \, dJ_X \mid \mathcal{F}_s)$ in $L^1$ for all $s \le \infty$. It follows that $(\int H \, dX)_s = E(\int H \, dJ_X \mid \mathcal{F}_s)$ a.s.
§2. The Optional Integral in Hilbert Space
We adopt the same setting as the previous section; however, we now let $G$ be a real, separable Hilbert space. We shall give an example of a process $X$ that is $\mathcal{O}$-summable. We note that, even for real-valued $X$, the result yields a wider class of processes than square integrable martingales for which we can define the optional stochastic integral. This class is the
space $H^1_G$, which consists of all uniformly integrable $G$-valued martingales $X$ such that $E[X^*] = E[\sup_t \|X_t\|] < \infty$, or equivalently, $E\big[[X,X]_\infty^{1/2}\big] < \infty$.
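The stated equivalence is the case $p = 1$ of the Burkholder–Davis–Gundy inequalities, which remain valid for martingales with values in a Hilbert space; we recall this standard fact since it is used implicitly in the definition of $H^1_G$: there exist universal constants $c, C > 0$ such that
$$c \, E\big[[X,X]_\infty^{1/2}\big] \le E\big[\sup_t \|X_t\|\big] \le C \, E\big[[X,X]_\infty^{1/2}\big].$$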
Theorem 2.1. If $G$ is a real, separable Hilbert space and $X \in H^1_G$, then $X$ is $\mathcal{O}$-summable.
Proof. Since $G$ is weakly sequentially complete, $J_X$ takes values in the weakly sequentially complete space $L^1(\Omega, \mathcal{F}, P; G)$. Thus $J_X$ will have a countably additive extension to $\mathcal{O}$ if it is weakly countably additive and bounded on the algebra $\mathcal{T}$ which generates $\mathcal{O}$. We let $Y_\infty \in L^\infty(G) = (L^1(G))^*$ and let $Y$ be a cadlag version of the bounded martingale $E(Y_\infty \mid \mathcal{F}_t)$. Then $Y \in \mathrm{BMO}_G = (H^1_G)^*$. Then for all $[S,T[ \, \in \mathcal{S}$,
$$Y_\infty(J_X([S,T[)) = E\big[\langle Y_\infty, J_X([S,T[)\rangle\big] = E\big[\langle Y_\infty, 1_{(S=0}$$