2. Definition and Properties
We start by recalling some basic definitions.
Let $(\Omega, \mathcal{F}, P)$ be a probability space, and let $(\mathcal{F}_t)_{t \ge 0}$ be a filtration of sub-$\sigma$-algebras of $\mathcal{F}$ satisfying the usual conditions, that is:
- 1. $\mathcal{F}_0$ includes all $P$-null sets from $\mathcal{F}$;
- 2. the filtration is right-continuous, i.e., $\mathcal{F}_t = \bigcap_{s > t} \mathcal{F}_s$ for each $t \ge 0$.
Given a measure $Q$ on $(\Omega, \mathcal{F})$, we say that $Q \ll P$ if $P(A) = 0$ implies $Q(A) = 0$ for $A \in \mathcal{F}$; $P \ll Q$ is defined analogously. If both $Q \ll P$ and $P \ll Q$, we say that $P$ and $Q$ are equivalent, and we write $P \sim Q$.
A set $A \subseteq \Omega \times \mathbb{R}_+$ is called evanescent if its projection $\{\omega \in \Omega : (\omega, t) \in A \text{ for some } t \ge 0\}$ is a $P$-null set. Here, we always consider stochastic processes modulo evanescent sets.
A càdlàg function is a mapping $f : \mathbb{R}_+ \to \mathbb{R}^d$ that is right-continuous and has left-hand limits at every point. From this point forward, we assume that all stochastic processes are càdlàg (at least up to evanescence).
A sequence of processes $(H^n)_{n \in \mathbb{N}}$ converges to a process $H$ uniformly on compacts in probability (abbreviated ucp) if, for each $t > 0$,
$$\sup_{0 \le s \le t} |H^n_s - H_s| \longrightarrow 0 \quad \text{in probability as } n \to \infty.$$
A càdlàg, adapted process $X$ is called a semimartingale if it can be decomposed as $X = X_0 + M + A$, where $M$ is a local martingale and $A$ is an adapted process of finite variation on every finite interval. The space of $d$-dimensional semimartingales is denoted by $\mathcal{S}$.
Beyond this decomposition, semimartingales can also be defined equivalently through their properties as good integrators. In this sense, a semimartingale is a process for which the integral operator is continuous with respect to certain metrics. Finally, semimartingales can also be described as topological semimartingales, whose definition relies on certain convergence properties in the semimartingale topology. These three characterizations – the classical decomposition, the good integrator perspective, and the topological semimartingale framework – are mathematically equivalent.
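The good-integrator characterization can be sketched as follows (a standard formulation of the Bichteler–Dellacherie picture; the notation here is ours, not fixed by the text above):

```latex
% Elementary predictable processes: for 0 = t_0 < t_1 < \dots < t_k and
% bounded \mathcal{F}_{t_i}-measurable random variables H_i, set
H_t \;=\; H_0\, \mathbf{1}_{\{0\}}(t) \;+\; \sum_{i=0}^{k-1} H_i\, \mathbf{1}_{(t_i,\, t_{i+1}]}(t) .

% The elementary integral of H against X:
I_X(H) \;=\; H_0 X_0 \;+\; \sum_{i=0}^{k-1} H_i \bigl( X_{t_{i+1}} - X_{t_i} \bigr) .

% X is a good integrator if H^n \to 0 uniformly in (\omega, t) implies
% I_X(H^n) \to 0 in probability; by the Bichteler--Dellacherie theorem this
% holds exactly when X is a semimartingale in the decomposition sense.
```

The point of the equivalence is that the pathwise decomposition and the continuity of the integral operator single out the same class of processes.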
The predictable $\sigma$-algebra, denoted by $\mathcal{P}$, is the smallest $\sigma$-algebra on $\Omega \times \mathbb{R}_+$ such that all left-continuous adapted processes are measurable with respect to it as mappings into $(\mathbb{R}, \mathcal{B}(\mathbb{R}))$, where $\mathcal{B}(\mathbb{R})$ denotes the Borel $\sigma$-algebra on $\mathbb{R}$.
This definition is equivalent to several other characterizations of the predictable $\sigma$-algebra on $\Omega \times \mathbb{R}_+$. For example, the predictable $\sigma$-algebra can also be generated by simple or elementary predictable processes, by continuous adapted processes, or by stochastic intervals associated with stopping times.
By $\mathcal{M}$, we denote the space of $d$-dimensional martingales, by $\mathcal{M}^b$ the space of bounded $d$-dimensional martingales, and by $\mathcal{M}_{loc}$ the space of $d$-dimensional local martingales. A subscript 0, as in $\mathcal{M}_0$, further indicates that the process starts at 0, that is, $X_0 = 0$ almost surely.
In the following, we use $H \cdot X$ to denote the stochastic integral of a $d$-dimensional predictable process $H$ with respect to a $d$-dimensional semimartingale $X$, as defined, for example, in [3].
For $M$ a martingale and $p \ge 1$, write
$$\|M\|_{\mathcal{H}^p} = \bigl\| [M, M]_\infty^{1/2} \bigr\|_{L^p}.$$
Here $\|\cdot\|_{L^p}$ denotes the norm in $L^p(\Omega, \mathcal{F}, P)$. Then $\mathcal{H}^p$ is the space of martingales $M$ such that $\|M\|_{\mathcal{H}^p} < \infty$.
Our definition of a sigma martingale was initially proposed by Goll et al. [10] and subsequently adapted by some authors (for example, [12] Definition 6.33). However, in Theorem 3, we show that the definition from [6] and [20] is equivalent to ours.
Definition 1 (Sigma Martingale).
- (a) A one-dimensional semimartingale $S$ is called a sigma martingale if there exists a sequence of predictable sets $(D_n)_{n \in \mathbb{N}} \subseteq \mathcal{P}$ such that:
  - (i) $D_n \subseteq D_{n+1}$ for all $n$;
  - (ii) $\bigcup_{n \in \mathbb{N}} D_n = \Omega \times \mathbb{R}_+$;
  - (iii) for any $n \in \mathbb{N}$, the process $\mathbf{1}_{D_n} \cdot S$ is a uniformly integrable martingale.
  Such a sequence is called a σ-localizing sequence.
- (b) A $d$-dimensional semimartingale is called a sigma martingale if each of its components is a one-dimensional sigma martingale.
- (c) By $\mathcal{M}_\sigma$, we denote the set of all $d$-dimensional sigma martingales.
- (d) A sequence of predictable sets $(D_n)_{n \in \mathbb{N}} \subseteq \mathcal{P}$ is called a Σ-localizing sequence for a semimartingale $S$ if:
  - (i) $D_n \subseteq D_{n+1}$ for all $n$;
  - (ii) $\bigcup_{n \in \mathbb{N}} D_n = \Omega \times \mathbb{R}_+$;
  - (iii) for any $n \in \mathbb{N}$, the process $\mathbf{1}_{D_n} \cdot S$ is a local martingale.
We begin with some examples. First, observe that by choosing the sets $D_n = [\![0, T_n]\!]$ for a localizing sequence $(T_n)_{n \in \mathbb{N}}$, every local martingale becomes a sigma martingale.
Theorem 1. Every local martingale is a sigma martingale.
Proof. Let $M$ be a local martingale and $(T_n)_{n \in \mathbb{N}}$ a localizing sequence such that each stopped process $M^{T_n}$ is a uniformly integrable martingale. Define $D_n = [\![0, T_n]\!]$. Since $\mathbf{1}_{D_n} \cdot M = M^{T_n} - M_0$ is a uniformly integrable martingale, it follows that $M$ is also a sigma martingale. □
It turns out that this inclusion is strict: not every sigma martingale is a local martingale, as illustrated by the following examples.
Example 1.
Let be a sequence of independent random variables with
We put
Then X is a well-defined sigma martingale but no local martingale.
First, we have to show that X is well-defined. To this end, we define . Clearly, we have and hence . By the Borel-Cantelli lemma, we conclude and thus X is well-defined.
By setting
we obtain for each n. Since the sum is finite and all are symmetric and integrable, it is easy to see that is a localizing sequence and X a sigma martingale.
Furthermore, X is not a local martingale. To show this, we assume that X is a local martingale. Since X is a process with independent increments, it then even is a martingale (see, for example, Medvegyev [17] Theorem 7.97). We put
and by the independence of the random variables , we obtain
By applying monotone convergence, and since the sets are pairwise disjoint, we obtain
This is a contradiction, and we conclude that X is not a local martingale.
The following example is a variant of the most prominent example of a sigma martingale that is not a local martingale. It is from Émery [8] and mentioned in most publications about sigma martingales (see, for example, [13] Example 9.29, [20] the example preceding Theorem IV.34, or [21] Example 5.2).
Example 2. Let τ and ξ be independent random variables such that τ is exponentially distributed with parameter 1 and $P(\xi = 1) = P(\xi = -1) = \frac{1}{2}$. We put $X_t = \frac{\xi}{\tau}\, \mathbf{1}_{\{t \ge \tau\}}$. Then, X is a sigma martingale but not a local martingale.
By putting , we obtain
It is easy to see that this gives a σ-localizing sequence.
But X is not a local martingale, as we encounter integrability problems. We assume that X is a local martingale; hence, there exists a stopping time $T$, which we may take to satisfy $P(T > 0) > 0$, such that the stopped process $X^T$ is a uniformly integrable martingale.
Since X is constant on , we deduce that T is constant on . There exists an such that on . Hence we have and thus
So the stopped process is not a uniformly integrable martingale, and thus X is not a local martingale.
Remark 1. It turns out that for both of the above examples, there exists an equivalent probability measure such that X is a -martingale.
For the first example, assume a probability measure such that are independent random variables with
Then we have
Furthermore, it is easy to see that we have . Hence, X is a martingale.
For the second example, assume a probability measure such that τ and ξ are independent random variables and for all and (for example, you can choose with for all ). Then we have and and hence X is a martingale.
However, in general, such a probability measure does not necessarily exist. We will illustrate that in Example 3.
The notion of the σ- (or Σ-) localizing sequence is essentially new but is inspired by the procedure of σ-localization, which was first described by Jacod and Shiryaev [12] and Kallsen [15]. It simplifies some proofs, since the following theorem holds.
Theorem 2. Let S be a semimartingale. The following are equivalent:
- (i) The process $S$ is a sigma martingale;
- (ii) for $S$, there exists a Σ-localizing sequence;
- (iii) for $S$, there exists a sequence $(D_n)_{n \in \mathbb{N}} \subseteq \mathcal{P}$ such that $D_n \subseteq D_{n+1}$ and $\bigcup_{n \in \mathbb{N}} D_n = \Omega \times \mathbb{R}_+$, and for any $n \in \mathbb{N}$, the process $\mathbf{1}_{D_n} \cdot S$ is a sigma martingale.
In order to prove this theorem, we need the following lemma.
Lemma 1. Let $S$ be a semimartingale and $(A_k)_{k \in \mathbb{N}} \subseteq \mathcal{P}$ sets which form a countable partition of $\Omega \times \mathbb{R}_+$ such that $\mathbf{1}_{A_k} \cdot S$ is a uniformly integrable martingale for any $k$; then S is a sigma martingale.
Proof. We put $D_n = \bigcup_{k=1}^{n} A_k$; then it is easy to see that $(D_n)_{n \in \mathbb{N}}$ is a σ-localizing sequence. □
Proof of Theorem 2. It suffices to prove the theorem in the one-dimensional case. (i)⇒(ii)⇒(iii) is clear, so we just have to show (iii)⇒(i).
Let S be a semimartingale for which a sequence $(D_n)_{n \in \mathbb{N}}$ of subsets of the predictable σ-algebra exists such that $D_n \subseteq D_{n+1}$ and $\bigcup_{n \in \mathbb{N}} D_n = \Omega \times \mathbb{R}_+$, and for which $\mathbf{1}_{D_n} \cdot S$ is a sigma martingale for any $n \in \mathbb{N}$.
By assumption, for every
there exists a sequence
, such that
is a uniformly integrable martingale for all
m. By defining
, we get
and
is a local martingale because stochastic integrals with bounded integrands and local martingale integrators are again local martingales. Thus for every pair
there exists a fundamental sequence
. We put
for
and all
. Now
is a uniformly integrable martingale, and so is
The sets constructed in this way are subsets of the predictable σ-algebra and form a countable partition of $\Omega \times \mathbb{R}_+$. Thus, by Lemma 1, S is a sigma martingale. □
The following corollary is immediate.
Corollary 1. Every local sigma martingale X is a sigma martingale.
The following result shows that the set of sigma martingales is closed under stochastic integration, as opposed to the set of local martingales.
Corollary 2. Let $S$ be a $d$-dimensional sigma martingale and $H$ a predictable $S$-integrable process; then $H \cdot S$ is also a sigma martingale.
Proof. Let $S^1, \dots, S^d$ be the components of S. Consider σ-localizing sequences $(D_n^i)_{n \in \mathbb{N}}$ for the components and define
$$E_n = \{|H| \le n\} \cap \bigcap_{i=1}^{d} D_n^i.$$
Since H is a predictable process, all of the sets $\{|H| \le n\}$ lie in the predictable σ-algebra. Therefore, the sets $E_n$ are predictable, and we have $E_n \subseteq E_{n+1}$ and $\bigcup_{n \in \mathbb{N}} E_n = \Omega \times \mathbb{R}_+$. The process $\mathbf{1}_{E_n} H$ is bounded, and by the linearity and associativity of the stochastic integral, we obtain
$$\mathbf{1}_{E_n} \cdot (H \cdot S) = \sum_{i=1}^{d} \bigl(\mathbf{1}_{E_n} H^i\bigr) \cdot \bigl(\mathbf{1}_{D_n^i} \cdot S^i\bigr).$$
Hence, since each $\mathbf{1}_{D_n^i} \cdot S^i$ is a uniformly integrable martingale and the integrands $\mathbf{1}_{E_n} H^i$ are bounded, $\mathbf{1}_{E_n} \cdot (H \cdot S)$ is a local martingale, and thus $(E_n)_{n \in \mathbb{N}}$ is a Σ-localizing sequence. By Theorem 2, $H \cdot S$ is a sigma martingale. □
Corollary 3. For a sigma martingale S with Σ-localizing sequence $(D_n)_{n \in \mathbb{N}}$ and a sequence $(E_n)_{n \in \mathbb{N}}$ of subsets of the predictable σ-algebra which satisfies $E_n \subseteq E_{n+1}$ and $\bigcup_{n \in \mathbb{N}} E_n = \Omega \times \mathbb{R}_+$, the sequence $(D_n \cap E_n)_{n \in \mathbb{N}}$ is also a Σ-localizing sequence.
Proof. We have
$$\mathbf{1}_{D_n \cap E_n} \cdot S = \mathbf{1}_{E_n} \cdot \bigl(\mathbf{1}_{D_n} \cdot S\bigr).$$
Since $(D_n)_{n \in \mathbb{N}}$ is a Σ-localizing sequence, $\mathbf{1}_{D_n} \cdot S$ is, by definition, a local martingale, and since $\mathbf{1}_{E_n}$ is bounded, $\mathbf{1}_{D_n \cap E_n} \cdot S$ is a local martingale for all n; hence $(D_n \cap E_n)_{n \in \mathbb{N}}$ is a Σ-localizing sequence. □
Corollary 4. The set of sigma martingales forms a vector space.
Proof. Without loss of generality, we assume $d = 1$. Consider sigma martingales X and Y with Σ-localizing sequences $(D_n)_{n \in \mathbb{N}}$ for X and $(E_n)_{n \in \mathbb{N}}$ for Y. By Corollary 3, $(D_n \cap E_n)_{n \in \mathbb{N}}$ is a Σ-localizing sequence for both X and Y, and we have
$$\mathbf{1}_{D_n \cap E_n} \cdot (X + Y) = \mathbf{1}_{D_n \cap E_n} \cdot X + \mathbf{1}_{D_n \cap E_n} \cdot Y.$$
Since the set of local martingales is a vector space, $\mathbf{1}_{D_n \cap E_n} \cdot (X + Y)$ is a local martingale, and thus $(D_n \cap E_n)_{n \in \mathbb{N}}$ is a Σ-localizing sequence for $X + Y$. Hence, $X + Y$ is a sigma martingale; closedness under scalar multiplication is obvious. □
We now come to one of the main statements about sigma martingales. As mentioned earlier, $H \cdot M$ for a local martingale M is not necessarily a local martingale. Hence, we take a closer look at the class of such integral processes, and it turns out that this class corresponds exactly to the vector space of the sigma martingales. Furthermore, by proving this theorem, we also show that our definition of a sigma martingale is equivalent to the definition used in [6] and [20].
The theorem is mentioned in almost every publication about sigma martingales (for example, in [20] Theorem IV.89 or [12] Theorem 6.4.1). Because of our different approach, the proof given here differs slightly from the one given in the above-mentioned literature.
Theorem 3. Let X be a d-dimensional semimartingale. The following are equivalent:
- (i) The process X is a sigma martingale.
- (ii) There exists a strictly positive predictable process $H$ and an $\mathcal{H}^1$-martingale $M$ with $X = X_0 + H \cdot M$.
- (iii) There exists a strictly positive predictable process $H$ and a martingale $M$ with $X = X_0 + H \cdot M$.
- (iv) There exists a strictly positive predictable process $H$ and a local martingale $M$ with $X = X_0 + H \cdot M$.
- (v) There exists a local martingale $M$ and a predictable $M$-integrable process $H$ with $X = X_0 + H \cdot M$.
Proof. The implications (ii)⇒(iii)⇒(iv)⇒(v) are clear.
(i)⇒(ii): By assumption, there exists a σ-localizing sequence $(D_n)_{n \in \mathbb{N}}$. It is easy to see that each martingale is locally in $\mathcal{H}^1$ (see, for example, [20] Theorem IV.51). Hence, for each $n$, the uniformly integrable martingale $\mathbf{1}_{D_n} \cdot X$ admits an increasing sequence of stopping times tending to infinity such that the correspondingly stopped processes lie in $\mathcal{H}^1$. Therefore, we can construct stopping times $T_{n,k}$ such that $(\mathbf{1}_{D_n} \cdot X)^{T_{n,k}} \in \mathcal{H}^1$ for all $n$ and all $k$.
We choose appropriate constants $c_{n,k} > 0$ such that the resulting series of stopped processes converges in $\mathcal{H}^1$, and put $H = \sum_{n,k} c_{n,k} \mathbf{1}_{A_{n,k}}$ as well as $N = H \cdot X$, where the sets $A_{n,k} \in \mathcal{P}$ form the countable partition of $\Omega \times \mathbb{R}_+$ obtained from the $D_n$ and the $T_{n,k}$. Then N is the limit of a sequence of $\mathcal{H}^1$-martingales which is convergent in $\mathcal{H}^1$. Since $\mathcal{H}^1$ is a Banach space, N is also an $\mathcal{H}^1$-martingale. Furthermore, we have $H > 0$, and hence $\frac{1}{H}$ exists, and we obtain $X_t = X_0 + \bigl(\frac{1}{H} \cdot N\bigr)_t$ for all $t \ge 0$.
(v)⇒(i): As every martingale is locally in $\mathcal{H}^1$, for each
i, there exists an increasing sequence of stopping times
tending to infinity, such that
. Hence, we can construct an increasing sequence of stopping times
tending to infinity, such that
for all
and for all
We define
and hence, we have
Because of this, and because the process above, stopped at the corresponding stopping time, is a martingale, the resulting process is a local martingale for all n. Therefore, there exists a Σ-localizing sequence for X, and thus X is a sigma martingale.
□
The following lemma is a simple statement about general stochastic integration. However, to the best of our knowledge, it has so far only been mentioned in the still unpublished [22]. Since it will be helpful for us, we prove it here.
Lemma 2. Let $(X^n)_{n \in \mathbb{N}}$ be a sequence of local martingales which converges to a process X in ucp. If $\sup_{n \in \mathbb{N}} \sup_{s \le t} |X^n_s|$, $t \ge 0$, is locally integrable, then X is a local martingale.
Proof. Without loss of generality, we assume all processes to be one-dimensional. Because of the ucp-convergence, we can conclude that
X is also càdlàg and adapted. By passing to a suitable subsequence, we may assume that the convergence holds almost surely, uniformly on compact subsets, and thus
is also càdlàg and adapted. Furthermore,
is increasing, and we have
By assumption, the right-hand side is locally integrable; thus,
M is also locally integrable.
Now, we can find a sequence
such that
is a martingale for all
n and
k and
is integrable for all
k. Furthermore, because of
, we can apply the dominated convergence theorem and obtain for every bounded stopping time
Hence, the correspondingly stopped process is a martingale for all k; thus, we conclude that X is a local martingale. □
Sigma martingales are processes that behave “like” local martingales. It can even be shown that sigma martingales are semimartingales with vanishing drift ([15] Lemma 2.1). This raises the question of why they are not local martingales, or which additional assumption must be made so that they are. Lemma 2 provides a criterion that allows us to prove the following simple answer to this question. Despite this simplicity, to the best of our knowledge, it is not explicitly mentioned in the sigma martingale literature. However, it will be enormously helpful for this new approach.
Theorem 4. A sigma martingale X is a local martingale if and only if it is locally integrable.
Proof. Since every local martingale is a sigma martingale and locally integrable, it is enough to prove the converse.
Let X be, without loss of generality, a locally integrable one-dimensional sigma martingale. By Theorem 3, there exists a representation $X = X_0 + H \cdot M$ with $H > 0$ predictable and M a local martingale. We define $H^n = H\, \mathbf{1}_{\{H \le n\}}$. Clearly, we have $H^n \to H$ pointwise, and $\mathbf{1}_{\{H \le n\}}$ is a bounded predictable process. Since each $H^n$ is bounded, $X^n := H^n \cdot M$ is a local martingale for all n, and since $|H^n| \le H$ with H being $M$-integrable, the Dominated Convergence Theorem for stochastic integrals yields $X^n \to H \cdot M$ in ucp. Choosing a subsequence, we can assume that the convergence holds almost surely on compact subsets.
We put $N = H \cdot M$; then N is an adapted càdlàg process. Since the left-limit process $N_-$ is left-continuous and hence locally bounded, it is also locally integrable. Since X is locally integrable by assumption, $N = X - X_0$ is also locally integrable.
Furthermore, we have
Hence,
and therefore also
are locally integrable. Since any càdlàg adapted process is locally integrable if its jump process is locally integrable, the processes above are also locally integrable. Now the result follows from Lemma 2. □
As every continuous semimartingale is locally integrable, the following corollary is immediate.
Corollary 5. Every continuous sigma martingale is a local martingale.
Remark 2. As opposed to the criterion above, it is well known that any sigma martingale that is also a special semimartingale is a local martingale ([3] Corollary 12.3.20 or [20] Theorem IV.91), and it can be shown that a semimartingale is a special semimartingale if and only if its supremum process is locally integrable (see, for example, [3] Theorem 11.6.10 or [11] Theorem 8.6). Hence, we obtain that a sigma martingale is locally integrable if and only if its supremum process is locally integrable.
The following theorem is of principal importance in financial mathematics. It can be found in many publications on financial mathematics using the semimartingale terminology (not only is it mentioned in almost all of the publications we cite frequently in this work, such as [5,6,7] and [21], but it is also mentioned in many textbooks dealing with different aspects of financial mathematics, such as [9,14,18,19]). However, to our knowledge, the only published proofs are the French-language original publication [1] Corollaire 3.5 and the more recent [4]. Theorem 4 enables us to give an alternative proof.
Theorem 5 (Ansel-Stricker). A one-sided bounded sigma martingale X is a local martingale. If X is bounded from below (resp. above), it is also a supermartingale (resp. submartingale).
Proof. Assume, without loss of generality, $X \ge 0$ and X to be one-dimensional. By Theorem 3, there exists a representation $X = X_0 + H \cdot M$ with $H > 0$ predictable and M a local martingale. Proceeding analogously to the proof of Theorem 4, we find a sequence of locally bounded predictable processes $(H^n)_{n \in \mathbb{N}}$ with $0 < H^n \le H$ such that $X^n := X_0 + H^n \cdot M \to X$ in ucp. Furthermore, we can assume that $H^n \to H$ and $X^n \to X$ almost surely on compact sets (we can always find a modification of a subsequence for which these properties hold). Since the $H^n$ are locally bounded, $H^n \cdot M$ is a local martingale for all n. Hence, we can find a sequence of stopping times $(\sigma_k)_{k \in \mathbb{N}}$ such that $(H^n \cdot M)^{\sigma_k}$ is a martingale for all $n, k$.
By Fatou’s lemma, we know that the limit variable is integrable. Hence, X is locally integrable, and from Theorem 4, it follows that X is a local martingale.
We still have to show the supermartingale property. Suppose
,
,
and
as defined above. Since
, we know that for all
with
holds.
However, the above equation is clearly also satisfied for with , and therefore, it also holds in general.
Hence, the sequence
is positive, and we can again apply Fatou’s lemma. Then
X is locally integrable and since
holds, it follows that
Thus,
X is a supermartingale. □
In finite, discrete time, any local martingale that is bounded from below is even a martingale and not just a supermartingale. The difference in continuous time is that integrability of $X_t$ for all t does not imply integrability of $\sup_{s \le t} |X_s|$ (not even on compacts). Thus, there is no integrable pointwise majorant, which would be needed to prove the martingale property.
The following example illustrates that there are sigma martingales where no equivalent probability measure exists under which it is a local martingale.
Example 3.
Assume a two-dimensional sigma martingale $Z = (X, Y)$ with X being the process from Example 2 and $Y_t = \mathbf{1}_{\{t \ge \tau\}} - (t \wedge \tau)$.
Obviously, Y is a stopped compensated Poisson process (the compensated Poisson process stopped at its first jump time τ). As the compensated Poisson process is a martingale (see, for example, [3] Theorem 5.5.18) and a stopped martingale is again a martingale by the Optional Stopping Theorem, Y is also a martingale. Hence, Z is a sigma martingale that is not a local martingale, and we are going to show that no probability measure $Q \sim P$ exists such that Z is a local martingale under $Q$.
To that end, let $Q$ be an equivalent probability measure such that Y is a sigma martingale under $Q$. For each $t \ge 0$, the stopped process $Y^t$ is bounded, and since, by Theorem 5, any bounded sigma martingale is a martingale, we conclude that $Y^t$ is a martingale. Thus, we have $\mathbb{E}_Q[Y_t] = 0$ for all $t \ge 0$. With f being the density function of τ under $Q$ and F being the cumulative distribution function of τ (that means $F(t) = Q(\tau \le t)$), we get
Differentiating with respect to t, we obtain $f(t) = 1 - F(t)$. By putting $G = 1 - F$, we get $G' = -G$ and $G(0) = 1$, thus $G(t) = e^{-t}$.
Hence, we obtain $F(t) = 1 - e^{-t}$, and thus τ is exponentially distributed with parameter 1 under $Q$ as well.
Now, we turn to X. Our first goal is to show that we have or equivalently, for all . We proceed in a sequence of steps.
1. We have .
For any , we define where . Then is clearly a sigma martingale. Furthermore, we have
And since for and , we conclude that is a bounded sigma martingale for all . Hence, by Theorem 5, it is a martingale and we obtain
2. We have for all .
Let . Then we have and we get
As is not constantly zero, we conclude .
3. We have for all .
By definition, we have . Both summands can be seen as measures on Ω, and since the half-open intervals are an ∩-closed generator of the Borel σ-algebra, these measures are uniquely determined by their values on these intervals; we conclude for all and hence
Since and are equivalent probability measures, we have
Thus with (1), we obtain
almost surely, and hence ξ and τ are $Q$-independent. We conclude, exactly as in Example 2, that X is not a local martingale under $Q$.