Formalization of a Brownian motion and of stochastic integrals in Lean

7 Filtrations, processes and martingales

7.1 Basic definitions

First, recall the definitions of a filtration, an adapted process, a (sub)martingale, a stopping time and a stopped process, which are already in Mathlib.

Definition 7.1 Filtration
#

A filtration on a measurable space \((\Omega , \mathcal{A})\) with measure \(P\) indexed by a preordered set \(T\) is a family of sigma-algebras \(\mathcal{F} = (\mathcal{F}_t)_{t \in T}\) such that for all \(i \le j\), \(\mathcal{F}_i \subseteq \mathcal{F}_j\) and for all \(t \in T\), \(\mathcal{F}_t \subseteq \mathcal{A}\).
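
In Lean, Mathlib already provides this notion as MeasureTheory.Filtration. The following sketch restates that structure approximately, under a primed name to avoid clashing with Mathlib's declaration.

```lean
import Mathlib.MeasureTheory.MeasurableSpace.Basic

/-- Approximate restatement of Mathlib's `MeasureTheory.Filtration`: a monotone
family of σ-algebras indexed by a preorder `ι`, each contained in the ambient
σ-algebra `m`. -/
structure Filtration' {Ω : Type*} (ι : Type*) [Preorder ι] (m : MeasurableSpace Ω) where
  /-- The family of σ-algebras. -/
  seq : ι → MeasurableSpace Ω
  /-- `seq i ≤ seq j` whenever `i ≤ j`. -/
  mono' : Monotone seq
  /-- Every `seq i` is a sub-σ-algebra of the ambient `m`. -/
  le' : ∀ i : ι, seq i ≤ m
```

Note that, as in the mathematical definition, the measure \(P\) plays no role in the structure itself; it only enters when defining martingales.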

Definition 7.2 Adapted process
#

A process \(X : T \to \Omega \to E\) is said to be adapted with respect to a filtration \(\mathcal{F}\) if for all \(t \in T\), \(X_t\) is \(\mathcal{F}_t\)-measurable.
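
In Mathlib this is MeasureTheory.Adapted; a hedged restatement (Mathlib uses strong measurability, which for metrizable codomains agrees with measurability):

```lean
import Mathlib.Probability.Process.Adapted

open MeasureTheory

variable {Ω β ι : Type*} [Preorder ι] {m : MeasurableSpace Ω} [TopologicalSpace β]

/-- Approximate restatement of Mathlib's `MeasureTheory.Adapted`: each `u i` is
strongly measurable with respect to the σ-algebra `f i` of the filtration. -/
def Adapted' (f : Filtration ι m) (u : ι → Ω → β) : Prop :=
  ∀ i : ι, StronglyMeasurable[f i] (u i)
```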

Definition 7.3 Progressively measurable
#

A stochastic process \(X\) is said to be progressively measurable with respect to a filtration \(\mathcal{F}\) if at each point in time \(i\), \(X\) restricted to \((-\infty , i] \times \Omega \) is measurable with respect to the product \(\sigma \)-algebra where the \(\sigma \)-algebra used for \(\Omega \) is \(\mathcal{F}_i\).

Lemma 7.4

If a stochastic process \((X_t)_{t \in T}\) is right continuous and adapted, then it is progressively measurable.

Proof

Fixing \(t \in T\), we need to show that \(X\) restricted to \([0, t] \times \Omega \) is measurable with respect to \(\mathcal{B}([0, t]) \otimes \mathcal{F}_t\). To this end, we define a left continuous discrete approximation of \(X\) on \([0, t]\) by

\[ X^n_s = X_{(k + 1)t 2^{-n}} \text{ for } s \in (k t 2^{-n}, (k + 1) t 2^{-n}] \]

and \(X^n_0 = X_0\). For fixed \(s\), the right endpoint \((k + 1) t 2^{-n}\) of the dyadic interval containing \(s\) satisfies \(s \le (k + 1) t 2^{-n} {\lt} s + t 2^{-n}\), so by right continuity of \(X\) we get \(X^n \to X\) pointwise as \(n \to \infty \). Thus, as each \(X^n\) is progressively measurable, it follows that \(X\) is also progressively measurable (by e.g. using MeasureTheory.progMeasurable_of_tendsto).

Definition 7.5 Martingale
#

Let \(\mathcal{F}\) be a filtration on a measurable space \(\Omega \) with measure \(P\) indexed by \(T\). A family of functions \(M : T \to \Omega \to E\) is a martingale with respect to a filtration \(\mathcal{F}\) if \(M\) is adapted with respect to \(\mathcal{F}\) and for all \(i \le j\), \(P[M_j \mid \mathcal{F}_i] = M_i\) almost surely.
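
Mathlib's MeasureTheory.Martingale bundles the two conditions of this definition; approximately (using Mathlib's notation \(\mu[f \mid m]\) for the conditional expectation):

```lean
import Mathlib.Probability.Martingale.Basic

open MeasureTheory

variable {Ω E ι : Type*} [Preorder ι] {m0 : MeasurableSpace Ω}
  [NormedAddCommGroup E] [NormedSpace ℝ E] [CompleteSpace E]

/-- Approximate restatement of Mathlib's `MeasureTheory.Martingale`: an adapted
process such that conditioning a later value on an earlier σ-algebra recovers
the earlier value almost everywhere. -/
def Martingale' (f : ι → Ω → E) (ℱ : Filtration ι m0) (μ : Measure Ω) : Prop :=
  Adapted ℱ f ∧ ∀ i j, i ≤ j → μ[f j|ℱ i] =ᵐ[μ] f i
```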

Definition 7.6 Submartingale
#

Let \(\mathcal{F}\) be a filtration on a measurable space \(\Omega \) with measure \(P\) indexed by \(T\). A family of functions \(M : T \to \Omega \to E\) is a submartingale with respect to a filtration \(\mathcal{F}\) if \(M\) is adapted with respect to \(\mathcal{F}\) and for all \(i \le j\), \(P[M_j \mid \mathcal{F}_i] \ge M_i\) almost surely.

Lemma 7.7

Let \(M\) be a real-valued submartingale with respect to a filtration \(\mathcal{F}\). Then for all \(i \le j\), we have \(0 \le P[M_j - M_i \mid \mathcal{F}_i]\) almost surely.

Proof

Lemma 7.8

Let \(X\) be a submartingale. Then for all bounded stopping times \(\tau \), the stopped value \(X_\tau \) is integrable.

Proof
Lemma 7.9

If \(X\) is a martingale and \(Y\) is an adapted modification of \(X\), then \(Y\) is a martingale.

Proof

Let \(i \le j\) in \(T\). We want to show that \(P[Y_j \mid \mathcal{F}_i] = Y_i\) almost surely. Since \(Y_i\) is \(\mathcal{F}_i\)-measurable, it suffices to show that \(\int _A Y_j \: dP = \int _A Y_i \: dP\) for all \(A \in \mathcal{F}_i\). Fix such an \(A\). Using that \(Y\) is a modification of \(X\) (the outer equalities below) and that \(X\) is a martingale (the middle equality),

\begin{align*} \int _A Y_j \: dP & = \int _A X_j \: dP = \int _A X_i \: dP = \int _A Y_i \: dP \: . \end{align*}
Lemma 7.10

If \(X\) is a submartingale and \(Y\) is an adapted modification of \(X\), then \(Y\) is a submartingale.

Proof

Let \(i \le j\) in \(T\). We want to show that \(P[Y_j \mid \mathcal{F}_i] \ge Y_i\) almost surely. Since \(Y_i\) is \(\mathcal{F}_i\)-measurable, it suffices to show that \(\int _A Y_j \: dP \ge \int _A Y_i \: dP\) for all \(A \in \mathcal{F}_i\). Fix such an \(A\). Using that \(Y\) is a modification of \(X\) (the outer equalities below) and that \(X\) is a submartingale (the middle inequality),

\begin{align*} \int _A Y_j \: dP & = \int _A X_j \: dP \ge \int _A X_i \: dP = \int _A Y_i \: dP \: . \end{align*}
Lemma 7.11 Jensen’s inequality for the conditional expectation
#

Let \(X : \Omega \to E\) be an integrable random variable with values in a normed space \(E\) and let \(\phi : E \to \mathbb {R}\) be a convex function such that \(\phi \circ X\) is integrable. Then, for any sub-\(\sigma \)-algebra \(\mathcal{G}\), we have

\begin{align*} \phi \left( \mathbb {E}[X \mid \mathcal{G}] \right) & \le \mathbb {E}[\phi (X) \mid \mathcal{G}] \quad \text{a.s.} \end{align*}
Proof

Done in a Mathlib PR for finite measures: #27953.

Corollary 7.12
#

Let \(X : \Omega \to E\) be an integrable random variable with values in a normed space \(E\). Then, for any sub-\(\sigma \)-algebra \(\mathcal{G}\), we have

\begin{align*} \Vert \mathbb {E}[X \mid \mathcal{G}] \Vert & \le \mathbb {E}[\Vert X \Vert \mid \mathcal{G}] \quad \text{a.s.} \end{align*}
Proof

Lemma 7.13

Let \(X : T \rightarrow \Omega \rightarrow E\) be a martingale with values in a normed space \(E\). Let \(\phi : E \rightarrow \mathbb {R}\) be convex and continuous such that \(\phi (X_t)\in L^1(\Omega )\) for every \(t\in T\). Then \(\phi (X)\) is a submartingale.

Proof

Let \(i \le j\). By the martingale property and the conditional Jensen inequality (Lemma 7.11), \(\phi (X_i) = \phi \left( \mathbb {E}[X_j \mid \mathcal{F}_i] \right)\leq \mathbb {E}[\phi (X_j) \mid \mathcal{F}_i]\).

Lemma 7.14

Let \(X : T \rightarrow \Omega \rightarrow E\) be a martingale with values in a normed space \(E\). Then \(\Vert X \Vert \) is a submartingale.

Proof

Same proof as Lemma 7.13, specialized to \(\phi = \Vert \cdot \Vert \), for which we can use Corollary 7.12.

Lemma 7.15

Let \(X : T \rightarrow \Omega \rightarrow E\) be a submartingale. Let \(\phi :E \rightarrow \mathbb {R}\) be convex, continuous, and increasing such that \(\phi (X_t)\in L^1(\Omega )\) for every \(t\in T\). Then \(\phi (X)\) is a submartingale.

Proof

Let \(i \le j\). By the submartingale property, \(X_i \le \mathbb {E}[X_j \mid \mathcal{F}_i]\) almost surely; since \(\phi \) is increasing, and by the conditional Jensen inequality (Lemma 7.11), \(\phi (X_i) \leq \phi \left( \mathbb {E}[X_j \mid \mathcal{F}_i] \right)\leq \mathbb {E}[\phi (X_j) \mid \mathcal{F}_i]\).

Definition 7.16 Stopping time
#

A stopping time with respect to some filtration \(\mathcal{F}\) indexed by \(T\) is a function \(\tau : \Omega \to T \cup \{ \infty \} \) such that for all \(i\), the preimage of \(\{ j \mid j \le i\} \) along \(\tau \) is measurable with respect to \(\mathcal{F}_i\).
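
In Mathlib this is MeasureTheory.IsStoppingTime; approximately:

```lean
import Mathlib.Probability.Process.Stopping

open MeasureTheory

variable {Ω ι : Type*} [Preorder ι] {m : MeasurableSpace Ω}

/-- Approximate restatement of Mathlib's `MeasureTheory.IsStoppingTime`. Mathlib
takes `τ : Ω → ι`; the value `∞` in the definition above is modelled by
instantiating `ι` with, e.g., `WithTop ι`. -/
def IsStoppingTime' (f : Filtration ι m) (τ : Ω → ι) : Prop :=
  ∀ i : ι, MeasurableSet[f i] {ω | τ ω ≤ i}
```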

Definition 7.17 \(\sigma \)-algebra generated by a stopping time

Given a stopping time \(\tau \) on a time index \(T\), define \(\mathcal{F}_\tau = \{ A \in \mathcal{A} \mid \forall t \in T, \: A \cap \{ \tau \le t\} \in \mathcal{F}_t\} .\)

Lemma 7.18

Let \(\tau , \sigma \) be stopping times such that \(\tau \le \sigma \). Then, \(\mathcal{F}_\tau \subseteq \mathcal{F}_\sigma \).

Proof
Definition 7.19 Stopped process
#

Let \(X : T \to \Omega \to E\) be a stochastic process and let \(\tau : \Omega \to T\). The stopped process with respect to \(\tau \) is defined by

\begin{align*} (X^{\tau })_t = \begin{cases} X_t & \text{if } t \le \tau \\ X_{\tau } & \text{otherwise} \end{cases}\end{align*}
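
For a linearly ordered index set, the case split above collapses to evaluating \(X\) at \(t \wedge \tau \), which is how Mathlib's MeasureTheory.stoppedProcess is defined (restated approximately):

```lean
import Mathlib.Probability.Process.Stopping

open MeasureTheory

variable {Ω E ι : Type*} [LinearOrder ι]

/-- Approximate restatement of Mathlib's `MeasureTheory.stoppedProcess`:
`(X^τ)_t ω = X (min t (τ ω)) ω`, which agrees with the case split above. -/
def stoppedProcess' (X : ι → Ω → E) (τ : Ω → ι) : ι → Ω → E :=
  fun t ω => X (min t (τ ω)) ω
```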
Lemma 7.20

Let \(X : \mathbb {N} \to \Omega \to \mathbb {R}\) be a sub-martingale and \(\tau \) a stopping time with respect to the filtration \(\mathcal{F}\). Then, the stopped process \(X^{\tau }\) is a sub-martingale with respect to the filtration \(\mathcal{F}\).

Proof
Definition 7.21 Hitting time
#

For \(X : T \to \Omega \to E\) a stochastic process, \(B\) a subset of \(E\) and \(t_0 \in T\), the hitting time of \(X\) in \(B\) after \(t_0\) is the random variable \(\Omega \to T\cup \{ \infty \} \) defined by

\begin{align*} \tau _{B, t_0}(\omega ) = \inf \{ t \in T \mid t \ge t_0, \: X_t(\omega ) \in B\} \: , \end{align*}

in which the infimum is infinite if the set is empty.
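
Mathlib's corresponding definition is MeasureTheory.hitting; restated approximately below. Note that Mathlib works with a bounded window \([n, m]\) and returns \(m\) when the set is never hit there, rather than the value \(\infty \) used above.

```lean
import Mathlib.Probability.Process.HittingTime

open MeasureTheory

variable {Ω E ι : Type*} [ConditionallyCompleteLinearOrder ι]

/-- Approximate restatement of Mathlib's `MeasureTheory.hitting`: the first time
in the window `[n, m]` at which the process enters `B`, defaulting to `m`. -/
noncomputable def hitting' (X : ι → Ω → E) (B : Set E) (n m : ι) : Ω → ι :=
  fun ω =>
    if ∃ j ∈ Set.Icc n m, X j ω ∈ B then sInf (Set.Icc n m ∩ {i | X i ω ∈ B}) else m
```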

7.2 Right-continuous filtrations

We now give the definition of a filtered probability space satisfying the usual conditions.

Definition 7.22 Left continuation

Assume that \(T\) is a partial order. For \(\mathcal{F}\) a filtration indexed by \(T\) and \(t \in T\), we define the left continuation as

\[ \mathcal{F}_{t-} = \begin{cases} \mathcal{F}_t, & \text{if $t$ is isolated on the left in the order topology}; \\ \bigsqcup _{s {\lt} t} \mathcal{F}_s, & \text{otherwise}. \end{cases} \]

Note that \(\bigsqcup \) denotes the supremum in the lattice of sigma-algebras on \(\Omega \).

Definition 7.23 Right continuation
#

Assume that \(T\) is a partial order. For \(\mathcal{F}\) a filtration indexed by \(T\) and \(t \in T\), we define the right continuation as

\[ \mathcal{F}_{t+} = \begin{cases} \mathcal{F}_t, & \text{if $t$ is isolated on the right in the order topology}; \\ \bigsqcap _{s {\gt} t} \mathcal{F}_s, & \text{otherwise}. \end{cases} \]

Note that \(\bigsqcap \) denotes the infimum in the lattice of sigma-algebras on \(\Omega \).

Lemma 7.24

The right continuation of \(\mathcal{F}\) is a filtration.

Proof

We endow \(T\) with the order topology. Let us first prove that the right continuation is nondecreasing. Let \(s, t \in T\) such that \(s \le t\).

  • Suppose \(s\) is isolated on the right. Then \(\mathcal{F}_{s+} = \mathcal{F}_s\).

    • If \(t\) is isolated on the right, then \(\mathcal{F}_{t+} = \mathcal{F}_t\). Because \(\mathcal{F}\) is a filtration, we have \(\mathcal{F}_s \subseteq \mathcal{F}_t\), and thus \(\mathcal{F}_{s+} \subseteq \mathcal{F}_{t+}\).

    • If \(t\) is not isolated on the right, then \(\mathcal{F}_{t+} = \bigsqcap _{u {\gt} t} \mathcal{F}_{u}\). Let \(u {\gt} t\). Then \(s \le u\), thus \(\mathcal{F}_s \subseteq \mathcal{F}_u\). This proves that \(\mathcal{F}_s \subseteq \bigsqcap _{u {\gt} t} \mathcal{F}_u\), and thus \(\mathcal{F}_{s+} \subseteq \mathcal{F}_{t+}\).

  • Suppose now that \(s\) is not isolated on the right, so that \(\mathcal{F}_{s+} = \bigsqcap _{u {\gt} s} \mathcal{F}_u\).

    • If \(t\) is isolated on the right, then \(\mathcal{F}_{t+} = \mathcal{F}_t\). As \(s\) is not isolated on the right, we deduce that \(s \ne t\), and thus \(s {\lt} t\) because \(T\) is a partial order. Therefore \(\bigsqcap _{u {\gt} s} \mathcal{F}_u \subseteq \mathcal{F}_t\), and thus \(\mathcal{F}_{s+} \subseteq \mathcal{F}_{t+}\).

    • If \(t\) is not isolated on the right, then \(\mathcal{F}_{t+} = \bigsqcap _{u {\gt} t} \mathcal{F}_u\). For any \(u {\gt} t\), we have \(u {\gt} s\) and thus \(\bigsqcap _{v {\gt} s} \mathcal{F}_v \subseteq \mathcal{F}_u\), proving that \(\bigsqcap _{u {\gt} s} \mathcal{F}_u \subseteq \bigsqcap _{u {\gt} t} \mathcal{F}_u\), and thus \(\mathcal{F}_{s+} \subseteq \mathcal{F}_{t+}\).

Turn now to the proof that for all \(t \in T\), \(\mathcal{F}_{t+} \subseteq \mathcal{A}\). Let \(t \in T\). If \(t\) is isolated on the right, then \(\mathcal{F}_{t+} = \mathcal{F}_t \subseteq \mathcal{A}\) by definition of a filtration. Otherwise there exists \(u {\gt} t\), and \(\mathcal{F}_{t+} \subseteq \mathcal{F}_u \subseteq \mathcal{A}\), so we are done.

Lemma 7.25

Suppose \(T\) is a topological space with the topology being the order topology. For \(t \in T\), the right continuation of \(\mathcal{F}\) at \(t\) is given by

\[ \mathcal{F}_{t+} = \begin{cases} \mathcal{F}_t, & \text{if $t$ is isolated on the right}; \\ \bigsqcap _{s {\gt} t} \mathcal{F}_s, & \text{otherwise}. \end{cases} \]
Proof

This follows from Definition 7.23 because the topology on \(T\) agrees with the one used to define the right continuation.

Lemma 7.26

Suppose \(T\) is a topological space with the topology being the order topology. Assume that \(t \in T\) is isolated on the right, meaning that there exists a neighbourhood \(\mathcal{V}\) of \(t\) such that \(\mathcal{V} \cap \{ u \mid u {\gt} t\} = \emptyset \). Then \(\mathcal{F}_{t+} = \mathcal{F}_t\).

Proof

This is a direct consequence of Lemma 7.25.

Lemma 7.27

Assume that \(T\) is a linear order with successor. This means that for any \(t\), there is an element \(succ(t) \ge t\) such that for any \(u {\gt} t\), \(u \ge succ(t)\), and if \(succ(t) \le t\), then \(t\) is maximal. Then the right continuation of \(\mathcal{F}\) is equal to \(\mathcal{F}\).

Proof

Endow \(T\) with the order topology. In a linear order with successor equipped with the order topology, every point is isolated on the right, so we can conclude by Lemma 7.26.

Lemma 7.28

If \(t \in T\) is maximal, \(\mathcal{F}_{t+} = \mathcal{F}_t\).

Proof

Endow \(T\) with the order topology. As \(t\) is maximal, it is isolated on the right for this topology, so we can conclude by Lemma 7.26.

Lemma 7.29

If \(T\) is a linear order and there exists \(u {\gt} t\) such that \((t, u) = \emptyset \), then \(\mathcal{F}_{t+} = \mathcal{F}_t\).

Proof

Endow \(T\) with the order topology. The hypothesis implies that \(t\) is isolated on the right in this topology, so we can conclude by Lemma 7.26.

Lemma 7.30

Suppose \(T\) is a topological space with the topology being the order topology. Assume that \(t \in T\) is not isolated on the right, meaning that for every neighbourhood \(\mathcal{V}\) of \(t\), \(\mathcal{V} \cap \{ u \mid u {\gt} t\} \ne \emptyset \). Then \(\mathcal{F}_{t+} = \bigsqcap _{u {\gt} t} \mathcal{F}_u\).

Proof

This is a direct consequence of Lemma 7.25.

Lemma 7.31

Assume that \(T\) is a densely ordered linear order, meaning that for all \(s {\lt} t\), there exists \(u\) such that \(s {\lt} u {\lt} t\). If \(t\) is not maximal, then \(\mathcal{F}_{t+} = \bigsqcap _{u {\gt} t} \mathcal{F}_u\).

Proof

Endow \(T\) with the order topology. In a densely ordered linear order, a point which is not maximal is not isolated on the right, so we can conclude by Lemma 7.30.

Lemma 7.32

If \(T\) is a densely ordered linear order with no maximal element, then for all \(t \in T\) we have \(\mathcal{F}_{t+} = \bigsqcap _{u {\gt} t} \mathcal{F}_u\).

Proof

For all \(t\), \(t\) is not maximal, so we can conclude by Lemma 7.31.

Lemma 7.33

The filtration \(\mathcal{F}\) is contained in its right continuation.

Proof

Endow \(T\) with the order topology, and consider \(t \in T\). Using Lemma 7.25, we split into two cases. If \(t\) is isolated on the right, then \(\mathcal{F}_{t+} = \mathcal{F}_t \supseteq \mathcal{F}_t\) and we are done. Otherwise, for all \(u {\gt} t\), \(\mathcal{F}_t \subseteq \mathcal{F}_u\), therefore \(\mathcal{F}_t \subseteq \bigsqcap _{u {\gt} t} \mathcal{F}_u\), and we are done.

Lemma 7.34

The right continuation of the right continuation of \(\mathcal{F}\) is equal to the right continuation of \(\mathcal{F}\).

Proof

Let \(t \in T\). From Lemma 7.33, we already know that \(\mathcal{F}_{t+} \subseteq \mathcal{F}_{t++}\). Endow \(T\) with the order topology and split according to Lemma 7.25. If \(t\) is isolated on the right, then \(\mathcal{F}_{t++} = \mathcal{F}_{t+}\) and we are done. Otherwise consider \(u {\gt} t\). Since \(t\) is not isolated on the right and \(\{ x \mid x {\lt} u\} \) is an open neighbourhood of \(t\), there exists \(v \in T\) such that \(t {\lt} v {\lt} u\). If \(v\) is not isolated on the right then

\[ \mathcal{F}_{t++} = \bigsqcap _{s {\gt} t} \mathcal{F}_{s+} \subseteq \mathcal{F}_{v+} = \bigsqcap _{s {\gt} v} \mathcal{F}_s \subseteq \mathcal{F}_u, \]

and otherwise

\[ \mathcal{F}_{t++} = \bigsqcap _{s {\gt} t} \mathcal{F}_{s+} \subseteq \mathcal{F}_{v+} = \mathcal{F}_v \subseteq \mathcal{F}_u, \]

thus \(\mathcal{F}_{t++} \subseteq \bigsqcap _{s {\gt} t} \mathcal{F}_s = \mathcal{F}_{t+}\), which concludes the proof.

Lemma 7.35 Basic properties of the right continuation

Fake lemma for the dependency graph. Import this to depend on Definition 7.23.

Proof
Definition 7.36 Right-continuous filtration

We say that the filtration is right-continuous if for all \(t \in T\), \(\mathcal{F}_{t+} \subseteq \mathcal{F}_t\).

Lemma 7.37

If \(\mathcal{F}\) is right-continuous, then for all \(t \in T\), \(\mathcal{F}_t = \mathcal{F}_{t+}\).

Proof

This is a direct consequence of Definition 7.36 and Lemma 7.33.

Lemma 7.38

The right continuation of \(\mathcal{F}\) is right-continuous.

Proof

This follows immediately from Lemma 7.34.

Lemma 7.39

If \(\mathcal{F}\) is right-continuous, then for all \(t \in T\), any set \(A \subseteq \Omega \) which is \(\mathcal{F}_t\)-measurable is also \(\mathcal{F}_{t+}\)-measurable.

Proof

This is a direct consequence of Definition 7.36.

Lemma 7.40 Basic properties of right continuous filtrations

Fake lemma for the dependency graph. Import this to depend on Definition 7.36.

Proof
Definition 7.41 Usual conditions

We say that a filtered probability space \((\Omega , \mathcal{F}, P)\) satisfies the usual conditions if the filtration is right-continuous and if \(\mathcal{F}_0\) contains all the \(P\)-null sets.

7.3 Predictable processes

Definition 7.42 Predictable \(\sigma \)-algebra
#

Let \(\mathcal{F}\) be a filtration on a measurable space \(\Omega \) indexed by a linearly ordered set \(T\). Let \(S = \{ \{ \bot \} \times A \mid A \in \mathcal{F}_\bot \} \) if \(T\) has a bottom element \(\bot \) and \(S = \emptyset \) otherwise. The predictable sigma-algebra on \(T \times \Omega \) is the sigma-algebra generated by the set of sets \(\{ (t, \infty ) \times A \mid t \in T, \: A \in \mathcal{F}_t\} \cup S\), where \((t, \infty ) = \{ s \in T \mid s {\gt} t\} \).
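
A possible Lean rendering of this generating family is sketched below; the name and exact form are illustrative only, not a claim about the actual formalization, and for brevity the sketch omits the extra generators \(\{ \bot \} \times A\) used when \(T\) has a bottom element.

```lean
import Mathlib.Probability.Process.Filtration

open MeasureTheory

variable {Ω T : Type*} [LinearOrder T] {m : MeasurableSpace Ω}

/-- Hypothetical sketch of the predictable σ-algebra on `T × Ω`: generated by
the sets `(t, ∞) × A` with `A` measurable for `ℱ t` (bottom-element generators
omitted for brevity). -/
def predictableSigma (ℱ : Filtration T m) : MeasurableSpace (T × Ω) :=
  MeasurableSpace.generateFrom
    {s | ∃ t : T, ∃ A : Set Ω, MeasurableSet[ℱ t] A ∧ s = Set.Ioi t ×ˢ A}
```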

Definition 7.43 Predictable process
#

A process \(X : T \to \Omega \to E\) is said to be predictable with respect to a filtration \(\mathcal{F}\) if it is measurable with respect to the predictable sigma-algebra on \(T \times \Omega \).

Lemma 7.44

A predictable process is progressively measurable.

Proof

Let \(X : T \to \Omega \to E\) be a predictable process; we will show that it is progressively measurable. Fix \(t \in T\) and denote by

\[ \iota _t : [0, t] \times \Omega \to T \times \Omega : (s, \omega ) \mapsto (s, \omega ) \]

the inclusion map. We need to show that \(X \circ \iota _t : [0, t] \times \Omega \to E\) is measurable with respect to \(\mathcal{B}([0, t]) \otimes \mathcal{F}_t\).

Denoting by \(\Sigma _{\mathcal{F}}\) the predictable \(\sigma \)-algebra generated by \(\mathcal{F}\), as \(X\) is predictable, we have that \(X^{-1}(\mathcal{B}(E)) \le \Sigma _{\mathcal{F}}\). Thus, to show that \(X \circ \iota _t\) is \(\mathcal{B}([0, t]) \otimes \mathcal{F}_t\)-measurable, it suffices to show that \(\iota _t^{-1}(\Sigma _{\mathcal{F}}) \le \mathcal{B}([0, t]) \otimes \mathcal{F}_t\). In particular, as

\[ \Sigma _{\mathcal{F}} = \sigma (\{ (s, \infty ) \times A \mid A \in \mathcal{F}_s\} \cup \{ \{ \bot \} \times A \mid A \in \mathcal{F}_\bot \} ), \]

it suffices to show that sets of the form \(\iota _t^{-1}((s, \infty ) \times A)\) for some \(s \in T, A \in \mathcal{F}_s\) and \(\iota _t^{-1}(\{ \bot \} \times A)\) for some \(A \in \mathcal{F}_\bot \) are \(\mathcal{B}([0, t]) \otimes \mathcal{F}_t\)-measurable.

Indeed, if \(A \in \mathcal{F}_\bot \)

\[ \iota _t^{-1}(\{ \bot \} \times A) = \{ \bot \} \times A \]

while for any \(s \in T\) and \(A \in \mathcal{F}_s\),

\[ \iota _t^{-1}((s, \infty ) \times A) = \begin{cases} \varnothing , & t {\lt} s\\ (s, t] \times A, & s \le t. \end{cases} \]

By the monotonicity of the filtration \(\mathcal{F}\), all of these cases are \(\mathcal{B}([0, t]) \otimes \mathcal{F}_t\)-measurable allowing us to conclude.

Lemma 7.45

Sets of the form \((s, t] \times A\) for any \(A \in \mathcal{F}_s\) are measurable with respect to the predictable \(\sigma \)-algebra.

Proof

For \(t \le s\), the set in question is empty and thus trivially measurable. On the other hand, for \(s {\lt} t\), measurability follows as \((s, t] \times A = ((s, \infty ) \times A) \setminus ((t, \infty ) \times A)\).

Lemma 7.46

Let \(X : \mathbb {N} \to \Omega \to E\) be a stochastic process and let \(\mathcal{F}\) be a filtration indexed by \(\mathbb {N}\). Then \(X\) is predictable if and only if \(X_0\) is \(\mathcal{F}_0\)-measurable and for all \(n \in \mathbb {N}\), \(X_{n+1}\) is \(\mathcal{F}_n\)-measurable.

Proof

Suppose first that \(X\) is predictable. Straightaway, \(X_0\) is \(\mathcal{F}_0\)-measurable as predictable implies progressively measurable which in turn implies adapted.

Fixing \(n\), we observe that for any \(S \in \mathcal{B}(E)\),

\[ X_{n + 1}^{-1}(S) = \{ \omega \mid (n + 1, \omega ) \in X^{-1}(S)\} = \pi ^{-1}(\iota ^{-1}(X^{-1}(S))) \]

where

\[ \pi : \Omega \to \{ n + 1\} \times \Omega : \omega \mapsto (n + 1, \omega ) \]

and

\[ \iota : \{ n + 1\} \times \Omega \to T \times \Omega : (n + 1, \omega ) \mapsto (n + 1, \omega ). \]

As \(X^{-1}(S) \in \Sigma _{\mathcal{F}}\) – the predictable \(\sigma \)-algebra – it suffices to show that \(\pi ^{-1}(\iota ^{-1}(\Sigma _{\mathcal{F}})) \le \mathcal{F}_n\). To this end, we again only need to check this on the generating sets of \(\Sigma _{\mathcal{F}}\):

  • For \(A \in \mathcal{F}_0\), measurability is clear as \(\iota ^{-1}(\{ 0\} \times A) = \varnothing \).

  • Similarly, for \(m {\gt} n\) and \(A \in \mathcal{F}_m\), \(\iota ^{-1}((m, \infty ) \times A) = \varnothing \).

  • For \(m \le n\) and \(A \in \mathcal{F}_m \le \mathcal{F}_n\) we have that \(\pi ^{-1}(\iota ^{-1}((m, \infty ) \times A)) = A\) which is \(\mathcal{F}_n\) measurable by the monotonicity of the filtration.

Now, supposing \(X_0\) is \(\mathcal{F}_0\)-measurable and \(X_{n + 1}\) is \(\mathcal{F}_n\)-measurable, we will show that \(X\) is predictable. Indeed, fixing \(S \in \mathcal{B}(E)\), we have

\[ X^{-1}(S) = \bigcup _{n \in \mathbb {N}} \{ n\} \times X_n^{-1}(S) = \{ 0\} \times X_0^{-1}(S) \cup \bigcup _{n \in \mathbb {N}} \{ n + 1\} \times X_{n + 1}^{-1}(S). \]

Thus, as \(\{ 0\} \times X_0^{-1}(S) \in \Sigma _{\mathcal{F}}\) by construction and \(\{ n + 1\} \times X_{n + 1}^{-1}(S) = (n, n + 1] \times X_{n + 1}^{-1}(S) \in \Sigma _{\mathcal{F}}\) by Lemma 7.45 and the fact that \(X_{n + 1}^{-1}(S) \in \mathcal{F}_n\), we have that \(X^{-1}(S) \in \Sigma _{\mathcal{F}}\) as required.

7.4 Uniformly integrable

Lemma 7.47
#

If \((X_i)_{i \in \iota }\) is a family of (probabilistically) uniformly integrable functions and \((\mathcal{F}_j)_{j \in \kappa }\) is a family of \(\sigma \)-algebras, then the family \((P[X_i \mid \mathcal{F}_j])_{i \in \iota , j \in \kappa }\) is uniformly integrable.

Proof

Since \((X_i)_{i \in \iota }\) is uniformly integrable, it is uniformly bounded in \(L^1\), thus so is \((P[X_i \mid \mathcal{F}_j])_{i \in \iota , j \in \kappa }\). Moreover, for any \(\epsilon {\gt} 0\), there exists some \(\delta {\gt} 0\) such that for any measurable set \(A\) with \(P(A) {\lt} \delta \), we have that \(\sup _{i \in \iota } P[|X_i| \mathbb {I}_A] {\lt} \epsilon \).

On the other hand, by Markov’s inequality, for any \(\lambda {\gt} 0\), \(i \in \iota \) and \(j \in \kappa \) we have that

\[ P(|P[X_i \mid \mathcal{F}_j]| \ge \lambda ) \le \lambda ^{-1}P[|P[X_i \mid \mathcal{F}_j]|] \le \lambda ^{-1}P[|X_i|]. \]

Now set \(\lambda := \delta ^{-1} \sup _{i \in \iota } P[|X_i|] + 1\). Then for any \(i \in \iota \) and \(j \in \kappa \) we have that

\[ P(|P[X_i \mid \mathcal{F}_j]| \ge \lambda ) \le \frac{P[|X_i|]}{\delta ^{-1} \sup _{k \in \iota } P[|X_k|] + 1} {\lt} \delta , \]

and so,

\begin{align*} P[|P[X_i \mid \mathcal{F}_j]| \mathbb {I}_{|P[X_i \mid \mathcal{F}_j]| \ge \lambda }] & = P[|P[X_i \mid \mathcal{F}_j] \mathbb {I}_{|P[X_i \mid \mathcal{F}_j]| \ge \lambda }|] \\ & = P[|P[X_i \mathbb {I}_{|P[X_i \mid \mathcal{F}_j]| \ge \lambda } \mid \mathcal{F}_j]|] \\ & \le P[P[|X_i|\mathbb {I}_{|P[X_i \mid \mathcal{F}_j]| \ge \lambda } \mid \mathcal{F}_j]] \\ & = P[|X_i|\mathbb {I}_{|P[X_i \mid \mathcal{F}_j]| \ge \lambda }] {\lt} \epsilon , \end{align*}

showing that \((P[X_i \mid \mathcal{F}_j])_{i \in \iota , j \in \kappa }\) is uniformly integrable.

Lemma 7.48

Let \(X\) be a martingale on a discrete index set and let \((\tau _k)_{k \in \mathbb {N}}\) be a sequence of stopping times that are uniformly bounded by \(n\). Then, the family of stopped values \(\{ X_{\tau _k}\} _{k \in \mathbb {N}}\) is uniformly integrable.

Proof

By optional sampling (Lemma 7.63), we have that for each \(k\), \(X_{\tau _k} = P[X_n \mid \mathcal{F}_{\tau _k}]\). Thus, the result follows by Lemma 7.47 as \(\{ X_n\} \) is uniformly integrable.

Lemma 7.49

Let \(X\) be a martingale and let \((\tau _k)_{k \in \mathbb {N}}\) be a sequence of stopping times that are uniformly bounded by \(n\). Then, the family of stopped values \(\{ X_{\tau _k}\} _{k \in \mathbb {N}}\) is uniformly integrable, provided that each \(\tau _k\) takes values in a countable set.

Proof

Same proof as in Lemma 7.48.

Lemma 7.50
#

Let \((X_t)_{t \in T}\) and \((Y_t)_{t \in T}\) be two families of uniformly integrable random variables. Then the family \((X_t + Y_t)_{t \in T}\) is uniformly integrable.

Proof

The families \(X\) and \(Y\) are uniformly integrable in the measure-theoretic sense and almost-everywhere strongly measurable, so \(X + Y\) is too (see MeasureTheory.UnifIntegrable.add). Moreover, \(X\) and \(Y\) are bounded in \(L^p\), so \(X + Y\) is too. So \(X + Y\) is uniformly integrable.

Lemma 7.51

Let \((X_s)_{s \in S}\) be a family of random variables and \((Y_t)_{t \in T}\) be a family of uniformly integrable random variables. If for all \(s\), there exists \(t\) such that \(\| X_s\| \le \| Y_t\| \) almost surely, then \(X\) is uniformly integrable.

Proof

Let \(\epsilon {\gt} 0\). The family \(Y\) is uniformly integrable, thus there exists \(C \ge 0\) such that for \(t \in T\), \(P[\| Y_t\| ^p \mathbb {I}_{\| Y_t\| \ge C}]^{1/p} \le \epsilon \). For all \(s\), there exists \(t\) such that \(\| X_s\| ^p \le \| Y_t\| ^p\), so \(P[\| X_s\| ^p \mathbb {I}_{\| X_s\| \ge C}]^{1/p} \le \epsilon \). Thus \(X\) is uniformly integrable.

Lemma 7.52

Let \((X_t)_{t \in T}\) be a family of random variables and \(Y\) be a real random variable in \(L^p\). If for all \(t\), \(\| X_t\| \le Y\) almost surely, then \(X\) is uniformly integrable.

Proof

Because \(Y\) is in \(L^p\), we deduce that \(\{ Y\} \) is uniformly integrable. The conclusion then follows from Lemma 7.51.

Lemma 7.53
#

If \((X_t)_{t \in T}\) is a family of uniformly integrable random variables, then so is \((\| X_t\| )_{t \in T}\).

Proof

Apply Lemma 7.51 with \(Y := X\).

Lemma 7.54
#

Let \((X_t)_{t \in T}\) be a family of random variables. It is uniformly integrable if and only if \((\| X_t\| )_{t \in T}\) is.

Proof

The forward direction is Lemma 7.53. The converse direction follows from Lemma 7.51 with \(Y := (\| X_t\| )_{t \in T}\).

Lemma 7.55
#

If \((X_t)_{t \in T}\) is uniformly integrable and \(\phi : S \to T\), then \((X_{\phi (s)})_{s \in S}\) is uniformly integrable.

Proof

This is immediate from the definition.

Lemma 7.56

Let \(X\) be a submartingale on a discrete index set and let \((\tau _k)_{k \in \mathbb {N}}\) be a sequence of stopping times that are uniformly bounded by \(p\). Then, the family of stopped values \(\{ X_{\tau _k}\} _{k \in \mathbb {N}}\) is uniformly integrable.

Proof

Use Doob decomposition to write \(X_n = M_n + A_n\), where \(M\) (Definition 12.5) is a martingale (Lemma 12.7) and \(A\) (Definition 12.1) is a predictable process (Lemma 12.6). We know from Lemma 7.48 that \((M_{\tau _k})_{k \in \mathbb {N}}\) is uniformly integrable. Combining Lemma 7.50 and Lemma 7.52, it suffices to show that \((A_{\tau _k})_{k \in \mathbb {N}}\) is dominated. It is dominated by \(A_p\) thanks to Lemma 12.2 and Lemma 12.8.

Lemma 7.57

Let \((X_n)_{n \in \mathbb {N}}\) be a \(p\)-uniformly integrable sequence of random variables and suppose \(X_n \to X\) in probability as \(n \to \infty \). Then, \(X\) is in \(L^p\).

Proof

Since \(X_n \to X\) in probability, it has a subsequence \((X_{n_k}) \subseteq (X_n)\) which converges to \(X\) almost surely. Thus, we have by Fatou’s lemma that

\[ P[|X|^p] = P[\liminf _{k \to \infty } |X_{n_k}|^p] \le \liminf _{k \to \infty } P[|X_{n_k}|^p] {\lt} \infty \]

where the last inequality follows as uniform integrability implies that \((X_n)\) is uniformly bounded in \(L^p\).

Lemma 7.58

Let \((X_t)_{t \in T}\) be a family of \(p\)-uniformly integrable random variables. Then the family of limits in probability of sequences of \(X\) is uniformly integrable.

Proof

Let \(\epsilon {\gt} 0\). There exists \(\delta {\gt} 0\) such that for all \(t\in T\) and all measurable set \(S\) such that \(P(S){\lt}\delta \),

\[ P[\| X_t\| ^p\mathbb {I}_S]^{1/p}\le \epsilon . \]

Let \((t_n)_{n \in \mathbb {N}}\) be a sequence in \(T\) such that \(X_{t_n}\) converges in probability to \(Y\). Then it has a subsequence \((X_{t_{n_k}})\) which converges to \(Y\) almost surely. Thus, we have by Fatou’s lemma that

\[ P[\| Y\| ^p \mathbb {I}_{S}]^{1/p} = P[\liminf _{k \to \infty } \| X_{t_{n_k}}\| ^p \mathbb {I}_{S}]^{1/p} \le \liminf _{k \to \infty } P[\| X_{t_{n_k}}\| ^p \mathbb {I}_{S}]^{1/p} \le \epsilon . \]

This proves that the family of limits in probability of sequences of \(X\) is uniformly integrable in the measure theory sense. One can prove uniform boundedness of this family by using Fatou’s lemma and the existence of an almost everywhere convergent subsequence in a similar way.

Lemma 7.59 Vitali convergence theorem

A sequence of functions converges in \(L^1\) to a limit if and only if it converges in probability to that limit and is uniformly integrable.

Proof

7.5 Optional sampling

Definition 7.60 Ordered Monoid
#

Let \((M, +)\) be a commutative monoid that is also a partial order. It is said to be an ordered monoid if for all \(a, b, c \in M\), we have the following implication:

\[ a \le b \implies a + c \le b + c. \]
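
In Lean this could be rendered as a class combining the algebraic and order structures; the sketch below is illustrative (Mathlib's exact class hierarchy for ordered monoids has evolved over time, so this is not claimed to be the current Mathlib definition).

```lean
import Mathlib.Algebra.Order.Monoid.Defs

/-- Illustrative rendering of an ordered additive commutative monoid: addition
is compatible with the partial order. -/
class OrderedAddCommMonoid' (M : Type*) extends AddCommMonoid M, PartialOrder M where
  /-- Adding a fixed element preserves the order. -/
  add_le_add_left : ∀ a b : M, a ≤ b → ∀ c : M, c + a ≤ c + b
```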
Definition 7.61 Ordered Module
#

Let \(\alpha , \beta \) be preorders with \(0\) elements and such that there is a scalar multiplication \((\_ \cdot \_ ) : \alpha \times \beta \to \beta \). Then \(\beta \) is said to be an ordered \(\alpha \)-module (or ordered module if \(\alpha \) is clear from the context) if the following hold:

  • \(\forall a \in \alpha , \forall b_1, b_2 \in \beta , 0 \le a \implies b_1 \le b_2 \implies a \cdot b_1 \le a \cdot b_2\);

  • \(\forall a_1, a_2 \in \alpha , \forall b \in \beta , 0 \le b \implies a_1 \le a_2 \implies a_1 \cdot b \le a_2 \cdot b\).

Definition 7.62 Order-closed topology
#

Let \(X\) be a topological space that is also a preorder. The space \(X\) is said to be order-closed, or to have order-closed topology, if the set \(\{ (x, y) \in X \times X \mid x \le y\} \) is closed.
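
Mathlib captures this as the class OrderClosedTopology; approximately:

```lean
import Mathlib.Topology.Order.OrderClosed

/-- Approximate restatement of Mathlib's `OrderClosedTopology`: the graph of the
order relation is closed in the product topology. -/
class OrderClosedTopology' (α : Type*) [TopologicalSpace α] [Preorder α] : Prop where
  /-- The set of pairs `(x, y)` with `x ≤ y` is closed. -/
  isClosed_le' : IsClosed {p : α × α | p.1 ≤ p.2}
```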

Lemma 7.63 Optional sampling (discrete time)

Let \(X\) be a discrete time martingale with respect to the filtration \(\mathcal{F}\) and let \(\tau , \sigma \) be stopping times. Then, if \(\tau \) is bounded, we have that almost surely, \(X_{\tau \wedge \sigma } = P[X_{\tau } \mid \mathcal{F}_{\sigma }]\).

Proof

Lemma 7.64

Let \(X\) be a discrete time submartingale with respect to the filtration \(\mathcal{F}\) taking values in a real Banach space \(E\). Assume \(E\) is an order-closed partial order, an ordered monoid and an ordered module. Let \(\tau , \sigma \) be stopping times. Then, if \(\tau \) is bounded, we have that almost surely, \(X_{\tau \wedge \sigma } \le P[X_{\tau } \mid \mathcal{F}_{\sigma }]\).

Proof

Use Doob decomposition to write \(X_n = M_n + A_n\), where \(M\) (Definition 12.5) is a martingale (Lemma 12.7) and \(A\) (Definition 12.1) is a predictable process (Lemma 12.6). By Lemma 7.63, we have that almost surely, \(M_{\tau \wedge \sigma } = P[M_{\tau } \mid \mathcal{F}_{\sigma }]\). Because \(A\) is predictable and \(\tau \wedge \sigma \le \sigma \), we deduce that almost surely, \(A_{\tau \wedge \sigma } = P[A_{\tau \wedge \sigma } \mid \mathcal{F}_\sigma ]\). Moreover, by Lemma 12.8, we know that almost surely, \(A\) is nondecreasing. Therefore, using the fact \(\tau \wedge \sigma \le \tau \), we get that \(P[A_{\tau \wedge \sigma } \mid \mathcal{F}_\sigma ] \le P[A_\tau \mid \mathcal{F}_\sigma ]\). We deduce that almost surely,

\[ X_{\tau \wedge \sigma } = M_{\tau \wedge \sigma } + A_{\tau \wedge \sigma } \le P[M_\tau \mid \mathcal{F}_\sigma ] + P[A_\tau \mid \mathcal{F}_\sigma ] = P[X_\tau \mid \mathcal{F}_\sigma ], \]

concluding the proof.

Lemma 7.65

Let \(X\) be a discrete time supermartingale with respect to the filtration \(\mathcal{F}\) taking values in a real Banach space \(E\). Assume \(E\) is an order-closed partial order, an ordered monoid and an ordered module. Let \(\tau , \sigma \) be stopping times. Then, if \(\tau \) is bounded, we have that almost surely, \(X_{\tau \wedge \sigma } \ge P[X_{\tau } \mid \mathcal{F}_{\sigma }]\).

Proof

We know that \(-X\) is a submartingale, so from Lemma 7.64 we obtain that almost surely, \(-X_{\tau \wedge \sigma } \le P[-X_{\tau } \mid \mathcal{F}_{\sigma }]\). Multiplying by \(-1\) yields the desired result.

Definition 7.66 Discrete approximation sequence
#

Given a stopping time \(\tau : \Omega \to T \cup \{ \infty \} \), a sequence of stopping times \((\tau _n)_{n \in \mathbb {N}}\) is called a discrete approximation of \(\tau \) if \(\tau _n(\Omega )\) is countable for each \(n\) and \(\tau _n \downarrow \tau \) a.s. as \(n \to \infty \).

Definition 7.67 Approximable time index

A time index set \(T\) is said to be approximable if for any stopping time \(\tau : \Omega \to T \cup \{ \infty \} \), there exists a discrete approximation sequence \((\tau _n)\) of \(\tau \).

Lemma 7.68

Given a right continuous process \(X\) and a discrete approximation sequence \((\tau _n)\) of the stopping time \(\tau \), we have that

\[ \lim _{n \to \infty } X_{\tau _n} = X_\tau \text{ a.s.} \]
Proof

This follows directly as \(X\) is right continuous and \(\tau _n \downarrow \tau \) a.s.

Lemma 7.69

Let \(\tau \) be a stopping time bounded by \(t \in T\) and \((\tau _n)\) be a discrete approximation sequence of \(\tau \). Then, the sequence of stopping times \(\tau _n \wedge t\) is also a discrete approximation sequence of \(\tau \).

Proof

Lemma 7.70

Let \(\tau \) be a stopping time bounded by \(t \in T\) and \((\tau _n)\) be a discrete approximation sequence of \(\tau \). Then, for any martingale \(X\), the sequence of stopped values \((X_{\tau _n \wedge t})\) is uniformly integrable.

Proof

Follows directly by Lemma 7.49 and Lemma 7.69.

Lemma 7.71

Let \(\tau \) be a stopping time bounded by \(t \in T\) and \((\tau _n)\) be a discrete approximation sequence of \(\tau \). Then, for any right continuous martingale \(X\), \(X_{\tau } \in L^1\) and \(X_{\tau _n \wedge t} \to X_{\tau }\) in \(L^1\) as \(n \to \infty \).

Proof

By Lemma 7.68, as \(X\) is right continuous we have that \(X_{\tau _n \wedge t} \to X_{\tau }\) a.s. and so, also in probability. Moreover, by Lemma 7.70, the sequence \((X_{\tau _n \wedge t})\) is uniformly integrable. Thus, by Lemma 7.57 and the Vitali convergence theorem (Lemma 7.59), it follows that \(X_{\tau } \in L^1\) and \(X_{\tau _n \wedge t} \to X_{\tau }\) in \(L^1\) as \(n \to \infty \).

Lemma 7.72
#

\(T = \mathbb {R}_+\) is an approximable time index. In particular, for any stopping time \(\tau \) on \(\overline{\mathbb {R}_+}\), defining \(\tau _n = 2^{-n} \lceil 2^n \tau \rceil \), we have that \((\tau _n)\) is a discrete approximation sequence of \(\tau \).

Proof

Clearly \(\tau _n \downarrow \tau \) as \(n \to \infty \) and so it remains to show that each \(\tau _n\) is a stopping time. Indeed,

\[ \{ \tau _n \le t\} = \{ \tau \le 2^{-n} \lfloor 2^n t\rfloor \} \in \mathcal{F}_{2^{-n} \lfloor 2^n t\rfloor } \subseteq \mathcal{F}_t \]

where the last inclusion follows as \(2^{-n} \lfloor 2^n t\rfloor \le t\).
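The set identity used above can be checked step by step; each equivalence uses only the integer characterizations \(\lceil x \rceil \le m \iff x \le m\) and \(m \le y \iff m \le \lfloor y \rfloor \) for \(m \in \mathbb {Z}\):

\begin{align*} \tau _n \le t & \iff \lceil 2^n \tau \rceil \le 2^n t \iff \lceil 2^n \tau \rceil \le \lfloor 2^n t\rfloor \\ & \iff 2^n \tau \le \lfloor 2^n t\rfloor \iff \tau \le 2^{-n} \lfloor 2^n t\rfloor \: . \end{align*}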

Lemma 7.73
#

\(T = \mathbb {N}\) is an approximable time index.

Proof

Immediate as we can take \(\tau _n = \tau \) for all \(n\).

Theorem 7.74 Optional sampling

Let \(X\) be a right-continuous \(\mathcal{F}\)-martingale on an approximable time index. Then, for any stopping times \(\sigma , \tau \) with \(\tau \) bounded, we have that \(X_{\sigma \wedge \tau } = P[X_{\tau } \mid \mathcal{F}_{\sigma }]\) almost surely.

Proof

Fixing \(A \in \mathcal{F}_{\sigma }\), since \(X_{\sigma \wedge \tau }\) is \(\mathcal{F}_{\sigma }\)-measurable it suffices to show that \(P[X_{\tau } \mathbb {I}_A] = P[X_{\sigma \wedge \tau } \mathbb {I}_A]\).

Let \((\tau _n), (\sigma _n)\) be discrete approximation sequences of \(\tau \) and \(\sigma \) respectively. As \(\tau _n, \sigma _n\) take values in a countable set, we have by the discrete time optional sampling theorem (Lemma 7.63) that

\[ X_{\sigma _n \wedge \tau _n} = P[X_{\tau _n} \mid \mathcal{F}_{\sigma _n}] \]

and so, as \(\mathcal{F}_{\sigma } \subseteq \mathcal{F}_{\sigma _n}\) by Lemma 7.18, we have that \(P[X_{\sigma _n \wedge \tau _n} \mathbb {I}_A] = P[X_{\tau _n} \mathbb {I}_A]\). On the other hand, by Lemma 7.48, the families \(\{ X_{\tau _n}\} \) and \(\{ X_{\sigma _n \wedge \tau _n}\} \) are uniformly integrable. Moreover, as \(X\) is right-continuous, \((X_{\sigma _n \wedge \tau _n}, X_{\tau _n}) \to (X_{\sigma \wedge \tau }, X_{\tau })\) a.s. Hence, by Lemma 7.71, \(P[X_{\tau } \mathbb {I}_A] = P[X_{\sigma \wedge \tau } \mathbb {I}_A]\) as desired.
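In more detail, the last step passes to the limit in the identity obtained for each \(n\): uniform integrability upgrades the a.s. convergence to \(L^1\) convergence, so

\begin{align*} P[X_{\sigma \wedge \tau } \mathbb {I}_A] = \lim _{n \to \infty } P[X_{\sigma _n \wedge \tau _n} \mathbb {I}_A] = \lim _{n \to \infty } P[X_{\tau _n} \mathbb {I}_A] = P[X_{\tau } \mathbb {I}_A] \: . \end{align*}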

Theorem 7.75 Optional sampling for submartingales

Let \(X\) be a right-continuous \(\mathcal{F}\)-submartingale on an approximable time index. Then, for any stopping times \(\sigma , \tau \) with \(\tau \) bounded, we have that \(X_{\sigma \wedge \tau } \le P[X_{\tau } \mid \mathcal{F}_{\sigma }]\) almost surely.

Proof

Fixing \(A \in \mathcal{F}_{\sigma }\), since \(X_{\sigma \wedge \tau }\) is \(\mathcal{F}_{\sigma }\)-measurable it suffices to show that \(P[X_{\sigma \wedge \tau } \mathbb {I}_A] \le P[X_{\tau } \mathbb {I}_A]\).

Let \((\tau _n), (\sigma _n)\) be discrete approximation sequences of \(\tau \) and \(\sigma \) respectively. As \(\tau _n, \sigma _n\) take values in a countable set, we have by the discrete time optional sampling theorem (Lemma 7.64) that

\[ X_{\sigma _n \wedge \tau _n} \le P[X_{\tau _n} \mid \mathcal{F}_{\sigma _n}] \]

and so, as \(\mathcal{F}_{\sigma } \subseteq \mathcal{F}_{\sigma _n}\) by Lemma 7.18, we have that \(P[X_{\sigma _n \wedge \tau _n} \mathbb {I}_A] \le P[X_{\tau _n} \mathbb {I}_A]\). On the other hand, by Lemma 7.56, the families \(\{ X_{\tau _n}\} \) and \(\{ X_{\sigma _n \wedge \tau _n}\} \) are uniformly integrable. Moreover, as \(X\) is right-continuous, \((X_{\sigma _n \wedge \tau _n}, X_{\tau _n}) \to (X_{\sigma \wedge \tau }, X_{\tau })\) a.s. Hence, by Lemma 7.71, \(P[X_{\sigma \wedge \tau } \mathbb {I}_A] \le P[X_{\tau } \mathbb {I}_A]\) as desired.

7.6 Martingale convergence

Definition 7.76
#

Let \(X : T \to \Omega \to E\) be a stochastic process, let \(\mathcal{F}\) be a filtration on \(\Omega \) indexed by \(T\) and let \(P\) be a measure on \(\Omega \). If there exists a function \(Y : \Omega \to E\) which is measurable with respect to \(\mathcal{F}_\infty \) such that, \(P\)-almost surely, \(X_t\) converges to \(Y\) as \(t\) goes to infinity, then we say that \(Y\) is the limit of \(X\). We denote it by \(X_\infty \).

In Mathlib, we have results about convergence of martingales to their limit in discrete time.

Theorem 7.77

Let \(X\) be a uniformly integrable cadlag martingale with respect to the filtration \(\mathcal{F}\). Then there exists a limit process \(X_\infty \) measurable with respect to \(\mathcal{F}_\infty \) such that \(X_t\) converges to \(X_\infty \) almost surely as \(t\) goes to infinity. Furthermore, \(X_t = P[X_\infty \mid \mathcal{F}_t]\) almost surely.

Proof

7.7 Doob’s Lp inequality

In this section, we prove Doob’s Lp inequality.

Lemma 7.78 Doob’s maximal inequality for \(\mathbb {N}\)
#

Let \(X : \mathbb {N} \rightarrow \Omega \rightarrow \mathbb {R}\) be a non-negative sub-martingale. Then for every \(n \in \mathbb {N}\) and \(\lambda {\gt} 0\),

\begin{align*} \mathbb {P}\left(\sup _{i \le n}X_i\geq \lambda \right) \le \frac{\mathbb {E}\left[X_n \mathbb {I}_{\sup _{i \le n}X_i \ge \lambda }\right]}{\lambda } \le \frac{\mathbb {E}[X_n]}{\lambda } \: . \end{align*}
Proof
Lemma 7.79 Doob’s maximal inequality for countable index sets

Let \(X : I \rightarrow \Omega \rightarrow \mathbb {R}\) be a non-negative sub-martingale with \(I\) countable. Then for every \(M \in I\) and \(\lambda {\gt} 0\) we have

\begin{align*} P\left( \sup _{i\in I, i\leq M}X_i\geq \lambda \right) \le \frac{\mathbb {E}\left[X_M \mathbb {I}_{\sup _{i \in I, i \le M}X_i \ge \lambda }\right]}{\lambda } \le \frac{\mathbb {E}[X_M]}{\lambda } \: . \end{align*}
Proof

For any finite subset \(J \subset I\) with \(M \in J\), we have by Lemma 7.78

\begin{align*} P\left( \sup _{i\in J, i \le M}X_i\geq \lambda \right) \le \frac{\mathbb {E}\left[X_{M} \mathbb {I}_{\sup _{i \in J, i \le M}X_i \ge \lambda }\right]}{\lambda } \: . \end{align*}

Then we build an increasing sequence of finite sets \(J_n\) containing \(M\) with \(\sup _{i\in I, i\leq M}X_i = \sup _n\sup _{i\in J_n, i \le M}X_i\) and conclude by monotone convergence.

Lemma 7.80 Doob’s Lp inequality for countable index sets

Let \(X : I \rightarrow \Omega \rightarrow \mathbb {R}\) be a non-negative sub-martingale. Let \(I\) be countable. For every \(M\in I\) and \(p{\gt}1\) we have

\begin{align*} \mathbb {E}\left[ \sup _{i\in I, i \leq M}X_i^p \right] \leq \left(\frac{p}{p-1}\right)^p\mathbb {E}[X_M^p] \: . \end{align*}

That is, for \(\Vert \cdot \Vert _p\) the \(L^p\) norm, \(\left\Vert \sup _{i \in I, i \le M} X_i \right\Vert _p \leq \frac{p}{p-1} \left\Vert X_M \right\Vert _p \: .\)

Proof
\begin{align*} \mathbb {E}\left[ \sup _{i \le M}X_i^p \right] = p \int _0^\infty \mathbb {P}\left( \sup _{i \le M}X_i \geq \lambda \right) \lambda ^{p-1} d\lambda \end{align*}
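This identity is the layer-cake formula: writing \(S = \sup _{i \le M}X_i\) and using \(S^p = p \int _0^S \lambda ^{p-1} \, d\lambda \) together with Fubini’s theorem,

\begin{align*} \mathbb {E}[S^p] = \mathbb {E}\left[ p \int _0^\infty \mathbb {I}_{S \ge \lambda } \, \lambda ^{p-1} \, d\lambda \right] = p \int _0^\infty \mathbb {P}(S \ge \lambda ) \, \lambda ^{p-1} \, d\lambda \: . \end{align*}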

By Lemma 7.79 and then Fubini’s theorem, we then have

\begin{align*} \mathbb {E}\left[ \sup _{i \le M}X_i^p \right] & \le p \int _0^\infty \mathbb {E}\left[X_M \mathbb {I}_{\sup _{i \le M}X_i \ge \lambda }\right] \lambda ^{p-2} d\lambda \\ & = p \mathbb {E}\left[X_M \int _0^{\sup _{i \le M}X_i} \lambda ^{p-2} d\lambda \right] \\ & = \frac{p}{p - 1} \mathbb {E}\left[X_M (\sup _{i \le M}X_i)^{p-1}\right] \: . \end{align*}

Then by Hölder’s inequality,

\begin{align*} \mathbb {E}\left[ \sup _{i \le M}X_i^p \right] & \le \frac{p}{p - 1} \left(\mathbb {E}[X_M^p]\right)^{1/p} \left(\mathbb {E}\left[\sup _{i \le M}X_i^p \right]\right)^{(p-1)/p} \: . \end{align*}

We then divide both sides by \(\left(\mathbb {E}\left[\sup _{i \le M}X_i^p \right]\right)^{(p-1)/p}\) and raise to the power \(p\) to conclude.
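Explicitly, writing \(S = \sup _{i \le M}X_i\) and assuming \(\mathbb {E}[S^p] \in (0, \infty )\) (the case \(\mathbb {E}[S^p] = \infty \) is handled by a standard truncation argument), the previous inequality rearranges to

\begin{align*} \left( \mathbb {E}[S^p] \right)^{1/p} = \frac{\mathbb {E}[S^p]}{\left( \mathbb {E}[S^p] \right)^{(p-1)/p}} \le \frac{p}{p-1} \left( \mathbb {E}[X_M^p] \right)^{1/p} \: , \end{align*}

and raising to the power \(p\) gives \(\mathbb {E}[S^p] \le \left( \frac{p}{p-1} \right)^p \mathbb {E}[X_M^p]\).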

Theorem 7.81 Doob’s maximal inequality

Let \(X: \mathbb {R}_+ \to \Omega \to \mathbb {R}\) be a right-continuous non-negative sub-martingale. For every \(T \in \mathbb {R}_+\) and \(\lambda {\gt}0\) we have

\begin{align*} P\left( \sup _{t\in [0,T]}X_t \geq \lambda \right) \leq \frac{\mathbb {E}[X_T \mathbb {I}_{\sup _{t \in [0,T]}X_t \ge \lambda }]}{\lambda } \leq \frac{\mathbb {E}[X_T]}{\lambda } \: . \end{align*}
Proof

Since \(X\) is right-continuous, the supremum over \([0,T]\) is already attained along the countable set \(([0,T] \cap \mathbb {Q}) \cup \{ T\} \) (the endpoint must be adjoined since \(T\) need not be rational):

\begin{align*} \sup _{t\in [0,T]}X_t = \sup _{t\in ([0,T] \cap \mathbb {Q}) \cup \{ T\} }X_t \: . \end{align*}

Then apply Lemma 7.79 with \(I = ([0,T] \cap \mathbb {Q}) \cup \{ T\} \) and \(M = T\).

Corollary 7.82 Doob’s maximal inequality for normed spaces

Let \(X:\mathbb {R}_+ \to \Omega \to E\) be a right-continuous martingale with values in a normed space \(E\). For every \(T \in \mathbb {R}_+\) and \(\lambda {\gt}0\) we have

\[ P\left( \sup _{t\in [0,T]} \lVert X_t \rVert \geq \lambda \right) \leq \frac{\mathbb {E}[\lVert X_T \rVert ]}{\lambda }. \]
Proof

By Corollary 7.14, \(\lVert X \rVert \) is a sub-martingale. Then apply Theorem 7.81.
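The first step unfolds as follows: since the norm is convex, the conditional Jensen inequality gives, for \(s \le t\),

\begin{align*} \lVert X_s \rVert = \lVert P[X_t \mid \mathcal{F}_s] \rVert \le P[\lVert X_t \rVert \mid \mathcal{F}_s] \quad \text{a.s.}, \end{align*}

so \(\lVert X \rVert \) is a non-negative right-continuous sub-martingale, to which Theorem 7.81 applies.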

Theorem 7.83 Doob’s Lp inequality in \(\mathbb {R}\)

Let \(X:\mathbb {R} \rightarrow \Omega \rightarrow \mathbb {R}\) be a right-continuous non-negative sub-martingale. For every \(T{\gt}0\) and \(p{\gt}1\) we have

\begin{align*} \mathbb {E}\left[ \sup _{t\in [0,T]}X_t^p \right] \leq \left(\frac{p}{p-1}\right)^p\mathbb {E}[X_T^p] \: . \end{align*}

That is, for \(\Vert \cdot \Vert _p\) the \(L^p\) norm, \(\left\Vert \sup _{t\in [0,T]} X_t \right\Vert _p \leq \frac{p}{p-1} \left\Vert X_T \right\Vert _p \: .\)

Proof

Since \(X\) is right-continuous, the supremum over \([0,T]\) is already attained along the countable set \(([0,T] \cap \mathbb {Q}) \cup \{ T\} \) (the endpoint must be adjoined since \(T\) need not be rational):

\begin{align*} \sup _{t\in [0,T]}X_t = \sup _{t\in ([0,T] \cap \mathbb {Q}) \cup \{ T\} }X_t \: . \end{align*}

Then apply Lemma 7.80 with \(I = ([0,T] \cap \mathbb {Q}) \cup \{ T\} \) and \(M = T\).

Corollary 7.84 Doob’s Lp inequality for normed spaces

Let \(X : \mathbb {R} \rightarrow \Omega \rightarrow E\) be a right-continuous martingale with values in a normed space \(E\). For every \(T{\gt}0\) and \(p{\gt}1\) we have

\begin{align*} \mathbb {E}\left[ \sup _{t\in [0,T]} \lVert X_t \rVert ^p \right] \leq \left(\frac{p}{p-1}\right)^p\mathbb {E}[\lVert X_T \rVert ^p] \: . \end{align*}

That is, for \(\Vert \cdot \Vert _p\) the \(L^p\) norm, \(\left\Vert \sup _{t\in [0,T]} \lVert X_t \rVert \right\Vert _p \leq \frac{p}{p-1} \left\Vert X_T \right\Vert _p \: .\)

Proof

By Corollary 7.14, \(\lVert X \rVert \) is a sub-martingale. Then apply Theorem 7.83.

Lemma 7.85 Stopped Submartingale

Let \(X:\mathbb {R}_+ \to \Omega \to \mathbb {R}\) be a cadlag submartingale and \(\tau \) a stopping time. Then the stopped process \(X^\tau \) is a submartingale.

Proof
Lemma 7.86 Doob’s maximal inequality for stopping times

Let \(X:\mathbb {R}_+ \to \Omega \rightarrow \mathbb {R}\) be a right-continuous non-negative sub-martingale. For every \(\lambda {\gt}0\) and every stopping time \(\tau \) a.s. bounded by \(T{\gt}0\), we have

\[ P\left( \sup _{t\in [0,\tau ]}X_t\geq \lambda \right)\leq \frac{\mathbb {E}[X_\tau ]}{\lambda }. \]
Proof
Corollary 7.87 Doob’s maximal inequality for stopping times in normed spaces

Let \(X:\mathbb {R}_+ \to \Omega \rightarrow E\) be a right-continuous martingale with values in a normed space \(E\). For every \(\lambda {\gt}0\) and every stopping time \(\tau \) a.s. bounded by \(T{\gt}0\), we have

\[ P\left( \sup _{t\in [0,\tau ]}\lVert X_t \rVert \geq \lambda \right)\leq \frac{\mathbb {E}[\lVert X_\tau \rVert ]}{\lambda }. \]
Proof

By Corollary 7.14, \(\lVert X \rVert \) is a sub-martingale. Then apply Lemma 7.86.

Lemma 7.88 Doob’s Lp Inequality for stopping times

Let \(X:\mathbb {R}_+ \to \Omega \rightarrow \mathbb {R}\) be a right-continuous non-negative sub-martingale. For every \(p{\gt}1\) and every stopping time \(\tau \) a.s. bounded by \(T{\gt}0\), we have

\[ \mathbb {E}\left[ \sup _{t\in [0,\tau ]}X_t^p \right]\leq \left(\frac{p}{p-1}\right)^p\mathbb {E}[X_\tau ^p]. \]
Proof

See Pascucci, Section 8.1.3.
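One possible argument from what precedes: by Lemma 7.85 the stopped process \(X^\tau \) is again a right-continuous non-negative sub-martingale, and since \(\tau \le T\) almost surely,

\begin{align*} \sup _{t\in [0,\tau ]}X_t = \sup _{t\in [0,T]}X^\tau _t \quad \text{and} \quad X^\tau _T = X_\tau \quad \text{a.s.}, \end{align*}

so applying Theorem 7.83 to \(X^\tau \) at time \(T\) yields the claim.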

Corollary 7.89 Doob’s Lp Inequality for stopping times in normed spaces

Let \(X:\mathbb {R}_+ \to \Omega \rightarrow E\) be a right-continuous martingale with values in a normed space \(E\). For every \(p{\gt}1\) and every stopping time \(\tau \) a.s. bounded by \(T{\gt}0\), we have

\[ \mathbb {E}\left[ \sup _{t\in [0,\tau ]}\lVert X_t \rVert ^p \right]\leq \left(\frac{p}{p-1}\right)^p\mathbb {E}[\lVert X_\tau \rVert ^p]. \]
Proof

By Corollary 7.14, \(\lVert X \rVert \) is a sub-martingale. Then apply Lemma 7.88.