Formalization of a Brownian motion and of stochastic integrals in Lean

11 Modifications with cadlag paths

11.1 Preliminaries

TODO: define cadlag.
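Pending the formal definition, one plausible Lean sketch is: right-continuity at every point together with existence of left limits. The name `IsCadlag` and the exact typeclass assumptions below are illustrative, not fixed.

```lean
import Mathlib

/-- Sketch of a possible cadlag (RCLL) definition: `f` is right-continuous at
every point and has a left limit at every point. Name and assumptions are
tentative. -/
def IsCadlag {T E : Type*} [LinearOrder T] [TopologicalSpace T] [OrderTopology T]
    [TopologicalSpace E] (f : T → E) : Prop :=
  (∀ t, ContinuousWithinAt f (Set.Ici t) t) ∧
    ∀ t, ∃ c, Filter.Tendsto f (nhdsWithin t (Set.Iio t)) (nhds c)
```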

11.2 Cadlag modifications of martingales

11.2.1 Upcrossings

Lemma 11.1

Let \(u : \beta \to \alpha \), for \(\alpha \) a densely ordered, conditionally complete linear order equipped with the order topology. Let \(S\) be a dense subset of \(\alpha \) and \(f\) a filter of \(\beta \). If for all \(a {\lt} b\) in \(S\), the number of upcrossings of the interval \([a, b]\) by \(u\) along \(f\) is finite, then \(u\) tends to a limit along \(f\).
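A possible Lean rendering of this statement. Here `upcrossingsAlong` is a hypothetical ℕ∞-valued count of upcrossings of \([a, b]\) by \(u\) along the filter \(f\); Mathlib's `MeasureTheory.upcrossings` is close in spirit but indexed by ℕ rather than a general filter.

```lean
import Mathlib

open Filter

-- `upcrossingsAlong` is a hypothetical placeholder, passed as an argument
-- so the sketch is self-contained; it is not a Mathlib declaration.
theorem tendsto_of_upcrossings_lt_top
    {α β : Type*} [ConditionallyCompleteLinearOrder α] [DenselyOrdered α]
    [TopologicalSpace α] [OrderTopology α]
    (upcrossingsAlong : α → α → (β → α) → Filter β → ℕ∞)
    (u : β → α) {S : Set α} (hS : Dense S) (f : Filter β) [f.NeBot]
    (h : ∀ a ∈ S, ∀ b ∈ S, a < b → upcrossingsAlong a b u f ≠ ⊤) :
    ∃ c, Tendsto u f (nhds c) :=
  sorry
```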

Proof

Lemma 11.2

Let \(X : T \to \Omega \to \mathbb {R}\) be a stochastic process on a finite time domain \(T\). Then for all \(a {\lt} b\) in \(\mathbb {R}\) and \(t \in T\), there exists an elementary predictable set \(A\) such that the number of upcrossings \(U_t[a, b]\) of the interval \([a, b]\) before time \(t\) satisfies

\begin{align*} (b - a) U_t[a, b] \le (\mathbb {1}_A \bullet X)_t + \max \{ a - X_t, 0\} \: . \end{align*}
Proof

See this chapter of [Low].

In the proof, note that \(A\) can be written as a finite disjoint union of sets of the form \(\{ (t, \omega ) \mid \sigma (\omega ) {\lt} t \le \tau (\omega )\} \) for stopping times \(\sigma , \tau \). To see that each is an elementary predictable set, note that if \(T = \{ t_0, \ldots , t_n\} \) where \(t_0 {\lt} \cdots {\lt} t_n\), then it can be written as a finite disjoint union of sets

\begin{align*} \{ (t, \omega ) \mid \sigma (\omega ) {\lt} t \le \tau (\omega )\} & = \bigcup _{k=1}^{n} \{ t_k\} \times \{ \sigma {\lt} t_k \le \tau \} \\ & = \bigcup _{k=1}^{n} (t_{k-1}, t_k] \times \{ \sigma \le t_{k-1} {\lt} \tau \} \end{align*}

where \(\{ \sigma \le t_{k-1} {\lt} \tau \} \in \mathcal{F}_{t_{k-1}}\). Therefore \(A\) is an elementary predictable set.

11.2.2 Cadlag modifications

Theorem 11.3

Let \(X : T \to \Omega \to \mathbb {R}\) be an adapted stochastic process such that \(X\) is integrable and for every \(t \in T\) the set \(\{ \mathbb {E}[(\mathbb {1}_A \bullet X)_t] \mid A \text{ elementary predictable}\} \) is bounded. Then \(X\) has a modification \(Y\) which has left and right limits everywhere and such that there is a countable set \(S \subseteq T\) for which \(Y\) is right-continuous on \(T \setminus S\).

Proof

Let \(D\) be a countable dense subset of \(T\) (e.g. the rationals in \(T\)), let \(t \in T\) and let \(F\) be a finite subset of \(D \cap [0, t]\) (we write \(F\) rather than \(S\), reserving \(S\) for the exceptional set below). By Lemma 11.2, for all \(a {\lt} b\) in \(\mathbb {R}\), the number of upcrossings \(U_F[a, b]\) of the interval \([a, b]\) along \(F\) satisfies

\begin{align*} (b - a) U_F[a, b] \le (\mathbb {1}_A \bullet X)_t + \max \{ a - X_t, 0\} \end{align*}

for some elementary predictable set \(A\). Taking expectations and using the boundedness hypothesis, we obtain

\begin{align*} (b - a) \mathbb {E}[U_F[a, b]] \le L + \mathbb {E}[\max \{ a - X_t, 0\} ] \le L + |a| + \mathbb {E}[|X_t|] \end{align*}

where \(L\) is a bound for \(\{ \mathbb {E}[(\mathbb {1}_A \bullet X)_t] \mid A \text{ elementary predictable}\} \).

Now, let \(F_1 \subseteq F_2 \subseteq \cdots \) be an increasing sequence of finite subsets of \(D \cap [0, t]\) with \(\bigcup _n F_n = D \cap [0, t]\). The number of upcrossings \(U_{F_n}[a, b]\) is increasing in \(n\). By monotone convergence,

\begin{align*} (b - a) \mathbb {E}[U_{D \cap [0, t]}[a, b]] = \lim _{n \to \infty } (b - a) \mathbb {E}[U_{F_n}[a, b]] \le L + |a| + \mathbb {E}[|X_t|] {\lt} \infty . \end{align*}

In particular, \(U_{D \cap [0, t]}[a, b] {\lt} \infty \) a.s. for each pair \(a {\lt} b\). Since \(\mathbb {Q}^2\) is countable, we conclude that almost surely, for all rational \(a {\lt} b\), the number of upcrossings of \([a, b]\) by \(X\) along \(D \cap [0, t]\) is finite.
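In detail, this countability step is the countable union bound:

```latex
\begin{align*}
\mathbb{P}\Bigg( \bigcup_{\substack{a, b \in \mathbb{Q} \\ a < b}}
    \{ U_{D \cap [0, t]}[a, b] = \infty \} \Bigg)
  \le \sum_{\substack{a, b \in \mathbb{Q} \\ a < b}}
    \mathbb{P}\left( U_{D \cap [0, t]}[a, b] = \infty \right) = 0 \: ,
\end{align*}
```

since each summand vanishes by the expectation bound above.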

By Lemma 11.1, the finiteness of the upcrossings of all rational intervals implies that the left and right limits of \(s \mapsto X_s(\omega )\) along \(D\) exist at every \(t \in T\), for almost every \(\omega \). Define

\[ \tilde{Y}_t(\omega ) = \lim _{\substack {s \downarrow t \\ s \in D}} X_s(\omega ) \]

whenever this limit exists, and \(\tilde{Y}_t(\omega ) = 0\) otherwise. Then \(\tilde{Y}\) has left and right limits everywhere (along \(T\)) almost surely, and is right-continuous at every point of \(D\).

We claim that the set

\begin{align*} S = \{ t \in T \mid \tilde{Y}_t \ne X_t \text{ with positive probability}\} \end{align*}

is countable. For each \(n \in \mathbb {N}\), consider the set

\begin{align*} S_n = \{ t \in T \cap [0, n] \mid \mathbb {P}(|\tilde{Y}_t - X_t| {\gt} 1/n) {\gt} 1/n\} \: . \end{align*}

We have \(S = \bigcup _n S_n\), so it suffices to show each \(S_n\) is finite. If \(S_n\) were infinite, it would contain a monotone sequence \(t_k\) converging to some limit \(t^* \in [0, n]\). Adding \(\{ t_k\} \) to \(D\) and repeating the upcrossing argument, we would obtain that \(X_{t_k}\) converges along \(D \cup \{ t_k\} \) almost surely. But \(\tilde{Y}_{t_k}\) also converges (since \(\tilde{Y}\) has left/right limits), giving \(\tilde{Y}_{t_k} - X_{t_k} \to 0\) in probability, contradicting the definition of \(S_n\). Therefore each \(S_n\) is finite and \(S\) is countable.

Define

\begin{align*} Y_t = \begin{cases} X_t & \text{if } t \in S \: , \\ \tilde{Y}_t & \text{if } t \notin S \: . \end{cases}\end{align*}

Then \(Y_t = X_t\) almost surely for all \(t\): for \(t \in S\) this is by definition, and for \(t \notin S\) this follows from \(\tilde{Y}_t = X_t\) a.s. Thus \(Y\) is a modification of \(X\).

Since \(\tilde{Y}\) has left and right limits everywhere almost surely, and \(Y\) differs from \(\tilde{Y}\) only on the countable set \(S\), \(Y\) also has left and right limits everywhere almost surely. Furthermore, \(Y\) agrees with \(\tilde{Y}\) on \(T \setminus S\), so \(Y\) is right-continuous on \(T \setminus S\).

In the following, we say that a stochastic process \(X\) is right-continuous in probability if for all \(t \in T\), \(\lim _{s \downarrow t} X_s = X_t\), where the limit is taken in probability.
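In Lean, this could be phrased with Mathlib's `MeasureTheory.TendstoInMeasure`; the name `RightContinuousInProbability` and the typeclass assumptions below are illustrative sketches.

```lean
import Mathlib

open MeasureTheory

/-- Sketch: `X` is right-continuous in probability if `X s` tends to `X t`
in measure as `s ↓ t`. Name and assumptions are tentative. -/
def RightContinuousInProbability {T Ω : Type*} [LinearOrder T] [TopologicalSpace T]
    {m : MeasurableSpace Ω} (μ : Measure Ω) (X : T → Ω → ℝ) : Prop :=
  ∀ t : T, TendstoInMeasure μ X (nhdsWithin t (Set.Ioi t)) (X t)
```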

Corollary 11.4

Let \(X : T \to \Omega \to \mathbb {R}\) be an adapted stochastic process which is right-continuous in probability and such that the boundedness condition of Theorem 11.3 holds. Then \(X\) has a cadlag modification.

Proof

Let \(Y\) be the modification of \(X\) given by Theorem 11.3. \(Y\) has left and right limits everywhere and there is a countable set \(S\) such that \(Y\) is right-continuous on \(T \setminus S\). It remains to show that \(Y\) is right-continuous at the points of \(S\). Let \(Y_{t+}\) denote the right limit of \(Y\) at \(t\). Since \(X\) is right-continuous in probability, for all \(t \in S\), \(Y_t = Y_{t+}\) almost surely. Therefore, since \(S\) is countable, we have that almost surely \(Y_t = Y_{t+}\) for all \(t \in S\) (and hence for all \(t \in T\)). \(Y\) is therefore almost surely cadlag. We can modify \(Y\) on the null set to obtain a cadlag modification of \(X\).

Lemma 11.5

Let \(X : T \to \Omega \to \mathbb {R}\) be a submartingale. Then for every \(t \in T\), the set \(\{ \mathbb {E}[(\mathbb {1}_A \bullet X)_t] \mid A \text{ elementary predictable}\} \) is bounded.

Proof

Since \(X\) is a submartingale, for any elementary predictable set \(A\), we have (Corollary 10.26)

\begin{align*} 0 \le \mathbb {E}[(\mathbb {1}_A \bullet X)_t] \le \mathbb {E}[X_t - X_0] \: . \end{align*}

As \(X_t\) and \(X_0\) are integrable, the set \(\{ \mathbb {E}[(\mathbb {1}_A \bullet X)_t] \mid A \text{ elementary predictable}\} \) is bounded.

Corollary 11.6

Let \(X : T \to \Omega \to \mathbb {R}\) be a submartingale which is right-continuous in probability. Then \(X\) has a cadlag modification.

Proof

We apply Corollary 11.4, in which the boundedness condition of Theorem 11.3 holds by Lemma 11.5.

Lemma 11.7

Let \(X : T \to \Omega \to \mathbb {R}\) be a submartingale. Then \(X\) has a modification \(Y\) such that for all \(t \in T\), \(Y\) has left and right limits at \(t\) and such that there is a countable set \(S \subseteq T\) for which \(Y\) is right-continuous on \(T \setminus S\).

Proof

Apply Theorem 11.3, in which the boundedness condition holds by Lemma 11.5.

Lemma 11.8

Let \(X : T \to \Omega \to \mathbb {R}\) be a submartingale and let \(t_n\) be a decreasing sequence in \(T\) which is bounded below. Then the family \(\{ X_{t_n}\} _{n \in \mathbb {N}}\) is uniformly integrable.
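This statement could be sketched in Lean using Mathlib's `MeasureTheory.Submartingale` and `UniformIntegrable`; the hypotheses below (in particular the index and measure assumptions) are tentative.

```lean
import Mathlib

open MeasureTheory

/-- Sketch of Lemma 11.8: along a decreasing, bounded-below sequence of
times, a submartingale is uniformly integrable. Hypotheses are tentative
and the proof is omitted. -/
theorem Submartingale.uniformIntegrable_antitone
    {Ω : Type*} {m : MeasurableSpace Ω} {μ : Measure Ω} [IsFiniteMeasure μ]
    {T : Type*} [LinearOrder T] {ℱ : Filtration T m}
    {X : T → Ω → ℝ} (hX : Submartingale X ℱ μ)
    {t : ℕ → T} (ht : Antitone t) {c : T} (hc : ∀ n, c ≤ t n) :
    UniformIntegrable (fun n => X (t n)) 1 μ :=
  sorry
```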

Proof
Lemma 11.9

Let \(X : T \to \Omega \to \mathbb {R}\) be a submartingale and \(t \in T\). Let \(t_n\) be a decreasing sequence in \(T\) converging to \(t\), and such that \(X_{t_n}\) tends to a limit \(X_{t+}\) almost surely. Then \(X_t \le P[X_{t+} \mid \mathcal{F}_t]\) almost surely.

Proof

By the submartingale property, for all \(n\) we have that \(X_t \le P[X_{t_n} \mid \mathcal{F}_t]\) almost surely. By Lemma 11.8, the family \(\{ X_{t_n}\} _{n \in \mathbb {N}}\) is uniformly integrable.

TODO: conclude that \(P[X_{t+} \mid \mathcal{F}_t]\) is the limit of \(P[X_{t_n} \mid \mathcal{F}_t]\).

Theorem 11.10

Let \(X : T \to \Omega \to \mathbb {R}\) be a submartingale with respect to a right-continuous filtration. Then \(X\) has a cadlag modification if and only if \(t \mapsto \mathbb {E}[X_t]\) is right-continuous.

Proof

Let \(Y\) be the modification of \(X\) given by Lemma 11.7. \(Y\) has left and right limits everywhere and there is a countable set \(S\) such that \(Y\) is right-continuous on \(T \setminus S\). It remains to show that for each \(t \in S\), \(Y\) is right-continuous at \(t\) almost surely. We will then be able to modify \(Y\) on a null set to obtain a cadlag modification of \(X\). Let \(t \in S\) and let \(Y_{t+}\) denote the right limit of \(Y\) at \(t\).

We show that \(Y_t = P[Y_{t+} \mid \mathcal{F}_t]\) almost surely, and that \(P[Y_{t+} \mid \mathcal{F}_t] = Y_{t+}\) almost surely, which will conclude the proof. For the second equality it suffices to show that \(Y_{t+}\) is \(\mathcal{F}_t\)-measurable, which follows from the right-continuity of the filtration.

For the first equality, it suffices to show that \(P[Y_{t+} \mid \mathcal{F}_t] - Y_t\) is an almost surely nonnegative random variable with zero expectation. Nonnegativity follows from Lemma 11.9. The expectation is \(P[Y_{t+}] - P[Y_t]\). Let \(t_n\) be a decreasing sequence in \(T\) converging to \(t\). By right-continuity of \(t \mapsto \mathbb {E}[Y_t]\), we have that

\begin{align*} P[Y_t] = \lim _{n \to \infty } P[Y_{t_n}] \: . \end{align*}

By uniform integrability (Lemma 11.8), we have that

\begin{align*} P[Y_{t+}] = \lim _{n \to \infty } P[Y_{t_n}] \: . \end{align*}

Therefore \(P[Y_{t+}] = P[Y_t]\), which concludes the proof.

Corollary 11.11

Let \(X : T \to \Omega \to \mathbb {R}\) be a martingale with respect to a right-continuous filtration. Then \(X\) has a cadlag modification.

Proof

\(X\) is in particular a submartingale, and for all \(t \in T\), we have that \(\mathbb {E}[X_t] = \mathbb {E}[X_0]\) by the martingale property. Therefore \(t \mapsto \mathbb {E}[X_t]\) is right-continuous, and we can apply Theorem 11.10.

11.3 Cadlag modifications of (local) martingales

TODO: this is partially or entirely redundant, consider removing this section.

Lemma 11.12

Let the filtered probability space satisfy the usual conditions. Then every nonnegative submartingale \(X\) admits a modification that is still a nonnegative submartingale with cadlag trajectories.

Proof

See 8.2.3 of Pascucci.

Lemma 11.13

Let the filtered probability space satisfy the usual conditions. Then every local martingale \(X\) admits a modification that is still a local martingale with cadlag trajectories.

Proof