11 Modifications with cadlag paths
11.1 Preliminaries
TODO: define cadlag.
11.2 Cadlag modifications of martingales
11.2.1 Upcrossings
Let \(u : \beta \to \alpha \), for \(\alpha \) a densely ordered, conditionally complete linear order equipped with the order topology. Let \(S\) be a dense subset of \(\alpha \) and \(f\) a filter on \(\beta \). If for all \(a {\lt} b\) in \(S\), the number of upcrossings of the interval \([a, b]\) by \(u\) along \(f\) is finite, then \(u\) tends to a limit along \(f\).
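For intuition, the upcrossing count is easy to compute for a finite real sequence. The following sketch (illustrative only; the function `count_upcrossings` is our own, not part of the formal development) counts the moves of a sequence from \(\le a\) up to \(\ge b\), and checks that a convergent sequence upcrosses a fixed interval only finitely often while an oscillating one upcrosses it roughly once per period.

```python
def count_upcrossings(u, a, b):
    """Count moves of the finite sequence u from <= a up to >= b."""
    assert a < b
    count, below = 0, False
    for x in u:
        if x <= a:
            below = True          # the sequence has dipped to a...
        elif x >= b and below:
            count += 1            # ...and has now risen to b: one upcrossing
            below = False
    return count

converging = [(-1) ** n / (n + 1) for n in range(100)]   # tends to 0
oscillating = [(-1) ** n for n in range(100)]            # has no limit

assert count_upcrossings(converging, -0.2, 0.2) == 2
assert count_upcrossings(oscillating, -0.2, 0.2) == 49
```

The contrapositive of the lemma is visible here: a sequence with no limit must upcross some interval \([a, b]\) with rational endpoints infinitely often.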
Let \(X : T \to \Omega \to \mathbb {R}\) be a stochastic process on a finite time domain \(T\). Then for all \(a {\lt} b\) in \(\mathbb {R}\) and \(t \in T\), there exists an elementary predictable set \(A\) such that the number of upcrossings \(U_t[a, b]\) of the interval \([a, b]\) before time \(t\) satisfies
\[ (b - a) \, U_t[a, b] \le (\mathbb {1}_A \bullet X)_t + (X_t - a)^- . \]
See this chapter of [Low].
In the proof, note that \(A\) can be written as a finite disjoint union of sets of the form \(\{ (t, \omega ) \mid \sigma (\omega ) {\lt} t \le \tau (\omega )\} \) for stopping times \(\sigma , \tau \). To see that each is an elementary predictable set, note that if \(T = \{ t_0, \ldots , t_n\} \) where \(t_0 {\lt} \cdots {\lt} t_n\), then it can be written as the finite disjoint union
\[ \{ (t, \omega ) \mid \sigma (\omega ) {\lt} t \le \tau (\omega )\} = \bigcup _{k=1}^{n} (t_{k-1}, t_k] \times \{ \sigma \le t_{k-1} {\lt} \tau \} , \]
where \(\{ \sigma \le t_{k-1} {\lt} \tau \} \in \mathcal{F}_{t_{k-1}}\). Therefore \(A\) is an elementary predictable set.
11.2.2 Cadlag modifications
Let \(X : T \to \Omega \to \mathbb {R}\) be an adapted stochastic process such that \(X\) is integrable and for every \(t \in T\) the set \(\{ \mathbb {E}[(\mathbb {1}_A \bullet X)_t] \mid A \text{ elementary predictable}\} \) is bounded. Then \(X\) has a modification \(Y\) which has left and right limits everywhere and such that there is a countable set \(S \subseteq T\) for which \(Y\) is right-continuous on \(T \setminus S\).
Let \(D\) be a countable dense subset of \(T\) (e.g. the rationals in \(T\)), let \(t \in T\), and let \(S\) be a finite subset of \(D \cap [0, t]\). By Lemma 11.2, for all \(a {\lt} b\) in \(\mathbb {R}\), the number of upcrossings \(U_S[a, b]\) of the interval \([a, b]\) along \(S\) satisfies
\[ (b - a) \, U_S[a, b] \le (\mathbb {1}_A \bullet X)_t + (X_t - a)^- \]
for some elementary predictable set \(A\). Taking expectations and using the boundedness hypothesis, we obtain
\[ (b - a) \, \mathbb {E}[U_S[a, b]] \le L + \mathbb {E}[(X_t - a)^-] , \]
where \(L\) is a bound for \(\{ \mathbb {E}[(\mathbb {1}_A \bullet X)_t] \mid A \text{ elementary predictable}\} \).
Now, let \(S_1 \subseteq S_2 \subseteq \cdots \) be an increasing sequence of finite subsets of \(D \cap [0, t]\) with \(\bigcup _n S_n = D \cap [0, t]\). The number of upcrossings \(U_{S_n}[a, b]\) is increasing in \(n\). By monotone convergence,
\[ \mathbb {E}\big[ U_{D \cap [0, t]}[a, b] \big] = \lim _{n \to \infty } \mathbb {E}\big[ U_{S_n}[a, b] \big] \le \frac{L + \mathbb {E}[(X_t - a)^-]}{b - a} {\lt} \infty . \]
In particular, \(U_{D \cap [0, t]}[a, b] {\lt} \infty \) a.s. for each pair \(a {\lt} b\). Since \(\mathbb {Q}^2\) is countable, we conclude that almost surely, for all rational \(a {\lt} b\), the number of upcrossings of \([a, b]\) by \(X\) along \(D \cap [0, t]\) is finite.
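The monotone-convergence step rests on the fact that upcrossing counts can only grow as the sampling set is refined. The sketch below (our own illustration, not part of the text) samples a fixed oscillating function on nested dyadic grids \(S_n = \{k/2^n\}\) and checks that the counts are nondecreasing in \(n\) and stabilise once the grid resolves every oscillation.

```python
import math

def count_upcrossings(u, a, b):
    """Count moves of the finite sequence u from <= a up to >= b."""
    count, below = 0, False
    for x in u:
        if x <= a:
            below = True
        elif x >= b and below:
            count += 1
            below = False
    return count

def f(t):
    # a fixed path with finitely many oscillations on [0, 1]
    return math.sin(40 * t)

counts = []
for n in range(1, 13):
    grid = [k / 2 ** n for k in range(2 ** n + 1)]   # nested dyadic grids S_n
    counts.append(count_upcrossings([f(t) for t in grid], -0.5, 0.5))

# refining the grid can only reveal more upcrossings, never fewer
assert all(c1 <= c2 for c1, c2 in zip(counts, counts[1:]))
# the counts stabilise once the grid resolves the oscillation scale
assert counts[-1] == counts[-2]
```

For a path with infinitely many oscillations before time \(t\), the counts along \(S_n\) would instead grow without bound, which is exactly what the expectation bound rules out almost surely.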
By Lemma 11.1, finite upcrossings of all rational intervals implies that the left and right limits of \(s \mapsto X_s(\omega )\) along \(D\) exist at every \(t \in T\), for almost every \(\omega \). Define
\[ \tilde{Y}_t(\omega ) = \lim _{s \downarrow t, \, s \in D} X_s(\omega ) \]
whenever this limit exists, and \(\tilde{Y}_t(\omega ) = 0\) otherwise. Then \(\tilde{Y}\) has left and right limits everywhere (along \(T\)) almost surely, and is right-continuous at every point of \(D\).
We claim that the set
\[ S = \{ t \in T \mid \mathbb {P}(\tilde{Y}_t \ne X_t) {\gt} 0 \} \]
is countable. For each \(n \in \mathbb {N}\), consider the set
\[ S_n = \{ t \in T \cap [0, n] \mid \mathbb {P}(|\tilde{Y}_t - X_t| {\gt} 1/n) {\gt} 1/n \} . \]
We have \(S = \bigcup _n S_n\), so it suffices to show each \(S_n\) is finite. If \(S_n\) were infinite, it would contain a monotone sequence \((t_k)\) converging to some limit \(t^* \in [0, n]\). Adding the points \(t_k\) to \(D\) and repeating the upcrossing argument, we would obtain that \(X_{t_k}\) converges along \(D \cup \{ t_k\} \) almost surely. But \(\tilde{Y}_{t_k}\) also converges, to the same limit (since \(\tilde{Y}\) has left/right limits and its one-sided limits along \(D\) agree with those of \(X\)), giving \(\tilde{Y}_{t_k} - X_{t_k} \to 0\) almost surely, hence in probability, contradicting the definition of \(S_n\). Therefore each \(S_n\) is finite and \(S\) is countable.
Define
\[ Y_t = \begin{cases} X_t & \text{if } t \in S , \\ \tilde{Y}_t & \text{if } t \notin S . \end{cases} \]
Then \(Y_t = X_t\) almost surely for all \(t\): for \(t \in S\) this is by definition, and for \(t \notin S\) this follows from \(\tilde{Y}_t = X_t\) a.s. Thus \(Y\) is a modification of \(X\).
Since \(\tilde{Y}\) has left and right limits everywhere almost surely, and \(Y\) differs from \(\tilde{Y}\) only on the countable set \(S\), \(Y\) also has left and right limits everywhere almost surely. Furthermore, \(Y\) agrees with \(\tilde{Y}\) on \(T \setminus S\), so \(Y\) is right-continuous on \(T \setminus S\).
In the following, we say that a stochastic process \(X\) is right-continuous in probability if for all \(t \in T\), \(\lim _{s \downarrow t} X_s = X_t\), where the limit is taken in probability.
Let \(X : T \to \Omega \to \mathbb {R}\) be an adapted stochastic process which is right-continuous in probability and such that the boundedness condition of Theorem 11.3 holds. Then \(X\) has a cadlag modification.
Let \(Y\) be the modification of \(X\) given by Theorem 11.3: \(Y\) has left and right limits everywhere, and there is a countable set \(S\) such that \(Y\) is right-continuous on \(T \setminus S\). It remains to show that \(Y\) is right-continuous at the points of \(S\). Let \(Y_{t+}\) denote the right limit of \(Y\) at \(t\). Since \(X\) is right-continuous in probability and \(Y\) is a modification of \(X\), for all \(t \in S\) we have \(Y_t = Y_{t+}\) almost surely. Therefore, since \(S\) is countable, almost surely \(Y_t = Y_{t+}\) for all \(t \in S\) (and hence for all \(t \in T\)). \(Y\) is therefore almost surely cadlag, and we can modify \(Y\) on a null set to obtain a cadlag modification of \(X\).
Let \(X : T \to \Omega \to \mathbb {R}\) be a submartingale. Then for every \(t \in T\), the set \(\{ \mathbb {E}[(\mathbb {1}_A \bullet X)_t] \mid A \text{ elementary predictable}\} \) is bounded.
Since \(X\) is a submartingale, for any elementary predictable set \(A\), we have (Corollary 10.26)
\[ 0 \le \mathbb {E}[(\mathbb {1}_A \bullet X)_t] \le \mathbb {E}[X_t] - \mathbb {E}[X_0] . \]
As \(X_t\) and \(X_0\) are integrable, the set \(\{ \mathbb {E}[(\mathbb {1}_A \bullet X)_t] \mid A \text{ elementary predictable}\} \) is bounded.
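This bound can be verified exactly on a finite probability space. The sketch below (our own illustration; the strategies are arbitrary choices) takes \(X_i = |S_i|\) for a simple random walk \(S\), which is a submartingale, enumerates all coin paths with equal weight, and checks \(0 \le \mathbb{E}[(\mathbb{1}_A \bullet X)_t] \le \mathbb{E}[X_t] - \mathbb{E}[X_0]\) for several elementary predictable strategies.

```python
from itertools import product

n = 8
paths = list(product([-1, 1], repeat=n))        # all coin paths, uniform weight

def X(signs):
    """X_i = |S_i| for the simple random walk S: a submartingale started at 0."""
    xs, s = [0], 0
    for e in signs:
        s += e
        xs.append(abs(s))
    return xs

# predictable {0,1}-valued strategies: the position over (t_i, t_{i+1}]
# may only use information available at time t_i
strategies = [
    lambda xs, i: 1 if xs[i] == 0 else 0,   # hold only at the origin
    lambda xs, i: 1 if xs[i] >= 2 else 0,   # hold only far from the origin
    lambda xs, i: i % 2,                    # deterministic alternation
]

E_Xn = sum(X(p)[n] for p in paths) / len(paths)    # = E[X_t] - E[X_0]
for H in strategies:
    val = sum(sum(H(X(p), i) * (X(p)[i + 1] - X(p)[i]) for i in range(n))
              for p in paths) / len(paths)         # E[(1_A . X)_t], exactly
    assert 0 <= val <= E_Xn
```

The expectations here are exact (finite sums of integers divided by a power of two), so the inequalities hold without numerical tolerance.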
Let \(X : T \to \Omega \to \mathbb {R}\) be a submartingale which is right-continuous in probability. Then \(X\) has a cadlag modification.
Let \(X : T \to \Omega \to \mathbb {R}\) be a submartingale. Then \(X\) has a modification \(Y\) such that for all \(t \in T\), \(Y\) has left and right limits at \(t\) and such that there is a countable set \(S \subseteq T\) for which \(Y\) is right-continuous on \(T \setminus S\).
Let \(X : T \to \Omega \to \mathbb {R}\) be a submartingale and let \(t_n\) be a decreasing sequence in \(T\) which is bounded below. Then the family \(\{ X_{t_n}\} _{n \in \mathbb {N}}\) is uniformly integrable.
Let \(X : T \to \Omega \to \mathbb {R}\) be a submartingale and \(t \in T\). Let \(t_n\) be a decreasing sequence in \(T\) converging to \(t\), and such that \(X_{t_n}\) tends to a limit \(X_{t+}\) almost surely. Then \(X_t \le P[X_{t+} \mid \mathcal{F}_t]\) almost surely.
By the submartingale property, for all \(n\) we have that \(X_t \le P[X_{t_n} \mid \mathcal{F}_t]\) almost surely. By Lemma 11.8, the family \(\{ X_{t_n}\} _{n \in \mathbb {N}}\) is uniformly integrable.
TODO: conclude that \(P[X_{t+} \mid \mathcal{F}_t]\) is the limit of \(P[X_{t_n} \mid \mathcal{F}_t]\). (Uniform integrability upgrades the almost sure convergence \(X_{t_n} \to X_{t+}\) to \(L^1\) convergence, and conditional expectation is a contraction on \(L^1\), so \(P[X_{t_n} \mid \mathcal{F}_t] \to P[X_{t+} \mid \mathcal{F}_t]\) in \(L^1\); passing to the limit in the inequality above gives the claim.)
Let \(X : T \to \Omega \to \mathbb {R}\) be a submartingale with respect to a right-continuous filtration. Then \(X\) has a cadlag modification if and only if \(t \mapsto \mathbb {E}[X_t]\) is right-continuous.
Let \(Y\) be the modification of \(X\) given by Lemma 11.7. \(Y\) has left and right limits everywhere and there is a countable set \(S\) such that \(Y\) is right-continuous on \(T \setminus S\). It remains to show that for each \(t \in S\), \(Y\) is right-continuous at \(t\) almost surely. We will then be able to modify \(Y\) on a null set to obtain a cadlag modification of \(X\). Let \(t \in S\) and let \(Y_{t+}\) denote the right limit of \(Y\) at \(t\).
We show that \(Y_t = P[Y_{t+} \mid \mathcal{F}_t]\) almost surely, and that \(P[Y_{t+} \mid \mathcal{F}_t] = Y_{t+}\) almost surely, which will conclude the proof. For the second equality it suffices to show that \(Y_{t+}\) is \(\mathcal{F}_t\)-measurable, which follows from the right-continuity of the filtration.
For the first equality, it suffices to show that \(P[Y_{t+} \mid \mathcal{F}_t] - Y_t\) is an almost surely nonnegative random variable with zero expectation. Nonnegativity follows from Lemma 11.9. The expectation is \(P[Y_{t+}] - P[Y_t]\). Let \(t_n\) be a decreasing sequence in \(T\) converging to \(t\). By right-continuity of \(t \mapsto \mathbb {E}[Y_t]\), we have that
\[ \lim _{n \to \infty } \mathbb {E}[Y_{t_n}] = \mathbb {E}[Y_t] . \]
By uniform integrability (Lemma 11.8), since \(Y_{t_n} \to Y_{t+}\) almost surely, we have that
\[ \lim _{n \to \infty } \mathbb {E}[Y_{t_n}] = \mathbb {E}[Y_{t+}] . \]
Therefore \(P[Y_{t+}] = P[Y_t]\), which concludes the proof.
Let \(X : T \to \Omega \to \mathbb {R}\) be a martingale with respect to a right-continuous filtration. Then \(X\) has a cadlag modification.
\(X\) is in particular a submartingale, and for all \(t \in T\) we have \(\mathbb {E}[X_t] = \mathbb {E}[X_0]\) by the martingale property. Therefore \(t \mapsto \mathbb {E}[X_t]\) is constant, in particular right-continuous, and we can apply Theorem 11.10.
11.3 Cadlag modifications of (local) martingales
TODO: this is partially or entirely redundant, consider removing this section.
Let the filtered probability space satisfy the usual conditions. Then every nonnegative submartingale \(X\) admits a modification that is still a nonnegative submartingale with cadlag trajectories.
See 8.2.3 of Pascucci.
Let the filtered probability space satisfy the usual conditions. Then every local martingale \(X\) admits a modification that is still a local martingale with cadlag trajectories.