10 Simple processes and elementary integrals
Let \((s_k {\lt} t_k)_{k \in \{ 1, \dots, n\} }\) be points in a linear order \(T\) with a bottom element \(0\). Let \((\eta _k)_{0 \le k \le n}\) be random variables with values in a normed space \(F\) such that \(\eta _0\) is \(\mathcal{F}_0\)-measurable and \(\eta _k\) is \(\mathcal{F}_{s_k}\)-measurable for \(k \ge 1\). Then the simple process for that sequence is the process \(V : T \to \Omega \to F\) defined by
\[ V_t = \eta _0 \mathbb {1}_{\{ 0\} }(t) + \sum _{k=1}^{n} \eta _k \mathbb {1}_{(s_k, t_k]}(t) \; . \]
Let \(\mathcal{E}_{T, F}\) be the set of simple processes indexed by \(T\) with values in \(F\).
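For intuition, a real-valued simple process over \(T = [0, \infty)\) can be represented, for one fixed sample point \(\omega\), by the value \(\eta_0(\omega)\) at \(0\) together with a list of triples \((s_k, t_k, \eta_k(\omega))\). The following sketch is purely illustrative and not part of the formalization; the function names are made up.

```python
# Sketch: evaluate a real-valued simple process at time t, for one fixed
# sample omega.  A piece (s, e, eta) contributes eta on the interval (s, e].
def simple_process_value(eta0, pieces, t):
    """V_t = eta0 * 1_{{0}}(t) + sum_k eta_k * 1_{(s_k, t_k]}(t)."""
    if t == 0:
        return eta0
    return sum(eta for (s, e, eta) in pieces if s < t <= e)

# Overlapping intervals simply add up, mirroring the sum in the definition.
V = lambda t: simple_process_value(5.0, [(0, 2, 3.0), (1, 4, -1.0)], t)
print(V(0))    # eta0 only: 5.0
print(V(1.5))  # both pieces active: 3.0 + (-1.0) = 2.0
print(V(3))    # only the second piece: -1.0
print(V(10))   # past every interval: 0
```

Note that the intervals are half-open on the left, matching \((s_k, t_k]\) in the definition, and that nothing forces them to be disjoint.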
A set \(A \subseteq T \times \Omega \) is an elementary predictable set if it is a finite union of sets of the form \(\{ 0\} \times B\) for \(B \in \mathcal{F}_0\) or of the form \((s, t] \times B\) with \(0 \le s {\lt} t\) in \(T\) and \(B \in \mathcal{F}_s\).
An elementary predictable set is measurable with respect to the predictable \(\sigma \)-algebra.
It is a finite union of sets of the form \(\{ 0\} \times B\) with \(B \in \mathcal{F}_0\) or of the form \((s, t] \times B\) with \(B \in \mathcal{F}_s\), which are measurable by Lemma 7.45.
A simple process is predictable.
Real simple processes generate the predictable \(\sigma \)-algebra, i.e., the predictable \(\sigma \)-algebra is the supremum of the comaps by simple processes (seen as maps from \(T \times \Omega \) to \(\mathbb {R}\)) of the Borel \(\sigma \)-algebra.
A set \(A \subseteq T \times \Omega \) is an elementary predictable set if and only if the indicator function \(\mathbb {1}_A\) is a simple process.
Currently only the forward direction is proved, because the backward direction is not easy to state.
The simple processes \(\mathcal{E}_{T, F}\) form an additive commutative group.
The simple processes \(\mathcal{E}_{T, F}\) form a module over the scalars \(\mathbb {R}\).
Let \(V \in \mathcal{E}_{T, F}\) be a simple process and let \(X\) be a stochastic process with values in a normed space \(E\). Let \(B\) be a continuous bilinear map from \(E \times F\) to another normed space \(G\). The elementary stochastic integral process \(V \bullet X : T \to \Omega \to G\) is defined by
\[ (V \bullet X)_t = \sum _{k=1}^{n} B(X_{t_k \wedge t} - X_{s_k \wedge t}, \eta _k) \; , \]
where \((s_k, t_k]\) and \(\eta _k\) are as in the representation of \(V\) (the value \(\eta _0\) at \(0\) does not contribute).
An important example is \(G = E\), \(F = \mathbb {R}\) and \(B(x, r) = r \cdot x\) the scalar multiplication.
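With \(B\) the scalar multiplication, the integral of a real simple process against one path of \(X\) is a finite sum of weighted increments stopped at \(t\). A minimal numerical sketch, with made-up names and a deterministic path for simplicity:

```python
# Sketch: (V . X)_t = sum_k eta_k * (X(t_k /\ t) - X(s_k /\ t)) along one
# path, with B(x, r) = r * x the scalar multiplication.
def elementary_integral(pieces, X, t):
    return sum(eta * (X(min(e, t)) - X(min(s, t))) for (s, e, eta) in pieces)

X = lambda u: u * u          # a fixed sample path of the integrator
V = [(1, 3, 2.0)]            # V_r = 2 on (1, 3]
print(elementary_integral(V, X, 0.5))  # no increment accumulated yet: 0.0
print(elementary_integral(V, X, 2))    # 2 * (X(2) - X(1)) = 6.0
print(elementary_integral(V, X, 10))   # 2 * (X(3) - X(1)) = 16.0
```

Before time \(s_k\) both truncated times agree and the increment vanishes; after time \(t_k\) the increment is frozen, which is why the integral is eventually constant.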
The elementary stochastic integral is linear in both arguments.
(In Lean, this is split into several lemmas about each argument and add/sub/smul.)
Immediate from the definition.
\(V \bullet (W \bullet X) = (V \times W) \bullet X\) for simple processes \(V, W\) and stochastic process \(X\).
TODO: that’s for \(B\) the scalar multiplication. What is the right statement for general \(B\)?
Unfold definitions, use linearity.
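For \(B\) the scalar multiplication, the identity can be checked numerically on a fixed path: integrating \(V\) against the process \(W \bullet X\) agrees with integrating the pointwise product \(V \times W\) against \(X\). A self-contained sketch with illustrative names:

```python
# Sketch check of V . (W . X) = (V * W) . X for scalar multiplication.
def elementary_integral(pieces, X, t):
    # (V . X)_t = sum_k eta_k * (X(t_k /\ t) - X(s_k /\ t))
    return sum(eta * (X(min(e, t)) - X(min(s, t))) for (s, e, eta) in pieces)

def pointwise_product(V, W):
    # V * W is simple again: one piece per overlap of intervals,
    # with the coefficients multiplied.
    out = []
    for (s1, e1, a) in V:
        for (s2, e2, b) in W:
            s, e = max(s1, s2), min(e1, e2)
            if s < e:
                out.append((s, e, a * b))
    return out

X = lambda u: u * u
V = [(0, 2, 3.0), (2, 5, -1.0)]
W = [(1, 3, 2.0), (3, 4, 5.0)]

WX = lambda u: elementary_integral(W, X, u)   # the process W . X
lhs = elementary_integral(V, WX, 4.5)         # V . (W . X)
rhs = elementary_integral(pointwise_product(V, W), X, 4.5)
print(lhs, rhs)   # both equal -27.0
```

The double sum on the left telescopes into exactly the increments picked out by the interval overlaps on the right, which is the "unfold definitions, use linearity" step of the proof.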
Let \(X\) be a stochastic process, \(V\) a simple process and \(\tau \) a stopping time. Then
\[ (V \bullet X)^{\tau } = V \bullet X^{\tau } \; . \]
TODO: we are probably missing properties of stopped processes to make the following proof easy in Lean.
For \(V\) a simple process and \(X\) a stochastic process, \((V \bullet X)_t\) tends to its limit process \((V \bullet X)_\infty \) as \(t\) goes to infinity.
The process \(V \bullet X\) is eventually constant: for \(t\) above every \(t_k\), each increment \(X_{t_k \wedge t} - X_{s_k \wedge t}\) equals \(X_{t_k} - X_{s_k}\), so \((V \bullet X)_t\) no longer depends on \(t\).
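This eventual constancy is easy to see numerically with the one-path sketch used before (illustrative names, deterministic path):

```python
# Sketch: past the last interval endpoint, the elementary integral freezes.
def elementary_integral(pieces, X, t):
    return sum(eta * (X(min(e, t)) - X(min(s, t))) for (s, e, eta) in pieces)

X = lambda u: u * u
V = [(0, 1, 4.0), (1, 3, -2.0)]   # all intervals end by t = 3
terminal = elementary_integral(V, X, 3)
# For any t past the last interval the value is already the limit value.
print(elementary_integral(V, X, 7) == terminal)     # True
print(elementary_integral(V, X, 1000) == terminal)  # True
```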
An adapted integrable process \(X\) is a submartingale if and only if for every bounded simple process \(V\) with values in \(\mathbb {R}_+\), \(\mathbb {E}[(V \bullet X)_\infty ] \ge 0\).
First suppose that \(X\) is a submartingale. The simple process \(V\) can be written as \(V_t = \eta _0 \mathbb {1}_{\{ 0\} }(t) + \sum _{k=1}^{n} \eta _k \mathbb {1}_{(s_k, t_k]}(t)\) for nonnegative \(\eta _k\). Then
\[ \mathbb {E}[(V \bullet X)_\infty ] = \sum _{k=1}^{n} \mathbb {E}[\eta _k (X_{t_k} - X_{s_k})] \; . \]
It suffices to show that each term of the sum is nonnegative. Since \(\eta _k\) is \(\mathcal{F}_{s_k}\)-measurable and nonnegative, by the submartingale property we have
\[ \mathbb {E}[\eta _k (X_{t_k} - X_{s_k})] = \mathbb {E}\left[\eta _k \, \mathbb {E}[X_{t_k} - X_{s_k} \mid \mathcal{F}_{s_k}]\right] \ge 0 \; . \]
This concludes the proof in one direction. Suppose now that for every bounded simple process \(V\) with values in \(\mathbb {R}_+\), \(\mathbb {E}[(V \bullet X)_\infty ] \ge 0\). To show that \(X\) is a submartingale, let \(s {\lt} t\) in \(T\) and let \(A = \{ \mathbb {E}[X_t \mid \mathcal{F}_s] {\lt} X_s\} \in \mathcal{F}_s\). Define the simple process \(V\) by \(V_r = \mathbb {1}_A \mathbb {1}_{(s, t]}(r)\). Then
\[ \mathbb {E}[(V \bullet X)_\infty ] = \mathbb {E}[\mathbb {1}_A (X_t - X_s)] \ge 0 \; . \]
But we also have
\[ \mathbb {E}[\mathbb {1}_A (X_t - X_s)] = \mathbb {E}[\mathbb {1}_A (\mathbb {E}[X_t \mid \mathcal{F}_s] - X_s)] = \mathbb {E}[\min \{ \mathbb {E}[X_t \mid \mathcal{F}_s] - X_s, 0\} ] \le 0 \; . \]
Combining both inequalities, we get \(\mathbb {E}[\min \{ \mathbb {E}[X_t \mid \mathcal{F}_s] - X_s, 0\} ] = 0\). Since \(\min \{ \mathbb {E}[X_t \mid \mathcal{F}_s] - X_s, 0\} \) is nonpositive and has expectation zero, it is almost surely zero. Thus \(\mathbb {E}[X_t \mid \mathcal{F}_s] \ge X_s\) almost surely, which concludes the proof.
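For intuition on the forward direction, a tiny discrete example can be enumerated exactly: a two-step walk with upward drift (a submartingale) integrated against a nonnegative simple process whose second coefficient depends only on the first step, i.e. is measurable for the filtration at the start of its interval. All names are illustrative.

```python
from itertools import product

# Steps of the walk: +2 or -1 with probability 1/2 each, so
# E[step] = 0.5 > 0 and X_t = (sum of the first t steps) is a submartingale.
STEPS = [2.0, -1.0]

def integral_at_infinity(e1, e2):
    # V = eta1 * 1_{(0,1]} + eta2 * 1_{(1,2]} with eta1 constant
    # (F_0-measurable) and eta2 a nonnegative function of the first
    # step only (F_1-measurable), as the lemma requires.
    eta1 = 1.0
    eta2 = 1.0 if e1 > 0 else 3.0
    # (V . X)_infty = eta1 * (X_1 - X_0) + eta2 * (X_2 - X_1)
    return eta1 * e1 + eta2 * e2

# Exact expectation over the four equally likely step combinations.
expectation = sum(integral_at_infinity(e1, e2)
                  for e1, e2 in product(STEPS, STEPS)) / 4
print(expectation)   # 1.5, which is >= 0 as the lemma predicts
```

Making \(\eta_2\) depend on the second step as well would break the measurability hypothesis, and the expectation could then be driven negative.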
An adapted integrable process \(X\) is a martingale if and only if for every bounded simple process \(V\) with values in \(\mathbb {R}\), \(\mathbb {E}[(V \bullet X)_\infty ] = 0\).
Let \(X\) be a submartingale and \(V\) be a nonnegative bounded simple process. Then the elementary stochastic integral \(V \bullet X\) is a submartingale.
We use Lemma 10.14 and have to show that for every nonnegative bounded simple process \(W\), \(\mathbb {E}[(W \bullet (V \bullet X))_\infty ] \ge 0\). By Lemma 10.11, this is \(\mathbb {E}[((W \times V) \bullet X)_\infty ]\). Since \(W \times V\) is a nonnegative bounded simple process, this is nonnegative by Lemma 10.14 applied to \(X\).
Let \(X\) be a martingale and \(V\) be a bounded simple process. Then the elementary stochastic integral \(V \bullet X\) is a martingale.
We use Lemma 10.15 and have to show that for every bounded simple process \(W\), \(\mathbb {E}[(W \bullet (V \bullet X))_\infty ] = 0\). By Lemma 10.11, this is \(\mathbb {E}[((W \times V) \bullet X)_\infty ]\). Since \(W \times V\) is a bounded simple process, this is zero by Lemma 10.15 applied to \(X\).
Let \(X\) be a stochastic process and \(\tau \) be a stopping time taking finitely many values. Then \(\mathbb {1}_{[0, \tau ]}\) is a simple process and
\[ (\mathbb {1}_{[0, \tau ]} \bullet X)_t = X_{\tau \wedge t} - X_0 \; . \]
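For a stopping time with values among \(r_1 {\lt} \dots {\lt} r_m\), the indicator \(\mathbb{1}_{[0,\tau]}\) is simple with coefficient \(\mathbb{1}_{\{\tau \ge r_j\}}\) on \((r_{j-1}, r_j]\) (setting \(r_0 = 0\)), and the integral telescopes to \(X_{\tau \wedge t} - X_0\). A one-path sketch with illustrative names; on a single path the measurability conditions are vacuous:

```python
# Sketch: the elementary integral of 1_{[0,tau]} telescopes to a stopped
# increment of X.
def elementary_integral(pieces, X, t):
    return sum(eta * (X(min(e, t)) - X(min(s, t))) for (s, e, eta) in pieces)

X = lambda u: u * u
grid = [0, 1, 2, 3, 4]   # tau takes values in {1, 2, 3, 4}; here tau = 3
tau = 3

# 1_{[0,tau]} as a simple process: coefficient 1_{tau >= r_j} on (r_{j-1}, r_j].
pieces = [(grid[j - 1], grid[j], 1.0 if tau >= grid[j] else 0.0)
          for j in range(1, len(grid))]

for t in (2.5, 5):
    # The sum telescopes: (1_{[0,tau]} . X)_t = X(tau /\ t) - X(0).
    print(elementary_integral(pieces, X, t) == X(min(tau, t)) - X(0))  # True
```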