10 Simple processes and elementary integrals
Let \((s_k {\lt} t_k)_{k \in \{ 1, ..., n\} }\) be points in a linear order \(T\) with a bottom element 0. Let \((\eta _k)_{0 \le k \le n}\) be bounded random variables with values in a normed real vector space \(E\) such that \(\eta _0\) is \(\mathcal{F}_0\)-measurable and \(\eta _k\) is \(\mathcal{F}_{s_k}\)-measurable for \(k \ge 1\). Then the simple process for that sequence is the process \(V : T \to \Omega \to E\) defined by
\[ V_t = \eta _0 \mathbb {1}_{\{ 0\} }(t) + \sum _{k=1}^{n} \eta _k \mathbb {1}_{(s_k, t_k]}(t) . \]
Let \(\mathcal{E}_{T, E}\) be the set of simple processes indexed by \(T\) with value in \(E\).
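In Lean, a simple process can be bundled as a structure carrying its defining data. The following is a minimal sketch only; the structure name, field names and typeclass assumptions are illustrative, not the actual formalization:

```lean
/- Illustrative sketch of the data of a simple process.
   All names here are hypothetical, not the real Lean definitions. -/
structure SimpleProcess (T Ω E : Type*) [LinearOrder T] [OrderBot T]
    [NormedAddCommGroup E] {mΩ : MeasurableSpace Ω}
    (ℱ : MeasureTheory.Filtration T mΩ) where
  n : ℕ
  s t : Fin n → T                       -- interval endpoints
  s_lt_t : ∀ k, s k < t k
  η₀ : Ω → E                            -- value at time 0
  η : Fin n → Ω → E                     -- value on (s k, t k]
  bounded₀ : ∃ C, ∀ ω, ‖η₀ ω‖ ≤ C
  bounded : ∀ k, ∃ C, ∀ ω, ‖η k ω‖ ≤ C
  meas₀ : MeasureTheory.StronglyMeasurable[ℱ ⊥] η₀
  meas : ∀ k, MeasureTheory.StronglyMeasurable[ℱ (s k)] (η k)
```

Bundling the sum representation as data (rather than working with the bare function \(T \to \Omega \to E\)) is what makes notes such as the one after Lemma 10.24 necessary: properties like "values in \(\mathbb {R}_+\)" are stated about the chosen representation.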
A set \(A \subseteq T \times \Omega \) is an elementary predictable set if it is a finite union of sets of the form \(\{ 0\} \times B\) for \(B \in \mathcal{F}_0\) or of the form \((s, t] \times B\) with \(0 \le s {\lt} t\) in \(T\) and \(B \in \mathcal{F}_s\).
An elementary predictable set is measurable with respect to the predictable \(\sigma \)-algebra.
It is a finite union of sets of the form \(\{ 0\} \times B\) with \(B \in \mathcal{F}_0\) or of the form \((s, t] \times B\) with \(B \in \mathcal{F}_s\), which are measurable by Lemma 7.47.
A simple process is predictable.
Real simple processes generate the predictable \(\sigma \)-algebra, i.e., the predictable \(\sigma \)-algebra is the supremum of the comaps by simple processes (seen as maps from \(T \times \Omega \) to \(\mathbb {R}\)) of the Borel \(\sigma \)-algebra.
A set \(A \subseteq T \times \Omega \) is an elementary predictable set if and only if the indicator function \(\mathbb {1}_A\) is a simple process.
In Lean, only the forward direction is currently proved, because the backward direction is not easy to state.
The simple processes \(\mathcal{E}_{T, F}\) form an additive commutative group.
The simple processes \(\mathcal{E}_{T, F}\) form a module over the scalars \(\mathbb {R}\).
Let \(V \in \mathcal{E}_{T, E}\) be a simple process and let \(X\) be a stochastic process with values in a normed space \(F\). Let \(B\) be a continuous bilinear map from \(E \times F\) to another normed space \(G\). The general elementary stochastic integral process \(V \bullet _B X : T \to \Omega \to G\) is defined by
\[ (V \bullet _B X)_t = \sum _{k=1}^{n} B(\eta _k, X_{t_k \wedge t} - X_{s_k \wedge t}) , \]
in which \((s_k, t_k, \eta _k)_{1 \le k \le n}\) is the data defining the simple process \(V\).
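As a finite sum of bilinear images of stopped increments, the integral can be written down directly. A hypothetical Lean sketch, assuming a `SimpleProcess` structure with fields `n`, `s`, `t`, `η` for the data \((s_k, t_k, \eta _k)\) and encoding \(B\) in curried Mathlib style \(E \to _L F \to _L G\) (all non-Mathlib names are illustrative):

```lean
variable {T Ω E F G : Type*} [LinearOrder T] [OrderBot T]
  [NormedAddCommGroup E] [NormedSpace ℝ E]
  [NormedAddCommGroup F] [NormedSpace ℝ F]
  [NormedAddCommGroup G] [NormedSpace ℝ G]
  {mΩ : MeasurableSpace Ω} {ℱ : MeasureTheory.Filtration T mΩ}

/- Illustrative sketch: the elementary stochastic integral as a finite sum.
   `SimpleProcess` and its fields are hypothetical names. -/
noncomputable def elemIntegral (V : SimpleProcess T Ω E ℱ)
    (B : E →L[ℝ] F →L[ℝ] G) (X : T → Ω → F) : T → Ω → G :=
  fun u ω => ∑ k : Fin V.n,
    -- bilinear image of the increment of X over (s k, t k], truncated at u
    B (V.η k ω) (X (min (V.t k) u) ω - X (min (V.s k) u) ω)
```

Note that the value \(\eta _0\) at time 0 does not appear in the sum, which is consistent with Lemma 10.13: the integral vanishes at time 0.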
For \(V, W\) two simple processes with values in \(E\) and \(F\) respectively, the process defined by \(B(V, W)_t(\omega ) = B(V_t(\omega ), W_t(\omega ))\) is a simple process.
Note: in Lean, the above is not a lemma but a definition of a simple process, together with a lemma stating that its coercion to a function is given by the formula above.
It has values \(B(\eta _i, \xi _j)\) on the intervals \((s_i, t_i] \cap (u_j, v_j]\) if \(V\) and \(W\) are defined by the sequences \((s_i, t_i, \eta _i)\) and \((u_j, v_j, \xi _j)\) respectively.
Let \(E\) and \(F\) be normed real vector spaces. We call linear elementary stochastic integral the general elementary stochastic integral of Definition 10.9 with \(V\) being \(L(E, F)\)-valued and \(B(L, x) = L(x)\) the evaluation map. We denote it by \(V \bullet _L X\).
We call scalar elementary stochastic integral the general elementary stochastic integral of Definition 10.9 with \(V\) being \(\mathbb {R}\)-valued and \(B(r, x) = r \cdot x\) the scalar multiplication. We denote it by \(V \bullet _{\mathbb {R}} X\).
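Both special cases are instances of the general integral, obtained by choosing the bilinear map. A hypothetical Lean sketch, assuming an `elemIntegral` definition as above; `ContinuousLinearMap.id` and `ContinuousLinearMap.lsmul` are Mathlib's curried encodings of the evaluation and scalar-multiplication bilinear maps:

```lean
/- Illustrative: the linear and scalar integrals as instances of the
   general one. `elemIntegral` and `SimpleProcess` are hypothetical names. -/
noncomputable def elemIntegralL (V : SimpleProcess T Ω (E →L[ℝ] F) ℱ)
    (X : T → Ω → E) : T → Ω → F :=
  -- evaluation map L(E,F) × E → F, curried: the identity on E →L[ℝ] F
  elemIntegral V (ContinuousLinearMap.id ℝ (E →L[ℝ] F)) X

noncomputable def elemIntegralR (V : SimpleProcess T Ω ℝ ℱ)
    (X : T → Ω → F) : T → Ω → F :=
  -- scalar multiplication ℝ × F → F, curried
  elemIntegral V (ContinuousLinearMap.lsmul ℝ ℝ) X
```

Stating lemmas once for \(\bullet _B\) and specializing to \(\bullet _L\) and \(\bullet _{\mathbb {R}}\) avoids duplicating proofs for the two common cases.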
The elementary stochastic integrals \(\bullet _B\), \(\bullet _L\) and \(\bullet _{\mathbb {R}}\) are linear in both arguments.
(In Lean, this is split into several lemmas about each argument and add/sub/smul.)
Immediate from the definition.
\((V \bullet _B X)_0 = 0\) for every simple process \(V\) and stochastic process \(X\).
Let \(X_c\) be a constant process (equal to the same random variable for all times). Then for every simple process \(V\),
\[ V \bullet _B X_c = 0 . \]
Let \(E, F, G, H, I, J\) be normed real vector spaces. Let \(B_1 : E \times H \to I\), \(B_2 : F \times G \to H\), \(B_3 : E \times F \to J\) and \(B_4 : J \times G \to I\) be continuous bilinear maps such that for all \(x \in E\), \(y \in F\) and \(z \in G\), \(B_1(x, B_2(y, z)) = B_4(B_3(x, y), z)\). Then for every simple process \(V\) with values in \(E\), simple process \(W\) with values in \(F\), and stochastic process \(X\) with values in \(G\), we have
\[ V \bullet _{B_1} (W \bullet _{B_2} X) = B_3(V, W) \bullet _{B_4} X . \]
Unfold definitions, use linearity.
Let \(E, F, G\) be normed real vector spaces. Let \(B : E \times F \to G\) be a continuous bilinear map. Then for every simple process \(V\) with values in \(\mathbb {R}\), simple process \(W\) with values in \(E\), and stochastic process \(X\) with values in \(F\), we have
\[ V \bullet _{\mathbb {R}} (W \bullet _B X) = (V \cdot W) \bullet _B X , \]
in which \(V \cdot W\) is the simple process defined by \((V \cdot W)_t(\omega ) = V_t(\omega ) \cdot W_t(\omega )\).
Use Lemma 10.16.
\(V \bullet _{\mathbb {R}} (W \bullet _{\mathbb {R}} X) = (V \times W) \bullet _{\mathbb {R}} X\) for real-valued simple processes \(V, W\) and stochastic process \(X\).
Apply Corollary 10.17 with \(B\) the scalar multiplication.
For simple process \(W\) with values in \(L(E, F)\), simple process \(V\) with values in \(L(F, G)\), and stochastic process \(X\) with values in \(E\), we have
\[ V \bullet _L (W \bullet _L X) = (V \circ W) \bullet _L X , \]
in which \(V \circ W\) is the simple process defined by \((V \circ W)_t = V_t \circ W_t\).
Use Lemma 10.16 with \(B_1, B_2, B_4\) equal to the evaluation maps and \(B_3\) the composition of continuous linear maps.
For \(V \in \mathcal{E}_{T, E}\), \(X : T \to \Omega \to F\) a càdlàg process and a continuous bilinear map \(B: E \times F \to G\), the elementary stochastic integral \(V \bullet _B X\) is càdlàg.
Let \(X\) be a stochastic process, \(V\) a simple process and \(\tau \) a stopping time. Then
\[ (V \bullet _B X)^{\tau } = V \bullet _B X^{\tau } , \]
in which \(Y^{\tau }\) denotes the stopped process \(Y^{\tau }_t = Y_{\tau \wedge t}\).
For \(V\) a simple process and \(X\) a stochastic process, \((V \bullet _B X)_t\) tends to its limit process \((V \bullet _B X)_\infty \) as \(t\) goes to infinity.
That process is eventually constant.
Immediate from the definition.
An adapted integrable process \(X\) is a submartingale if and only if for every bounded simple process \(V\) with values in \(\mathbb {R}_+\), \(\mathbb {E}[(V \bullet _{\mathbb {R}} X)_\infty ] \ge 0\).
Note that in Lean \(V\) comes with the data of a sum representation: by “values in \(\mathbb {R}_+\)” we mean that the random variables in that particular sum representing \(V\) are nonnegative.
First suppose that \(X\) is a submartingale. The simple process \(V\) can be written as \(V_t = \eta _0 \mathbb {1}_{\{ 0\} }(t) + \sum _{k=1}^{n} \eta _k \mathbb {1}_{(s_k, t_k]}(t)\) for nonnegative \(\eta _k\). Then
\[ \mathbb {E}[(V \bullet _{\mathbb {R}} X)_\infty ] = \sum _{k=1}^{n} \mathbb {E}[\eta _k (X_{t_k} - X_{s_k})] . \]
It suffices to show that each term of the sum is nonnegative. Since \(\eta _k\) is \(\mathcal{F}_{s_k}\)-measurable and nonnegative, by the submartingale property we have
\[ \mathbb {E}[\eta _k (X_{t_k} - X_{s_k})] = \mathbb {E}\big[\eta _k \, \mathbb {E}[X_{t_k} - X_{s_k} \mid \mathcal{F}_{s_k}]\big] \ge 0 . \]
This concludes the proof in one direction. Suppose now that for every bounded simple process \(V\) with values in \(\mathbb {R}_+\), \(\mathbb {E}[(V \bullet _{\mathbb {R}} X)_\infty ] \ge 0\). To show that \(X\) is a submartingale, let \(s {\lt} t\) in \(T\) and let \(A = \{ \mathbb {E}[X_t \mid \mathcal{F}_s] {\lt} X_s\} \in \mathcal{F}_s\). Define the simple process \(V\) by \(V_r = \mathbb {1}_A \mathbb {1}_{(s, t]}(r)\). Then
\[ 0 \le \mathbb {E}[(V \bullet _{\mathbb {R}} X)_\infty ] = \mathbb {E}[\mathbb {1}_A (X_t - X_s)] . \]
But we also have
\[ \mathbb {E}[\mathbb {1}_A (X_t - X_s)] = \mathbb {E}\big[\mathbb {1}_A (\mathbb {E}[X_t \mid \mathcal{F}_s] - X_s)\big] = \mathbb {E}\big[\min \{ \mathbb {E}[X_t \mid \mathcal{F}_s] - X_s, 0\} \big] \le 0 . \]
Combining both inequalities, we get \(\mathbb {E}[\min \{ \mathbb {E}[X_t \mid \mathcal{F}_s] - X_s, 0\} ] = 0\). Since \(\min \{ \mathbb {E}[X_t \mid \mathcal{F}_s] - X_s, 0\} \) is nonpositive and has expectation zero, it is almost surely zero. Thus \(\mathbb {E}[X_t \mid \mathcal{F}_s] \ge X_s\) almost surely, which concludes the proof.
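In Lean, this characterization might be stated along the following lines. This is an illustrative statement sketch only: `SimpleProcess`, its fields, and `elemIntegralLimit` (standing for the eventually-constant limit \((V \bullet _{\mathbb {R}} X)_\infty \)) are hypothetical names; only `Submartingale`, `Adapted` and `Integrable` are Mathlib's:

```lean
/- Illustrative statement sketch of the submartingale characterization.
   Nonnegativity is imposed on the random variables of the chosen sum
   representation of V, as in the note above. -/
theorem submartingale_iff_elemIntegral_nonneg
    {μ : MeasureTheory.Measure Ω} {X : T → Ω → ℝ}
    (hX : MeasureTheory.Adapted ℱ X)
    (hInt : ∀ t, MeasureTheory.Integrable (X t) μ) :
    MeasureTheory.Submartingale X ℱ μ ↔
      ∀ V : SimpleProcess T Ω ℝ ℱ,
        (∀ ω, 0 ≤ V.η₀ ω) → (∀ k ω, 0 ≤ V.η k ω) →
        0 ≤ ∫ ω, elemIntegralLimit V X ω ∂μ := by
  sorry
```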
An adapted integrable process \(X\) is a martingale if and only if for every bounded simple process \(V\) with values in \(\mathbb {R}\), \(\mathbb {E}[(V \bullet _{\mathbb {R}} X)_\infty ] = 0\).
Let \(X\) be a submartingale and \(A\) be an elementary predictable set. Then for all \(t \in T\),
\[ \mathbb {E}[(\mathbb {1}_A \bullet _{\mathbb {R}} X)_t] \le \mathbb {E}[X_t - X_0] . \]
Let \(A\) be an elementary predictable set and let \(t \in T\).
The inequality follows from Lemma 10.24 applied to the nonnegative bounded simple process \(\mathbb {1}_{A^c}\).
Let \(X\) be a submartingale and \(V\) be a nonnegative bounded simple process. Then the elementary stochastic integral \(V \bullet _{\mathbb {R}} X\) is a submartingale.
We use Lemma 10.24 and have to show that for every nonnegative bounded simple process \(W\), \(\mathbb {E}[(W \bullet _{\mathbb {R}} (V \bullet _{\mathbb {R}} X))_\infty ] \ge 0\). By Lemma 10.18, this is \(\mathbb {E}[((W \times V) \bullet _{\mathbb {R}} X)_\infty ]\). Since \(W \times V\) is a nonnegative bounded simple process, this is nonnegative by Lemma 10.24 applied to \(X\).
Let \(X\) be a martingale and \(V\) be a bounded simple process. Then the elementary stochastic integral \(V \bullet _B X\) is a martingale.
We use Lemma 10.25 and have to show that for every bounded simple process \(W\), \(\mathbb {E}[(W \bullet _{\mathbb {R}} (V \bullet _B X))_\infty ] = 0\). By Corollary 10.17, this is \(\mathbb {E}[((W \cdot V) \bullet _B X)_\infty ]\). Since \(W \cdot V\) is a bounded simple process, this is zero by Lemma 10.25 applied to \(X\).
Let \(X\) be a stochastic process and \(\tau \) be a stopping time taking finitely many values. Then \(\mathbb {1}_{[0, \tau ]}\) is a simple process and
\[ (\mathbb {1}_{[0, \tau ]} \bullet _{\mathbb {R}} X)_t = X_{\tau \wedge t} - X_0 . \]