Formalization of a Brownian motion and of stochastic integrals in Lean

10 Simple processes and elementary integrals

Definition 10.1 Simple process

Let \((s_k {\lt} t_k)_{k \in \{ 1, ..., n\} }\) be points in a linear order \(T\) with a bottom element 0. Let \((\eta _k)_{0 \le k \le n}\) be random variables with values in a normed space \(F\) such that \(\eta _0\) is \(\mathcal{F}_0\)-measurable and \(\eta _k\) is \(\mathcal{F}_{s_k}\)-measurable for \(k \ge 1\). Then the simple process for that sequence is the process \(V : T \to \Omega \to F\) defined by

\begin{align*} V_t = \eta _0 \mathbb {1}_{\{ 0\} }(t) + \sum _{k=1}^{n} \eta _k \mathbb {1}_{(s_k, t_k]}(t) \: . \end{align*}

Let \(\mathcal{E}_{T, F}\) be the set of simple processes indexed by \(T\) with values in \(F\).
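A possible Lean encoding of Definition 10.1 bundles the data and the measurability hypotheses in a structure. This is an illustrative sketch only; the structure name, field names and typeclass assumptions are hypothetical and need not match the actual formalization.

```lean
-- Sketch only: all names are hypothetical.
structure SimpleProcess (T Ω F : Type*) [LinearOrder T] [OrderBot T]
    [MeasurableSpace Ω] [NormedAddCommGroup F] [MeasurableSpace F]
    (ℱ : T → MeasurableSpace Ω) where
  n : ℕ
  s t : Fin n → T                          -- interval endpoints
  lt : ∀ k, s k < t k                      -- each interval (s k, t k] is nonempty
  η₀ : Ω → F                               -- value at the bottom element 0
  η : Fin n → Ω → F                        -- value on each interval (s k, t k]
  meas_η₀ : Measurable[ℱ ⊥] η₀             -- η₀ is ℱ_0-measurable
  meas_η : ∀ k, Measurable[ℱ (s k)] (η k)  -- η k is ℱ_{s k}-measurable
```

The process \(V\) itself is then recovered from this data by the displayed formula.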

Definition 10.2 Elementary predictable set

A set \(A \subseteq T \times \Omega \) is an elementary predictable set if it is a finite union of sets of the form \(\{ 0\} \times B\) for \(B \in \mathcal{F}_0\) or of the form \((s, t] \times B\) with \(0 \le s {\lt} t\) in \(T\) and \(B \in \mathcal{F}_s\).
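In Lean this could be phrased as a predicate on sets of \(T \times \Omega\). The sketch below uses hypothetical names; `×ˢ` is Mathlib's set-product notation and `MeasurableSet[m]` denotes measurability with respect to an explicit \(\sigma\)-algebra.

```lean
-- Sketch: A is a finite union of generators {0} × B and (s, t] × B.
def IsElementaryPredictable (ℱ : T → MeasurableSpace Ω) (A : Set (T × Ω)) : Prop :=
  ∃ (n : ℕ) (C : Fin n → Set (T × Ω)), (A = ⋃ k, C k) ∧
    ∀ k, (∃ B : Set Ω, MeasurableSet[ℱ ⊥] B ∧ C k = {(⊥ : T)} ×ˢ B)
      ∨ (∃ (s t : T) (B : Set Ω), s < t ∧ MeasurableSet[ℱ s] B
          ∧ C k = Set.Ioc s t ×ˢ B)
```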

Lemma 10.3

An elementary predictable set is measurable with respect to the predictable \(\sigma \)-algebra.

Proof

It is a finite union of sets of the form \(\{ 0\} \times B\) with \(B \in \mathcal{F}_0\) or of the form \((s, t] \times B\) with \(B \in \mathcal{F}_s\), which are measurable by Lemma 7.45.

Lemma 10.4

A simple process is predictable.

Proof

Lemma 10.5

Real simple processes generate the predictable \(\sigma \)-algebra, i.e., the predictable \(\sigma \)-algebra is the supremum of the comaps by simple processes (seen as maps from \(T \times \Omega \) to \(\mathbb {R}\)) of the Borel \(\sigma \)-algebra.

Proof

Lemma 10.6

A set \(A \subseteq T \times \Omega \) is an elementary predictable set if and only if the indicator function \(\mathbb {1}_A\) is a simple process.

Proof

Currently only the forward direction is proved, because the backward direction is not easy to state.

Lemma 10.7

The simple processes \(\mathcal{E}_{T, F}\) form an additive commutative group.

Proof

Lemma 10.8

The simple processes \(\mathcal{E}_{T, F}\) form a module over the scalars \(\mathbb {R}\).

Proof
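In Lean, Lemmas 10.7 and 10.8 amount to instance declarations on the type of simple processes. A sketch, with `𝓔 T F` as hypothetical notation for \(\mathcal{E}_{T, F}\): addition concatenates the interval lists of the two processes, while negation and scalar multiplication act on the random variables \(\eta_k\).

```lean
-- Sketch: the proofs populate the algebraic fields; addition concatenates
-- the interval/variable lists of the two processes.
instance : AddCommGroup (𝓔 T F) := sorry
instance : Module ℝ (𝓔 T F) := sorry
```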
Definition 10.9 Elementary stochastic integral

Let \(V \in \mathcal{E}_{T, F}\) be a simple process and let \(X\) be a stochastic process with values in a normed space \(E\). Let \(B\) be a continuous bilinear map from \(E \times F\) to another normed space \(G\). The elementary stochastic integral process \(V \bullet X : T \to \Omega \to G\) is defined by

\begin{align*} (V \bullet X)_t & = \sum _{k=1}^{n} B (X^t_{t_k} - X^t_{s_k}, \eta _k) \: . \end{align*}

An important example is \(G = E\), \(F = \mathbb {R}\) and \(B(x, r) = r \cdot x\) the scalar multiplication.
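A Lean version of Definition 10.9 might read as follows, writing the stopping at time \(t\) via `min` (so that \(X^t_u = X_{u \wedge t}\)). The names are hypothetical; `E →L[ℝ] F →L[ℝ] G` is Mathlib's curried form of a continuous bilinear map.

```lean
-- Sketch: (V ∙ X)_t ω = ∑ k, B (X^t_{t k} - X^t_{s k}, η k ω).
noncomputable def elemIntegral (B : E →L[ℝ] F →L[ℝ] G)
    (V : SimpleProcess T Ω F ℱ) (X : T → Ω → E) : T → Ω → G :=
  fun t ω => ∑ k, B (X (min (V.t k) t) ω - X (min (V.s k) t) ω) (V.η k ω)
```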

Lemma 10.10

The elementary stochastic integral \(V \bullet X\) is additive in each of \(V\) and \(X\), and commutes with negation, subtraction and scalar multiplication in each argument.

Proof

(In Lean, this is split into several lemmas about each argument and add/sub/smul.)

Immediate from the definition.
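The separate Lean statements might look like the following sketch (hypothetical names, variables as in the `elemIntegral` sketch above); each one follows by unfolding the defining sum.

```lean
-- Sketch of two of the per-argument lemmas.
lemma elemIntegral_add_process :
    elemIntegral B V (X + Y) = elemIntegral B V X + elemIntegral B V Y := sorry

lemma elemIntegral_smul (c : ℝ) :
    elemIntegral B (c • V) X = c • elemIntegral B V X := sorry
```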

Lemma 10.11

\(V \bullet (W \bullet X) = (V \times W) \bullet X\) for simple processes \(V, W\) and stochastic process \(X\).

TODO: this is stated for \(B\) the scalar multiplication. What is the right statement for a general \(B\)?

Proof

Unfold definitions, use linearity.

Lemma 10.12

Let \(X\) be a stochastic process, \(V\) a simple process and \(\tau\) a stopping time. Then

\begin{align*} (V \bullet X)^\tau = V \bullet (X^\tau ) \: . \end{align*}
Proof

TODO: we are probably missing properties of stopped processes to make the following proof easy in Lean.

\begin{align*} (V \bullet X)^\tau _t & = (V \bullet X)_{t \wedge \tau } \\ & = \sum _{k=1}^{n} B (X^{t \wedge \tau }_{t_k} - X^{t \wedge \tau }_{s_k}, \eta _k) \\ & = \sum _{k=1}^{n} B ((X^{\tau })^t_{t_k} - (X^{\tau })^t_{s_k}, \eta _k) \\ & = (V \bullet (X^{\tau }))_t \: . \end{align*}
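As a Lean statement, the lemma could be phrased with Mathlib's `MeasureTheory.stoppedProcess` (which is exactly \(u \mapsto X_{u \wedge \tau}\)); `elemIntegral` is the hypothetical name used above, and the exact form of the filtration and stopping-time hypotheses is left schematic.

```lean
-- Sketch: stopping commutes with the elementary integral.
lemma stoppedProcess_elemIntegral (hτ : IsStoppingTime ℱ τ) :
    stoppedProcess (elemIntegral B V X) τ
      = elemIntegral B V (stoppedProcess X τ) := sorry
```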

Lemma 10.13

For \(V\) a simple process and \(X\) a stochastic process, \((V \bullet X)_t\) tends to a limit \((V \bullet X)_\infty \) as \(t\) goes to infinity.

Proof

That process is eventually constant.

Lemma 10.14

An adapted integrable process \(X\) is a submartingale if and only if for every bounded simple process \(V\) with values in \(\mathbb {R}_+\), \(\mathbb {E}[(V \bullet X)_\infty ] \ge 0\).

Proof

First suppose that \(X\) is a submartingale. The simple process \(V\) can be written as \(V_t = \eta _0 \mathbb {1}_{\{ 0\} }(t) + \sum _{k=1}^{n} \eta _k \mathbb {1}_{(s_k, t_k]}(t)\) for nonnegative \(\eta _k\). Then

\begin{align*} \mathbb {E}[(V \bullet X)_\infty ] & = \mathbb {E}\left[\sum _{k=1}^{n} \eta _k (X_{t_k} - X_{s_k})\right] \\ & = \sum _{k=1}^{n} \mathbb {E}[\eta _k (X_{t_k} - X_{s_k})] \: . \end{align*}

It suffices to show that each term of the sum is nonnegative. Since \(\eta _k\) is \(\mathcal{F}_{s_k}\)-measurable and nonnegative, by the submartingale property we have

\begin{align*} \mathbb {E}[\eta _k (X_{t_k} - X_{s_k})] & = \mathbb {E}[\eta _k \mathbb {E}[X_{t_k} - X_{s_k} \mid \mathcal{F}_{s_k}]] \ge 0 \: . \end{align*}

This concludes the proof in one direction. Suppose now that for every bounded simple process \(V\) with values in \(\mathbb {R}_+\), \(\mathbb {E}[(V \bullet X)_\infty ] \ge 0\). To show that \(X\) is a submartingale, let \(s {\lt} t\) in \(T\) and let \(A = \{ \mathbb {E}[X_t \mid \mathcal{F}_s] {\lt} X_s\} \in \mathcal{F}_s\). Define the simple process \(V\) by \(V_r = \mathbb {1}_A \mathbb {1}_{(s, t]}(r)\). Then

\begin{align*} \mathbb {E}[\mathbb {1}_A (X_t - X_s)] & = \mathbb {E}[(V \bullet X)_\infty ] \ge 0 \: . \end{align*}

But since \(\mathbb {1}_A\) is \(\mathcal{F}_s\)-measurable and \(A\) is exactly the event where \(\mathbb {E}[X_t \mid \mathcal{F}_s] - X_s\) is negative, conditioning on \(\mathcal{F}_s\) also gives

\begin{align*} \mathbb {E}[\mathbb {1}_A (X_t - X_s)] & = \mathbb {E}[\mathbb {1}_A (\mathbb {E}[X_t \mid \mathcal{F}_s] - X_s)] = \mathbb {E}[\min \{ \mathbb {E}[X_t \mid \mathcal{F}_s] - X_s, 0\} ] \le 0 \: . \end{align*}

Combining both inequalities, we get \(\mathbb {E}[\min \{ \mathbb {E}[X_t \mid \mathcal{F}_s] - X_s, 0\} ] = 0\). Since \(\min \{ \mathbb {E}[X_t \mid \mathcal{F}_s] - X_s, 0\} \) is nonpositive and has expectation zero, it is almost surely zero. Thus \(\mathbb {E}[X_t \mid \mathcal{F}_s] \ge X_s\) almost surely, which concludes the proof.

Lemma 10.15

An adapted integrable process \(X\) is a martingale if and only if for every bounded simple process \(V\) with values in \(\mathbb {R}\), \(\mathbb {E}[(V \bullet X)_\infty ] = 0\).

Proof

There might not be an order on the type \(E\) of values of \(X\), so we cannot simply say that \(X\) is a martingale if and only if both \(X\) and \(-X\) are submartingales and conclude with Lemma 10.14. Instead, we can follow almost the same proof as in Lemma 10.14, replacing inequalities by equalities.

Lemma 10.16

Let \(X\) be a submartingale and \(V\) be a nonnegative bounded simple process. Then the elementary stochastic integral \(V \bullet X\) is a submartingale.

Proof

We use Lemma 10.14 and have to show that for every nonnegative bounded simple process \(W\), \(\mathbb {E}[(W \bullet (V \bullet X))_\infty ] \ge 0\). By Lemma 10.11, this is \(\mathbb {E}[((W \times V) \bullet X)_\infty ]\). Since \(W \times V\) is a nonnegative bounded simple process, this is nonnegative by Lemma 10.14 applied to \(X\).

Lemma 10.17

Let \(X\) be a martingale and \(V\) be a bounded simple process. Then the elementary stochastic integral \(V \bullet X\) is a martingale.

Proof

We use Lemma 10.15 and have to show that for every bounded simple process \(W\), \(\mathbb {E}[(W \bullet (V \bullet X))_\infty ] = 0\). By Lemma 10.11, this is \(\mathbb {E}[((W \times V) \bullet X)_\infty ]\). Since \(W \times V\) is a bounded simple process, this is zero by Lemma 10.15 applied to \(X\).

Lemma 10.18

Let \(X\) be a stochastic process and \(\tau\) be a stopping time taking finitely many values. Then \(\mathbb {1}_{[0, \tau ]}\) is a simple process and

\begin{align*} X^\tau = X_0 + \mathbb {1}_{[0, \tau ]} \bullet X \: . \end{align*}
Proof