Brownian Motion

4 Projective family of the Brownian motion

4.1 Kolmogorov extension theorem

This theorem has been formalized in the repository kolmogorov_extension4.

Definition 4.1 Projective family

A family of measures \(P\) indexed by the finite subsets of \(T\) is projective if, for all finite sets \(J \subseteq I \subseteq T\), the projection from \(E^I\) to \(E^J\) maps \(P_I\) to \(P_J\).

Definition 4.2 Projective limit

A measure \(\mu \) on \(E^T\) is the projective limit of a projective family of measures \(P\) indexed by the finite subsets of \(T\) if, for every finite set \(I \subseteq T\), the projection from \(E^T\) to \(E^I\) maps \(\mu \) to \(P_I\).

Theorem 4.3 Kolmogorov extension theorem

Let \(\mathcal{X}\) be a Polish space, equipped with the Borel \(\sigma \)-algebra, and let \(T\) be an index set. Let \(P\) be a projective family of finite measures, where each \(P_I\) is a measure on \(\mathcal{X}^I\). Then the projective limit \(\mu \) of \(P\) exists, is unique, and is a finite measure on \(\mathcal{X}^T\). Moreover, if \(P_I\) is a probability measure for every finite set \(I \subseteq T\), then \(\mu \) is a probability measure.

Proof

4.2 Projective family of Gaussian measures

We build a projective family of Gaussian measures indexed by the finite subsets of \(\mathbb {R}_+\). In order to do so, we need to define specific Gaussian measures on finite index sets \(\{ t_1, \ldots , t_n\} \): we want to build a multivariate Gaussian measure on \(\mathbb {R}^n\) with mean \(0\) and covariance matrix \(C_{ij} = \min (t_i, t_j)\) for \(1 \leq i,j \leq n\).

First method: Gaussian increments

In this method, we build the Gaussian measure by adding independent Gaussian increments.

Definition 4.4

(Gaussian increment) For \(v \ge 0\), the map from \(\mathbb {R}\) to the probability measures on \(\mathbb {R}\) defined by \(x \mapsto \mathcal{N}(x, v)\) is a Markov kernel. We call that kernel the Gaussian increment with variance \(v\) and denote it by \(\kappa ^G_v\).

TODO: perhaps the equality \(\mathcal{N}(x, v) = \delta _x \ast \mathcal{N}(0, v)\) is useful to show that it is a kernel?
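
As a numerical illustration (not part of the formalization), the identity \(\mathcal{N}(x, v) = \delta _x \ast \mathcal{N}(0, v)\) says that sampling from the kernel \(\kappa ^G_v\) at \(x\) amounts to shifting a centered Gaussian sample by \(x\). A minimal NumPy sketch, with arbitrary example values for \(x\) and \(v\):

```python
import numpy as np

def gaussian_increment(x, v, n_samples, rng):
    """Sample from the Gaussian increment kernel kappa^G_v at the point x,
    i.e. from N(x, v), using N(x, v) = delta_x * N(0, v): shift centered
    Gaussian samples by x."""
    return x + np.sqrt(v) * rng.standard_normal(n_samples)

rng = np.random.default_rng(0)
samples = gaussian_increment(x=2.0, v=3.0, n_samples=200_000, rng=rng)
print(samples.mean())  # close to 2.0
print(samples.var())   # close to 3.0
```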

Definition 4.5

Let \(0 \le t_1 \le \ldots \le t_n\) be non-negative reals. Let \(\mu _0\) be the real Gaussian distribution \(\mathcal{N}(0, t_1)\). For \(i \in \{ 1, \ldots , n-1\} \), let \(\kappa _i\) be the Markov kernel from \(\mathbb {R}\) to \(\mathbb {R}\) defined by \(\kappa _i(x) = \mathcal{N}(x, t_{i+1} - t_i)\) (the Gaussian increment \(\kappa ^G_{t_{i+1} - t_i}\)). Let \(P_{t_1, \ldots , t_n}\) be the measure on \(\mathbb {R}^n\) defined by \(\mu _0 \otimes \kappa _1 \otimes \ldots \otimes \kappa _{n-1}\).

TODO: explain the notation \(\otimes \) in the definition above: \(\kappa _{n-1}\) takes only the value of the coordinate at \(n-1\) to produce the distribution of the coordinate at \(n\).

Lemma 4.6

\(P_{t_1, \ldots , t_n}\) is a Gaussian measure on \(\mathbb {R}^n\) with mean \(0\) and covariance matrix \(C_{ij} = \min (t_i, t_j)\) for \(1 \leq i,j \leq n\).

Proof
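
As a sanity check of the covariance claim (a numerical sketch, not a proof): the increment construction gives \(W_{t_i} = Z_1 + \ldots + Z_i\) with independent \(Z_k \sim \mathcal{N}(0, t_k - t_{k-1})\) (where \(t_0 = 0\)), i.e. \(W = L Z\) for \(L\) lower triangular of ones, so the covariance is \(L \, \mathrm{diag}(t_k - t_{k-1}) \, L^\top \), which coincides with \(\min (t_i, t_j)\). The time grid below is an arbitrary example:

```python
import numpy as np

# Time grid 0 <= t_1 <= ... <= t_n (arbitrary example values).
t = np.array([0.5, 1.0, 2.5, 4.0])
n = len(t)

# Variances of the independent increments: t_1, t_2 - t_1, ..., t_n - t_{n-1}.
increments = np.diff(t, prepend=0.0)

# W_{t_i} = Z_1 + ... + Z_i, i.e. W = L @ Z with L lower triangular of ones.
L = np.tril(np.ones((n, n)))

# Covariance of W: L diag(increments) L^T.
cov = L @ np.diag(increments) @ L.T

# This coincides with C_ij = min(t_i, t_j).
C = np.minimum.outer(t, t)
print(np.allclose(cov, C))  # True
```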

Second method: covariance matrix

In this method, we prove that the matrix \(C_{ij} = \min (t_i, t_j)\) is positive semidefinite, which guarantees that there exists a Gaussian distribution with mean \(0\) and covariance matrix \(C\).

Lemma 4.7

For \(I = \{ t_1, \ldots , t_n\} \) a finite subset of \(\mathbb {R}_+\), let \(C \in \mathbb {R}^{n \times n}\) be the matrix \(C_{ij} = \min (t_i, t_j)\) for \(1 \leq i,j \leq n\). Then \(C\) is positive semidefinite.

Proof
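
A quick numerical illustration of Lemma 4.7 (not a proof): for random finite subsets of \(\mathbb {R}_+\), the eigenvalues of the matrix \(C_{ij} = \min (t_i, t_j)\) are nonnegative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random finite subsets of R_+; check that C_ij = min(t_i, t_j) is
# positive semidefinite by inspecting its eigenvalues.
for _ in range(100):
    t = np.sort(rng.uniform(0.0, 10.0, size=6))
    C = np.minimum.outer(t, t)
    eigenvalues = np.linalg.eigvalsh(C)
    assert eigenvalues.min() >= -1e-10  # nonnegative up to rounding error
print("all matrices positive semidefinite")
```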

Definition of the projective family and extension

Definition 4.8 Projective family of the Brownian motion

For \(I = \{ t_1, \ldots , t_n\} \) a finite subset of \(\mathbb {R}_+\), let \(P^B_I\) be the multivariate Gaussian measure on \(\mathbb {R}^n\) with mean \(0\) and covariance matrix \(C_{ij} = \min (t_i, t_j)\) for \(1 \leq i,j \leq n\). We call the family of measures \(P^B_I\) the projective family of the Brownian motion.

Lemma 4.9

The projective family of the Brownian motion is a projective family of measures.

Proof
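
A numerical sketch of the key observation (not the full proof): the marginal of a centered Gaussian on a subset \(J\) of the coordinates is the centered Gaussian whose covariance is the corresponding submatrix, and for \(C_{ij} = \min (t_i, t_j)\) that submatrix is exactly the min-covariance matrix built from the times in \(J\). The index set and subset below are arbitrary examples:

```python
import numpy as np

# Finite index set I = {t_1, ..., t_n} and a subset J of its coordinates.
t_I = np.array([0.5, 1.0, 2.0, 3.5, 5.0])
J = [0, 2, 4]          # positions of the times kept in J
t_J = t_I[J]

C_I = np.minimum.outer(t_I, t_I)

# Marginalizing a centered Gaussian onto the coordinates in J keeps the
# corresponding submatrix of the covariance ...
C_sub = C_I[np.ix_(J, J)]

# ... which is exactly the min-covariance matrix built from the times in J,
# so the projection of P^B_I is P^B_J.
print(np.allclose(C_sub, np.minimum.outer(t_J, t_J)))  # True
```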

Definition 4.10

We denote by TODO the projective limit of the projective family of the Brownian motion given by Theorem 4.3. This is a probability measure on \(\mathbb {R}^{\mathbb {R}_+}\).