Brownian Motion

3 Projective family of the Brownian motion

3.1 Kolmogorov extension theorem

This theorem (Theorem 3.3 below) has been formalized in the repository kolmogorov_extension4.

Definition 3.1 Projective family

A family of measures \(P\) indexed by the finite subsets of \(T\) is projective if, for finite sets \(J \subseteq I\), the projection from \(E^I\) to \(E^J\) maps \(P_I\) to \(P_J\).
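
In symbols, writing \(\pi _{IJ} : E^I \to E^J\) for the coordinate projection (the notation also used in the projectivity proof below) and \(\pi _{IJ*} P_I\) for the pushforward of \(P_I\) by \(\pi _{IJ}\), the condition reads

\begin{align*} \pi _{IJ*} P_I = P_J \quad \text{for all finite sets } J \subseteq I \subseteq T \: . \end{align*}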

Definition 3.2 Projective limit

A measure \(\mu \) on \(E^T\) is the projective limit of a projective family of measures \(P\) indexed by the finite subsets of \(T\) if, for every finite set \(I \subseteq T\), the projection from \(E^T\) to \(E^I\) maps \(\mu \) to \(P_I\).
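
Equivalently, writing \(\pi _I : E^T \to E^I\) for the restriction to the coordinates in \(I\), the condition reads \(\pi _{I*} \mu = P_I\) for every finite \(I \subseteq T\).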

Theorem 3.3 Kolmogorov extension theorem

Let \(\mathcal{X}\) be a Polish space, equipped with the Borel \(\sigma \)-algebra, and let \(T\) be an index set. Let \(P\) be a projective family of finite measures on \(\mathcal{X}\) (that is, each \(P_I\) is a finite measure on \(\mathcal{X}^I\)). Then the projective limit \(\mu \) of \(P\) exists, is unique, and is a finite measure on \(\mathcal{X}^T\). Moreover, if \(P_I\) is a probability measure for every finite set \(I \subseteq T\), then \(\mu \) is a probability measure.

Proof

See the formalization in the kolmogorov_extension4 repository.

3.2 Projective family of Gaussian measures

We build a projective family of Gaussian measures indexed by the finite subsets of \(\mathbb {R}_+\). In order to do so, we need to define specific Gaussian measures on finite index sets \(\{ t_1, \ldots , t_n\} \): we want to build a multivariate Gaussian measure on \(\mathbb {R}^n\) with mean \(0\) and covariance matrix \(C_{ij} = \min (t_i, t_j)\) for \(1 \leq i,j \leq n\).

We prove that the matrix \(C_{ij} = \min (t_i, t_j)\) is positive semidefinite, which ensures that there exists a Gaussian distribution with mean \(0\) and covariance matrix \(C\).
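
As a sanity check in the simplest nontrivial case, take two times \(0 \le s \le t\), so that \(C = \begin{pmatrix} s & s \\ s & t \end{pmatrix}\). For \(x \in \mathbb {R}^2\),

\begin{align*} x^\top C x & = s x_1^2 + 2 s x_1 x_2 + t x_2^2 = s (x_1 + x_2)^2 + (t - s) x_2^2 \ge 0 \: . \end{align*}

The general case is handled by the Gram matrix argument below.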

Definition 3.4 Gram matrix

Let \(v_1, \ldots , v_n\) be vectors in an inner product space \(E\). The Gram matrix of \(v_1, \ldots , v_n\) is the matrix in \(\mathbb {R}^{n \times n}\) with entries \(G_{ij} = \langle v_i, v_j \rangle \) for \(1 \leq i,j \leq n\).

Lemma 3.5

A Gram matrix is positive semidefinite.

Proof

Symmetry is obvious from the definition. Let \(x \in \mathbb {R}^n\). Then

\begin{align*} \langle x, G x \rangle & = \sum _{i,j} x_i x_j \langle v_i, v_j \rangle \\ & = \langle \sum _i x_i v_i, \sum _j x_j v_j \rangle \\ & = \left\Vert \sum _i x_i v_i \right\Vert ^2 \\ & \ge 0 \: . \end{align*}

Lemma 3.6

Let \(I = \{ t_1, \ldots , t_n\} \) be a finite subset of \(\mathbb {R}_+\). For \(i \le n\), let \(v_i = \mathbb {I}_{[0, t_i]}\) be the indicator function of the interval \([0, t_i]\), as an element of \(L^2(\mathbb {R})\). Then the Gram matrix of \(v_1, \ldots , v_n\) is equal to the matrix \(C_{ij} = \min (t_i, t_j)\) for \(1 \leq i,j \leq n\).

Proof

By definition of the inner product in \(L^2(\mathbb {R})\),

\begin{align*} \langle v_i, v_j \rangle & = \int _{\mathbb {R}} \mathbb {I}_{[0, t_i]}(x) \mathbb {I}_{[0, t_j]}(x) \: dx = \int _{\mathbb {R}} \mathbb {I}_{[0, \min (t_i, t_j)]}(x) \: dx = \min (t_i, t_j) \: . \end{align*}
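
For instance, with \(t_1 = 1\) and \(t_2 = 2\), \(\langle v_1, v_2 \rangle = \int _0^1 1 \: dx = 1 = \min (1, 2)\).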

Lemma 3.7

For \(I = \{ t_1, \ldots , t_n\} \) a finite subset of \(\mathbb {R}_+\), let \(C \in \mathbb {R}^{n \times n}\) be the matrix \(C_{ij} = \min (t_i, t_j)\) for \(1 \leq i,j \leq n\). Then \(C\) is positive semidefinite.

Proof

\(C\) is a Gram matrix by Lemma 3.6. By Lemma 3.5, it is positive semidefinite.

Definition of the projective family and extension

Definition 3.8 Projective family of the Brownian motion

For \(I = \{ t_1, \ldots , t_n\} \) a finite subset of \(\mathbb {R}_+\), let \(P^B_I\) be the multivariate Gaussian measure on \(\mathbb {R}^n\) with mean \(0\) and covariance matrix \(C_{ij} = \min (t_i, t_j)\) for \(1 \leq i,j \leq n\). We call the family of measures \(P^B_I\) the projective family of the Brownian motion.
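
To see how this family encodes the covariance structure of Brownian motion, consider \(I = \{ s, t\} \) with \(s \le t\) and let \((X_s, X_t)\) have distribution \(P^B_I\). The increment \(X_t - X_s\) is a linear image of a Gaussian vector, hence Gaussian (Lemma 2.8), with mean \(0\) and variance

\begin{align*} \mathrm{Var}(X_t - X_s) & = C_{tt} - 2 C_{st} + C_{ss} = t - 2 s + s = t - s \: , \end{align*}

which is the variance expected of a Brownian increment.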

Lemma 3.9

The projective family of the Brownian motion is a projective family of measures.

Proof

Let \(J \subseteq I\) be finite subsets of \(\mathbb {R}_+\). We need to show that the restriction from \(\mathbb {R}^I\) to \(\mathbb {R}^J\), denoted \(\pi _{IJ}\), maps \(P^B_I\) to \(P^B_J\).

The restriction is a continuous linear map from \(\mathbb {R}^I\) to \(\mathbb {R}^J\), and the pushforward of a Gaussian measure by a continuous linear map is Gaussian (Lemma 2.8). It thus suffices to show that the mean and covariance matrix of the pushforward measure are equal to those of \(P^B_J\).

The mean of the pushforward measure is \(0\), since the mean of \(P^B_I\) is \(0\) and \(\pi _{IJ}\) is linear.

For the covariance matrix, write \(\mu = P^B_I\) and let \(i, j \in J\). By Lemma 1.12 we have

\begin{align*} \langle e_i, \Sigma _{\pi _{IJ*}\mu } e_j\rangle & = \langle \pi _{IJ}^\dagger (e_i), \Sigma _\mu \pi _{IJ}^\dagger (e_j)\rangle \: . \end{align*}

Here \(\pi _{IJ}^\dagger \), the adjoint of \(\pi _{IJ}\), sends \(u \in \mathbb {R}^J\) to the vector of \(\mathbb {R}^I\) with coordinates \((\pi _{IJ}^\dagger (u))_k = u_k\) if \(k \in J\) and \((\pi _{IJ}^\dagger (u))_k = 0\) otherwise. In particular \(\pi _{IJ}^\dagger (e_i) = e_i\) for \(i \in J\), so \(\langle e_i, \Sigma _{\pi _{IJ*}\mu } e_j\rangle = \langle e_i, \Sigma _\mu e_j\rangle = \min (t_i, t_j)\). This is the covariance matrix of \(P^B_J\).
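
As a concrete instance of this computation, take \(I = \{ s, t\} \) with \(s \le t\) and \(J = \{ s\} \). Then \(\pi _{IJ}^\dagger (e_s) = e_s\), so

\begin{align*} \langle e_s, \Sigma _{\pi _{IJ*}\mu } e_s\rangle & = \langle e_s, \Sigma _\mu e_s\rangle = \min (s, s) = s \: , \end{align*}

and the pushforward of \(P^B_I\) to \(\mathbb {R}^J\) is the centered Gaussian measure with variance \(s\), which is \(P^B_J\).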

Definition 3.10

We denote by \(P_B\) the projective limit of the projective family of the Brownian motion given by Theorem 3.3. This is a probability measure on \(\mathbb {R}^{\mathbb {R}_+}\).