3 Projective family of the Brownian motion
3.1 Kolmogorov extension theorem
The Kolmogorov extension theorem has been formalized in the repository kolmogorov_extension4.
A family of measures \(P\) indexed by the finite subsets of \(T\) is projective if, for finite subsets \(J \subseteq I \subseteq T\), the projection from \(E^I\) to \(E^J\) maps \(P_I\) to \(P_J\).
A measure \(\mu \) on \(E^T\) is the projective limit of a projective family of measures \(P\) indexed by the finite subsets of \(T\) if, for every finite subset \(I \subseteq T\), the projection from \(E^T\) to \(E^I\) maps \(\mu \) to \(P_I\).
Let \(\mathcal{X}\) be a Polish space, equipped with the Borel \(\sigma \)-algebra, and let \(T\) be an index set. Let \(P\) be a projective family of finite measures on \(\mathcal{X}\), indexed by the finite subsets of \(T\). Then the projective limit \(\mu \) of \(P\) exists, is unique, and is a finite measure on \(\mathcal{X}^T\). Moreover, if \(P_I\) is a probability measure for every finite subset \(I \subseteq T\), then \(\mu \) is a probability measure.
3.2 Projective family of Gaussian measures
We build a projective family of Gaussian measures indexed by \(\mathbb {R}_+\). In order to do so, we need to define specific Gaussian measures on finite index sets \(\{ t_1, \ldots , t_n\} \). We want to build a multivariate Gaussian measure on \(\mathbb {R}^n\) with mean \(0\) and covariance matrix \(C_{ij} = \min (t_i, t_j)\) for \(1 \leq i,j \leq n\).
We prove that the matrix \(C_{ij} = \min (t_i, t_j)\) is positive semidefinite, which ensures that a Gaussian distribution with mean \(0\) and covariance matrix \(C\) exists.
Let \(v_1, \ldots , v_n\) be vectors in an inner product space \(E\). The Gram matrix of \(v_1, \ldots , v_n\) is the matrix in \(\mathbb {R}^{n \times n}\) with entries \(G_{ij} = \langle v_i, v_j \rangle \) for \(1 \leq i,j \leq n\).
A Gram matrix is positive semidefinite.
Symmetry is immediate from the definition and the symmetry of the inner product. Let \(x \in \mathbb{R}^n\). Then
\[ x^\top G x = \sum_{i=1}^n \sum_{j=1}^n x_i x_j \langle v_i, v_j \rangle = \Big\langle \sum_{i=1}^n x_i v_i, \sum_{j=1}^n x_j v_j \Big\rangle = \Big\| \sum_{i=1}^n x_i v_i \Big\|^2 \ge 0 . \]
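As a quick numerical illustration of this identity (not part of the formal development; the vectors and coefficients below are arbitrary choices, and numpy is assumed to be available), one can check \(x^\top G x = \|\sum_i x_i v_i\|^2 \ge 0\) on concrete data:

```python
import numpy as np

# Arbitrary vectors v_1, v_2, v_3 in R^3 (any inner product space behaves the same).
V = np.array([[1.0, 2.0, 0.0],
              [0.5, -1.0, 3.0],
              [2.0, 0.0, 1.0]])   # rows are the vectors v_i

G = V @ V.T                       # Gram matrix: G_ij = <v_i, v_j>

x = np.array([1.0, -2.0, 0.5])    # arbitrary coefficient vector
quad = x @ G @ x                  # quadratic form x^T G x
norm_sq = np.dot(V.T @ x, V.T @ x)  # ||sum_i x_i v_i||^2

assert np.isclose(quad, norm_sq)  # the two expressions agree
assert quad >= 0.0                # hence G is positive semidefinite on this x
```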
Let \(I = \{ t_1, \ldots , t_n\} \) be a finite subset of \(\mathbb {R}_+\). For \(i \le n\), let \(v_i = \mathbb {I}_{[0, t_i]}\) be the indicator function of the interval \([0, t_i]\), as an element of \(L^2(\mathbb {R})\). Then the Gram matrix of \(v_1, \ldots , v_n\) is equal to the matrix \(C_{ij} = \min (t_i, t_j)\) for \(1 \leq i,j \leq n\).
By definition of the inner product in \(L^2(\mathbb {R})\),
\[ \langle v_i, v_j \rangle = \int_{\mathbb{R}} \mathbb{I}_{[0, t_i]}(x) \, \mathbb{I}_{[0, t_j]}(x) \, dx = \int_{\mathbb{R}} \mathbb{I}_{[0, \min(t_i, t_j)]}(x) \, dx = \min (t_i, t_j) . \]
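This computation can be illustrated numerically (a sketch only; the times \(s, t\) and the integration grid are arbitrary choices, and numpy is assumed) by approximating the \(L^2\) inner product of the two indicators with a Riemann sum:

```python
import numpy as np

s, t = 0.7, 1.3                            # arbitrary times in R_+
grid = np.linspace(0.0, 2.0, 2_000_001)    # fine grid covering both intervals
dx = grid[1] - grid[0]

ind_s = (grid <= s).astype(float)          # indicator of [0, s]
ind_t = (grid <= t).astype(float)          # indicator of [0, t]

inner = np.sum(ind_s * ind_t) * dx         # Riemann sum for <1_[0,s], 1_[0,t]>
assert abs(inner - min(s, t)) < 1e-3       # matches min(s, t)
```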
For \(I = \{ t_1, \ldots , t_n\} \) a finite subset of \(\mathbb {R}_+\), let \(C \in \mathbb {R}^{n \times n}\) be the matrix \(C_{ij} = \min (t_i, t_j)\) for \(1 \leq i,j \leq n\). Then \(C\) is positive semidefinite.
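The conclusion can be sanity-checked numerically (an informal illustration; the set of times is an arbitrary choice, and numpy is assumed) by computing the eigenvalues of the \(\min\) matrix:

```python
import numpy as np

ts = np.array([0.5, 1.0, 2.5, 3.0])   # arbitrary finite subset of R_+
C = np.minimum.outer(ts, ts)          # C_ij = min(t_i, t_j)

assert np.allclose(C, C.T)            # C is symmetric
eigvals = np.linalg.eigvalsh(C)       # eigenvalues of a symmetric matrix
assert eigvals.min() >= -1e-12        # all nonnegative: C is positive semidefinite
```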
Definition of the projective family and extension
For \(I = \{ t_1, \ldots , t_n\} \) a finite subset of \(\mathbb {R}_+\), let \(P^B_I\) be the multivariate Gaussian measure on \(\mathbb {R}^n\) with mean \(0\) and covariance matrix \(C_{ij} = \min (t_i, t_j)\) for \(1 \leq i,j \leq n\). We call the family of measures \(P^B_I\) the projective family of the Brownian motion.
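One way to see \(P^B_I\) concretely (an informal sketch, not the construction used in the formalization; times and the factorization below are illustrative choices, and numpy is assumed) is through independent Gaussian increments: if \(t_1 < \cdots < t_n\) and \(L\) is the cumulative-sum matrix, then \(L \, \mathrm{diag}(t_1, t_2 - t_1, \ldots) \, L^\top\) recovers exactly the covariance \(C_{ij} = \min(t_i, t_j)\):

```python
import numpy as np

ts = np.array([0.5, 1.0, 2.5, 3.0])          # t_1 < ... < t_n, arbitrary
n = len(ts)

# Increment variances: d_1 = t_1, d_k = t_k - t_{k-1} for k >= 2.
d = np.diff(np.concatenate(([0.0], ts)))

L = np.tril(np.ones((n, n)))                 # cumulative sums: B_{t_k} = sum of first k increments
C = L @ np.diag(d) @ L.T                     # covariance of (B_{t_1}, ..., B_{t_n})

assert np.allclose(C, np.minimum.outer(ts, ts))  # equals C_ij = min(t_i, t_j)
```

This factorization also re-proves positive semidefiniteness, since \(C = L D L^\top\) with \(D\) diagonal and nonnegative.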
The projective family of the Brownian motion is a projective family of measures.
Let \(J \subseteq I\) be finite subsets of \(\mathbb {R}_+\). We need to show that the restriction from \(\mathbb {R}^I\) to \(\mathbb {R}^J\) (denoted \(\pi _{IJ}\)) maps \(P^B_I\) to \(P^B_J\).
The restriction is a continuous linear map from \(\mathbb {R}^I\) to \(\mathbb {R}^J\). The pushforward of a Gaussian measure by a continuous linear map is Gaussian (Lemma 2.8). It thus suffices to show that the mean and covariance matrix of the pushforward measure equal those of \(P^B_J\).
The mean of the pushforward measure is \(0\), since the mean of \(P^B_I\) is \(0\) and the map is linear.
For the covariance matrix, let \(i, j \in J\). By Lemma 1.12,
\[ \mathrm{Cov}(\pi_{IJ\,*} P^B_I)_{ij} = \langle \pi_{IJ}^\dagger e_i, \, C \, \pi_{IJ}^\dagger e_j \rangle , \]
where \(e_i, e_j\) are standard basis vectors of \(\mathbb{R}^J\) and \(\pi _{IJ}^\dagger (u)\) is the vector of \(\mathbb {R}^I\) with coordinates \((\pi _{IJ}^\dagger (u))_i = u_i\) if \(i \in J\) and \((\pi _{IJ}^\dagger (u))_i = 0\) otherwise. Hence \(\mathrm{Cov}(\pi_{IJ\,*} P^B_I)_{ij} = C_{ij} = \min(t_i, t_j)\), which is the covariance matrix of \(P^B_J\).
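The consistency argument above can be checked numerically (an informal sketch; the sets \(I\), \(J\) below are arbitrary choices, and numpy is assumed): representing \(\pi_{IJ}\) as a selection matrix, \(\pi_{IJ} C_I \pi_{IJ}^\dagger\) is the \(J \times J\) submatrix of \(C_I\), which coincides with \(C_J\):

```python
import numpy as np

ts_I = np.array([0.5, 1.0, 2.5, 3.0])   # arbitrary finite subset I of R_+
idx_J = np.array([0, 2, 3])             # indices of a subset J within I
ts_J = ts_I[idx_J]

C_I = np.minimum.outer(ts_I, ts_I)      # covariance of P^B_I
C_J = np.minimum.outer(ts_J, ts_J)      # covariance of P^B_J

P = np.eye(len(ts_I))[idx_J]            # coordinate projection pi_IJ as a matrix
assert np.allclose(P @ C_I @ P.T, C_J)  # pi_IJ C_I pi_IJ^T = C_J
```

The key point is that taking a principal submatrix of the \(\min\) matrix on \(I\) yields exactly the \(\min\) matrix on \(J\), which is why the family is projective.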