5 Gaussian measures
A measure on \(\mathbb{R}\) is Gaussian if it is equal to \(\mathcal{N}(m, \sigma^2)\) for some \(m \in \mathbb{R}\) and \(\sigma^2 \ge 0\), where \(\mathcal{N}(m, \sigma^2)\) denotes the measure absolutely continuous with respect to the Lebesgue measure with density \(x \mapsto \frac{1}{\sqrt{2 \pi \sigma^2}}e^{- \frac{1}{2\sigma^2}(x - m)^2}\) if \(\sigma^2 > 0\), and the Dirac probability measure at \(m\) if \(\sigma^2 = 0\).
A real Gaussian measure is a probability measure.
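For \(\sigma^2 > 0\), this amounts to the normalization of the density: with the substitution \(y = (x - m)/\sigma\) and the Gaussian integral \(\int_{\mathbb{R}} e^{-y^2/2}\, dy = \sqrt{2\pi}\),
\[
\int_{\mathbb{R}} \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{1}{2\sigma^2}(x - m)^2}\, dx
= \frac{1}{\sqrt{2\pi}} \int_{\mathbb{R}} e^{-y^2/2}\, dy = 1.
\]
For \(\sigma^2 = 0\) the measure is a Dirac probability measure by definition.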
The Gaussian distribution \(\mathcal N(m, \sigma ^2)\) has characteristic function \(\phi (t) = e^{itm - \sigma ^2 t^2 /2}\).
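A sketch of the computation for \(\sigma^2 > 0\) (for \(\sigma^2 = 0\) one gets \(\phi(t) = e^{itm}\) directly): substituting \(y = x - m\), completing the square in the exponent and shifting the contour of the resulting Gaussian integral,
\[
\phi(t) = \int_{\mathbb{R}} e^{itx}\, d\mathcal{N}(m, \sigma^2)(x)
= e^{itm} \int_{\mathbb{R}} \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{ity - \frac{y^2}{2\sigma^2}}\, dy
= e^{itm - \sigma^2 t^2/2}.
\]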
The sum of two independent real Gaussian random variables is Gaussian.
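Concretely, if \(X \sim \mathcal{N}(m_1, \sigma_1^2)\) and \(Y \sim \mathcal{N}(m_2, \sigma_2^2)\) are independent, then independence turns the characteristic function of the sum into a product:
\[
\phi_{X+Y}(t) = \phi_X(t)\, \phi_Y(t)
= e^{itm_1 - \sigma_1^2 t^2/2}\, e^{itm_2 - \sigma_2^2 t^2/2}
= e^{it(m_1 + m_2) - (\sigma_1^2 + \sigma_2^2) t^2/2},
\]
so \(X + Y \sim \mathcal{N}(m_1 + m_2, \sigma_1^2 + \sigma_2^2)\), since the characteristic function determines the distribution.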
A Borel measure \(\mu \) on a separable Banach space \(E\) is said to be Gaussian if for all continuous linear maps \(\ell : E \to \mathbb {R}\), the pushforward \(\ell _*\mu \) is a real Gaussian measure.
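For example, let \(E = \mathbb{R}^2\) and \(\mu = \mathcal{N}(0, 1) \otimes \mathcal{N}(0, 1)\). Every continuous linear map has the form \(\ell(x, y) = ax + by\), and its pushforward is the law of \(aX + bY\) for \(X, Y\) independent standard Gaussians, that is
\[
\ell_*\mu = \mathcal{N}(0, a^2 + b^2),
\]
so \(\mu\) is a Gaussian measure in this sense.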
A Gaussian measure is a probability measure.
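Indeed, for any continuous linear map \(\ell\) (for instance \(\ell = 0\)) the pushforward satisfies \((\ell_*\mu)(\mathbb{R}) = \mu(E)\), and \(\ell_*\mu\) is a real Gaussian measure, hence a probability measure, so
\[
\mu(E) = (\ell_*\mu)(\mathbb{R}) = 1.
\]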
The sum of two independent Gaussian random variables is Gaussian.
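Indeed, for \(X\) and \(Y\) independent \(E\)-valued Gaussian random variables and \(\ell : E \to \mathbb{R}\) continuous linear,
\[
\ell(X + Y) = \ell(X) + \ell(Y)
\]
is a sum of two independent real Gaussian random variables, hence real Gaussian by the one-dimensional result; since \(\ell\) was arbitrary, the law of \(X + Y\) is Gaussian.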
Let \(E\) be a finite dimensional real inner product space and let \(b_1, \ldots , b_d\) be an orthonormal basis of \(E\). Let \(X_1, \ldots , X_d\) be independent standard Gaussian random variables on \(\mathbb {R}\). Then the law of \(X_1 b_1 + \ldots + X_d b_d\) is a Gaussian measure on \(E\).
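A sketch: for any continuous linear \(\ell : E \to \mathbb{R}\), the pushforward of this law under \(\ell\) is the law of
\[
\ell(X_1 b_1 + \ldots + X_d b_d) = X_1 \ell(b_1) + \ldots + X_d \ell(b_d) \sim \mathcal{N}\Big(0, \sum_{i=1}^d \ell(b_i)^2\Big),
\]
since each \(X_i \ell(b_i) \sim \mathcal{N}(0, \ell(b_i)^2)\) and sums of independent real Gaussians are Gaussian. Hence every such pushforward is real Gaussian, which is the defining property.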