CLT

1 Convergence definitions

Definition 1.1 Weak convergence of probability measures

In a topological space \(S\) whose open sets are measurable, we say that a sequence of probability measures \(\mu_n\) converges weakly to a probability measure \(\mu\), and write \(\mu_n \xrightarrow{w} \mu\), if \(\mu_n[f] \to \mu[f]\) for every bounded continuous function \(f : S \to \mathbb{R}\). Here \(\mu[f]\) denotes the integral \(\int_S f \, d\mu\).
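A standard illustration (not from the source, added for concreteness): on \(\mathbb{R}\), the Dirac masses \(\delta_{1/n}\) converge weakly to \(\delta_0\), even though the measures of individual Borel sets need not converge.

```latex
% For every bounded continuous f : R -> R, continuity of f gives
\delta_{1/n}[f] = f(1/n) \xrightarrow[n\to\infty]{} f(0) = \delta_0[f],
% so delta_{1/n} converges weakly to delta_0. Note, however, that
% delta_{1/n}({0}) = 0 for all n while delta_0({0}) = 1: weak convergence
% does not force convergence of the measures of every measurable set.
```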

We write \(\mathcal L(X)\) for the law of a random variable \(X : \Omega \to S\). This is a probability measure on \(S\), defined as the pushforward of the probability measure on \(\Omega\) under \(X\).
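Spelled out, the pushforward definition means that for every measurable set \(A \subseteq S\),

```latex
\mathcal L(X)(A) \;=\; \mathbb P\bigl(X^{-1}(A)\bigr) \;=\; \mathbb P(X \in A).
```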

Definition 1.2 Convergence in distribution

We say that \(X_n\) converges in distribution to \(X\) and write \(X_n \xrightarrow {d} X\) if \(\mathcal L(X_n) \xrightarrow {w} \mathcal L(X)\).
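Since this definition refers only to laws, convergence in distribution says nothing about the random variables themselves; the \(X_n\) need not even be defined on the same probability space as \(X\). As an illustration (not from the source): if \(X \sim \mathcal N(0,1)\) and \(X_n = -X\) for every \(n\), then by the symmetry of the standard normal distribution,

```latex
% The law of -X equals the law of X for a symmetric distribution,
% so the sequence X_n = -X converges in distribution to X trivially:
\mathcal L(X_n) = \mathcal L(-X) = \mathcal L(X) \quad\text{for all } n,
\qquad\text{hence } X_n \xrightarrow{d} X.
```

Yet \(X_n\) does not converge to \(X\) almost surely, since \(|X_n - X| = 2|X|\) for every \(n\).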