CLT

1 Convergence definitions

Definition 1.1 Convergence in probability

We say that a sequence of random variables \((X_n)_{n \in \mathbb {N}}\) with values in a space with distance function \(d\) converges to \(X\) in probability (or in measure) if for all \(\varepsilon > 0\)

\begin{align*} \mathbb {P}(d(X_n, X) \ge \varepsilon ) \to _{n \to +\infty } 0 \end{align*}

We write \(X_n \xrightarrow {p} X\).
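As a quick illustration (an example added here, not part of the original text): take \(X_n \sim \mathrm{Bernoulli}(1/n)\) and \(X = 0\) on \(\mathbb{R}\) with the usual distance. For any \(\varepsilon \in (0, 1]\),

\begin{align*} \mathbb {P}(|X_n - 0| \ge \varepsilon ) = \mathbb {P}(X_n = 1) = \frac{1}{n} \to _{n \to +\infty } 0, \end{align*}

so \(X_n \xrightarrow {p} 0\).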

Definition 1.2 Weak convergence of probability measures

In a topological space \(S\) in which the open sets are measurable, we say that a sequence of probability measures \(\mu _n\) converges weakly to a probability measure \(\mu \), and write \(\mu _n \xrightarrow {w} \mu \), if \(\mu _n[f] \to \mu [f]\) for every bounded continuous function \(f : S \to \mathbb {R}\).
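For a concrete example (added here for illustration): on \(S = \mathbb {R}\), let \(\mu _n = \delta _{1/n}\) be the Dirac mass at \(1/n\). For any bounded continuous \(f\),

\begin{align*} \mu _n[f] = f(1/n) \to _{n \to +\infty } f(0) = \delta _0[f], \end{align*}

so \(\delta _{1/n} \xrightarrow {w} \delta _0\). Note that the restriction to continuous \(f\) matters: \(\mu _n(\{ 0\} ) = 0\) for all \(n\) while \(\delta _0(\{ 0\} ) = 1\), so testing against the (discontinuous) indicator of \(\{ 0\} \) would fail.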

We write \(\mathcal L(X)\) for the law of a random variable \(X : \Omega \to S\). This is the measure on \(S\) obtained as the pushforward of the probability measure on \(\Omega \) by \(X\), that is, \(\mathcal L(X)(A) = \mathbb {P}(X^{-1}(A))\) for every measurable \(A \subseteq S\).

Definition 1.3 Convergence in distribution

We say that \(X_n\) converges in distribution to \(X\) and write \(X_n \xrightarrow {d} X\) if \(\mathcal L(X_n) \xrightarrow {w} \mathcal L(X)\).
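A small numerical sketch (added here, not part of the original text) of convergence in distribution in the CLT setting: if \(X_n\) is a standardized sum of \(n\) fair coin flips, the CLT says \(X_n \xrightarrow {d} \mathcal N(0,1)\), so the empirical probability of \(\{ X_n \le 0\} \) should approach \(\Phi (0) = 1/2\). All function names below are illustrative choices, not from the original text.

```python
import math
import random

def standardized_coin_sum(n, rng):
    """One sample of (S_n - n/2) / sqrt(n/4), where S_n counts heads in n fair flips."""
    s = sum(rng.random() < 0.5 for _ in range(n))
    return (s - n / 2) / math.sqrt(n / 4)

def empirical_cdf_at_zero(n, samples, seed=0):
    """Monte Carlo estimate of P(X_n <= 0)."""
    rng = random.Random(seed)
    hits = sum(standardized_coin_sum(n, rng) <= 0 for _ in range(samples))
    return hits / samples

# By the CLT, X_n converges in distribution to N(0, 1), so this
# estimate should be close to Phi(0) = 0.5 for large n.
# (n is odd so that the discrete sum has no atom exactly at 0.)
print(empirical_cdf_at_zero(n=201, samples=5000))
```

Convergence in distribution only constrains the law of \(X_n\), not the random variables themselves: here each \(X_n\) is built from fresh coin flips, with no pointwise relationship to a limiting variable.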