## § Convergence in distribution is very weak

- Consider $X \sim N(0, 1)$. Also consider $-X$, which is identically distributed (by the symmetry of $N(0, 1)$ about zero).
- So we have that $-X \sim N(0, 1)$.
- But equality in distribution tells us nothing about $X$ and $-X$ as random variables --- they are almost never equal! So this type of "convergence in distribution" is very weak.
- Strongest notion of convergence (#1): almost surely. $T_n \xrightarrow{a.s.} T$ iff $P(\{ \omega : T_n(\omega) \to T(\omega) \}) = 1$. Consider a snowball left out in the sun. In a couple of hours, it'll have a random shape, random volume, and so on. But the ball itself is a definite thing --- the $\omega$. Almost sure convergence says that for almost all of the balls, $T_n$ converges to $T$.
- #2 notion of convergence: convergence in probability. $T_n \xrightarrow{P} T$ iff $P(|T_n - T| \geq \epsilon) \xrightarrow{n \to \infty} 0$ for all $\epsilon > 0$. This allows us to sweep an event of vanishing probability under the rug.
- Convergence in $L^p$: $T_n \xrightarrow{L^p} T$ iff $E[|T_n - T|^p] \xrightarrow{n \to \infty} 0$. E.g. $p = 2$ is convergence in mean square; think of Gaussians whose variances converge.
- Convergence in distribution (weakest): $T_n \xrightarrow{d} T$ iff $P[T_n \leq x] \xrightarrow{n \to \infty} P[T \leq x]$ for all $x$ at which the CDF of $T$ is continuous.
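The $X$ vs. $-X$ point can be sanity-checked numerically. A minimal NumPy sketch (not from the notes): the two variables have nearly identical empirical quantiles, yet $|X - (-X)| = 2|X|$ is far from zero.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)  # X ~ N(0, 1)

# Same distribution: empirical quantiles of X and -X agree closely.
q = np.linspace(0.01, 0.99, 99)
quantile_gap = np.max(np.abs(np.quantile(x, q) - np.quantile(-x, q)))

# Very different random variables: |X - (-X)| = 2|X|, with mean 2*sqrt(2/pi) ≈ 1.6.
mean_gap = np.mean(np.abs(x - (-x)))
print(quantile_gap, mean_gap)
```

Here `quantile_gap` is tiny while `mean_gap` is about 1.6 --- equality in distribution says nothing about the random variables being close.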

#### § Characterization of convergence in distribution

- (1) $T_n \xrightarrow{d} T$
- (2) For all $f$ continuous and bounded, we have $E[f(T_n)] \xrightarrow{n \to \infty} E[f(T)]$.
- (3) We have $E[e^{ixT_n}] \xrightarrow{n \to \infty} E[e^{ixT}]$ for all $x$ [the characteristic functions converge pointwise].
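Criterion (3) can be watched in action via the CLT. A sketch under assumed setup (standardized coin flips, not an example from the notes): $T_n \xrightarrow{d} N(0,1)$, so $E[e^{ixT_n}]$ should approach the $N(0,1)$ characteristic function $e^{-x^2/2}$.

```python
import numpy as np

rng = np.random.default_rng(1)

# T_n = standardized mean of n fair coin flips -> N(0, 1) in distribution (CLT).
n, reps = 500, 20_000
flips = rng.integers(0, 2, size=(reps, n))
t_n = (flips.mean(axis=1) - 0.5) / (0.5 / np.sqrt(n))

# Characteristic function check at a few points x:
# E[e^{ix T_n}] should be close to e^{-x^2/2}, the N(0,1) characteristic function.
diffs = []
for x in [0.5, 1.0, 2.0]:
    emp = np.mean(np.exp(1j * x * t_n))
    diffs.append(abs(emp - np.exp(-x**2 / 2)))
print(diffs)
```

All three gaps come out small, consistent with pointwise convergence of the characteristic functions.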

#### § Strength of different types of convergence

- Almost sure convergence implies convergence in probability. Also, the two limits (which are RVs) are almost surely equal.
- Convergence in $L^p$ implies convergence in probability and convergence in $L^q$ for all $q \leq p$. Also, the limits (which are RVs) are almost surely equal.
- If $T_n$ converges in probability, it also converges in distribution (the two limits then have the same DISTRIBUTION, not the same RV).
- All of almost sure, in-probability, and in-distribution convergence (not $L^p$) are preserved by continuous functions: $T_n \to T$ implies $f(T_n) \to f(T)$ in the same mode (continuous mapping theorem).
- In summary: almost sure $\implies$ in probability $\implies$ in distribution.
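The continuous mapping property for convergence in probability can be sketched numerically. Assumed setup (my illustration, not from the notes): $T_n = T + Z_n/\sqrt{n} \xrightarrow{P} T$, and $f = \exp$ is continuous, so $\exp(T_n) \xrightarrow{P} \exp(T)$.

```python
import numpy as np

rng = np.random.default_rng(2)
reps = 100_000
t = rng.standard_normal(reps)  # the limit T

# T_n = T + Z_n / sqrt(n) converges to T in probability as n grows.
# Continuous mapping: f = exp is continuous, so exp(T_n) -> exp(T) in probability too.
eps = 0.1
ps = []
for n in [10, 100, 1000]:
    t_n = t + rng.standard_normal(reps) / np.sqrt(n)
    ps.append(np.mean(np.abs(np.exp(t_n) - np.exp(t)) >= eps))
print(ps)
```

The estimated probabilities $P(|e^{T_n} - e^{T}| \geq \epsilon)$ shrink as $n$ grows, as the theorem predicts.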

#### § Slutsky's Theorem

- If $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{P} c$ for a constant $c$ (that is, the $Y_n$ become deterministic in the limit), then $(X_n, Y_n) \xrightarrow{d} (X, c)$. In particular, we get that $X_n + Y_n \xrightarrow{d} X + c$ and $X_n Y_n \xrightarrow{d} cX$.
- This is important because, in general, convergence in distribution of two sequences says nothing about the joint behavior of the RVs! But in this special case, where one limit is a constant, combining the sequences is possible.
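A Slutsky sketch under assumed setup (iid uniforms; my illustration, not from the notes): $X_n$ is the standardized sample mean ($\xrightarrow{d} N(0,1)$ by the CLT) and $Y_n$ is the raw sample mean ($\xrightarrow{P} 1/2$ by the LLN), so $X_n + Y_n \xrightarrow{d} N(0,1) + 1/2$ and $X_n Y_n \xrightarrow{d} X/2$.

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 1000, 20_000
u = rng.uniform(size=(reps, n))  # iid Uniform(0, 1): mean 1/2, variance 1/12

# X_n: standardized sample mean -> N(0, 1) in distribution (CLT).
x_n = (u.mean(axis=1) - 0.5) * np.sqrt(12 * n)
# Y_n: sample mean -> 1/2 in probability (LLN).
y_n = u.mean(axis=1)

# Slutsky: X_n + Y_n -> N(0,1) + 1/2, and X_n * Y_n -> X/2 in distribution.
sum_mean = np.mean(x_n + y_n)   # ≈ 0.5
sum_std = np.std(x_n + y_n)     # ≈ 1
prod_std = np.std(x_n * y_n)    # ≈ 0.5
print(sum_mean, sum_std, prod_std)
```

The empirical moments match the Slutsky limits: $X_n + Y_n$ looks like $N(1/2, 1)$ and $X_n Y_n$ has the spread of $X/2$.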

#### § References