# Types of convergence

## Convergence in distribution

Let $X_1, X_2, \dots$ be a sequence of random variables with distribution functions $F_1, F_2, \dots$, and let $X$ be a random variable with distribution function $F$. We say that $X_n$ converges in distribution to $X$, written $X_n \xrightarrow{d} X$, when

$$\lim_{n \to \infty} F_n(x) = F(x)$$

for all continuity points $x$ of $F$.

Relation to functional analysis: convergence in distribution is pointwise convergence of the distribution functions on the set of continuity points. By abuse of notation, we extend this definition to sequences of random variables/vectors, even though it is really their distributions that converge.

## Convergence in probability

Let $X_1, X_2, \dots$ and $X$ be random variables defined on the same probability space. We say that $X_n$ converges in probability to $X$, written $X_n \xrightarrow{p} X$, when

$$\lim_{n \to \infty} P(|X_n - X| > \varepsilon) = 0$$

for all $\varepsilon > 0$.

Since a random variable that is close to $X$ with high probability also has its distribution function close to that of $X$ at every continuity point, convergence in probability implies convergence in distribution.

## Difference between p and d convergence

Example: let $X \sim N(0, 1)$ and $X_n = -X$ for all $n$. Then $X_n \xrightarrow{d} X$, since both sides have the same $N(0,1)$ distribution, but $X_n$ does not converge in probability to $X$, because $|X_n - X| = 2|X|$ does not shrink.

There is a partial converse when the limit is a constant: if $X_n \xrightarrow{d} c$ for a constant $c$, then $X_n \xrightarrow{p} c$.

## Bonus: Cramér-Wold Device

As a side note, there is a link between univariate and multivariate convergence in distribution. Let $X_1, X_2, \dots$ and $X$ be random vectors in $\mathbb{R}^k$. Then

$$X_n \xrightarrow{d} X \iff t^\top X_n \xrightarrow{d} t^\top X \quad \text{for every fixed } t \in \mathbb{R}^k.$$

# Fundamental convergence theorems

## Law of large numbers

Let $X_1, X_2, \dots$ be i.i.d. random variables with finite mean $E[X_1] = \mu$. Then

$$\bar{X}_n = \frac{1}{n} \sum_{i=1}^n X_i \xrightarrow{p} \mu.$$

Interpretation: since it is close to $\mu$ for large $n$, the sample mean $\bar{X}_n$ can be used as an approximation of the unknown population mean. But what is the uncertainty associated with this approximation? Under slightly stronger assumptions on the sequence, the following theorem is the answer.

## Central limit theorem

Let $X_1, X_2, \dots$ be i.i.d. random variables with mean $\mu$ and finite variance $\sigma^2$. Then

$$\sqrt{n} \left( \bar{X}_n - \mu \right) \xrightarrow{d} N(0, \sigma^2).$$

When the dimension is greater than one, the analogous statement holds. Let $X_1, X_2, \dots$ be i.i.d. random vectors with mean $\mu$ and covariance matrix $\Sigma$; then

$$\sqrt{n} \left( \bar{X}_n - \mu \right) \xrightarrow{d} N(0, \Sigma).$$

Interpretation: as the sample size grows, the distribution of $\bar{X}_n$ is approximately $N(\mu, \sigma^2 / n)$. Notice that the standard deviation shrinks at the speed of $1/\sqrt{n}$.

## Weighted sum central limit theorem

A more general version of the CLT (the Lindeberg-Feller theorem for triangular arrays) is often useful when combined with the tools presented in the next section. Let $X_{n,1}, \dots, X_{n,k_n}$ be independent random variables with means $\mu_{n,i}$ and finite variances $\sigma_{n,i}^2$. Use the following notation:

$$s_n^2 = \sum_{i=1}^{k_n} \sigma_{n,i}^2.$$

If, in the limit, any single component contributes a negligible proportion of the total variance, i.e.:

$$\lim_{n \to \infty} \frac{1}{s_n^2} \sum_{i=1}^{k_n} E\left[ (X_{n,i} - \mu_{n,i})^2 \, \mathbf{1}\{ |X_{n,i} - \mu_{n,i}| > \varepsilon s_n \} \right] = 0 \quad \text{for all } \varepsilon > 0,$$

then:

$$\frac{1}{s_n} \sum_{i=1}^{k_n} \left( X_{n,i} - \mu_{n,i} \right) \xrightarrow{d} N(0, 1).$$

Setting $k_n = n$ and $X_{n,i} = X_i$ for an i.i.d. sequence recovers the classical central limit theorem.

# New approximations from old ones

These theorems are used to approximate complicated distributions by simpler ones. Here are some transformation results that let us obtain new approximations from the old ones.

## Continuous mapping theorem

Let $X_n \xrightarrow{d} X$ and let $g$ be a function that is continuous on a set $C$ with $P(X \in C) = 1$. Then $g(X_n) \xrightarrow{d} g(X)$. The same statement holds with convergence in probability.

## Slutsky’s theorem

Let $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$ for a constant $c$. Then

$$X_n + Y_n \xrightarrow{d} X + c, \qquad Y_n X_n \xrightarrow{d} cX, \qquad X_n / Y_n \xrightarrow{d} X / c \ \ (c \neq 0).$$

The continuous mapping theorem would be applicable if the joint distribution of $(X_n, Y_n)$ converged, but marginal convergence does not imply joint convergence in general; it does when one of the limits is a constant, which is why Slutsky’s theorem holds.

## The delta method

Let $\sqrt{n}(X_n - \theta) \xrightarrow{d} N(0, \sigma^2)$ and let $g$ be differentiable at $\theta$. Then

$$\sqrt{n} \left( g(X_n) - g(\theta) \right) \xrightarrow{d} N\left( 0, g'(\theta)^2 \sigma^2 \right),$$

where $g'(\theta)$ is the derivative of $g$ at $\theta$. In the multivariate case, with $\sqrt{n}(X_n - \theta) \xrightarrow{d} N(0, \Sigma)$, the derivative is replaced by the Jacobian matrix $J_g(\theta)$:

$$\sqrt{n} \left( g(X_n) - g(\theta) \right) \xrightarrow{d} N\left( 0, J_g(\theta) \, \Sigma \, J_g(\theta)^\top \right).$$
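
The law of large numbers and the central limit theorem can be checked numerically. A minimal simulation sketch, assuming NumPy is available; the choice of an exponential distribution and the sample sizes are illustrative, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 1.0  # mean and standard deviation of the Exp(1) distribution

# Law of large numbers: the sample mean approaches mu as n grows.
for n in (100, 10_000, 1_000_000):
    xbar = rng.exponential(scale=1.0, size=n).mean()
    print(n, xbar)

# Central limit theorem: sqrt(n) * (xbar - mu) is approximately N(0, sigma^2).
n, reps = 1_000, 2_000
samples = rng.exponential(scale=1.0, size=(reps, n))
z = np.sqrt(n) * (samples.mean(axis=1) - mu)
print(z.mean(), z.std())  # close to 0 and sigma = 1
```

Even though the exponential distribution is quite skewed, the standardized sample means are already close to Gaussian at moderate $n$, and the spread of $\bar{X}_n$ around $\mu$ visibly shrinks like $1/\sqrt{n}$.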
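
A classic use of Slutsky’s theorem is the studentized mean: by the CLT, $\sqrt{n}(\bar{X}_n - \mu) \xrightarrow{d} N(0, \sigma^2)$, and the sample standard deviation $S_n \xrightarrow{p} \sigma$ (LLN plus the continuous mapping theorem), so $\sqrt{n}(\bar{X}_n - \mu)/S_n \xrightarrow{d} N(0, 1)$ even though $\sigma$ is unknown. A simulation sketch, again with illustrative exponential data:

```python
import numpy as np

# Slutsky sketch: the studentized mean sqrt(n) * (xbar - mu) / S_n converges
# in distribution to N(0, 1), because S_n converges in probability to sigma.
rng = np.random.default_rng(1)
n, reps = 1_000, 2_000
mu = 1.0  # true mean of Exp(1)

x = rng.exponential(scale=1.0, size=(reps, n))
t = np.sqrt(n) * (x.mean(axis=1) - mu) / x.std(axis=1, ddof=1)
print(t.mean(), t.std())  # approximately 0 and 1
```
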
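
The delta method can be checked the same way. A sketch for $g(x) = \log x$ applied to the mean of i.i.d. Exp(1) draws, so that $\mu = \sigma = 1$ and the limiting variance $g'(\mu)^2 \sigma^2 = 1$; all of these choices are illustrative:

```python
import numpy as np

# Delta method sketch: sqrt(n) * (Xbar - mu) -> N(0, sigma^2) implies, for
# g(x) = log(x), that sqrt(n) * (g(Xbar) - g(mu)) -> N(0, g'(mu)^2 * sigma^2).
# With Exp(1) data, mu = sigma = 1 and g'(mu)^2 * sigma^2 = 1.
rng = np.random.default_rng(2)
n, reps = 1_000, 2_000
mu = 1.0

xbar = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
z = np.sqrt(n) * (np.log(xbar) - np.log(mu))
print(z.mean(), z.std())  # std close to |g'(mu)| * sigma = 1
```
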