Rylan Schaeffer



Parametric Distributions

Many common distributions depend on specific parameters, which are frequently classified into one of several types, such as location, scale, and shape parameters.

Some common discrete, continuous, and special parametric distributions are grouped below:

Discrete Distributions

Continuous Distributions

Classes of Distribution

Probability Theory

Distances and Divergences

Probability distances and divergences share several commonly encountered properties, such as non-negativity, identity of indiscernibles, and (for true metrics) symmetry and the triangle inequality. Common examples include the total variation distance, the Kullback-Leibler divergence, and the Wasserstein distance.

Probability Integral Transform

Theorem: For any continuous random variable \(X\) with invertible CDF \(F_X\), the random variable \(Y = F_X(X)\) is distributed uniformly over \((0,1)\). That is, \(Y \sim \mathcal{U}(0,1)\).

Proof: $$ \begin{align*} P(Y \leq y) &= P(F_X(X) \leq y)\\ &= P(X \leq F_X^{-1}(y))\\ &= F_X(F_X^{-1}(y))\\ &= y \end{align*} $$ Since \(\mathcal{U}(0,1)\) is the only distribution whose CDF satisfies \(F_Y(y) = P(Y \leq y) = y\) on \((0,1)\), we conclude that \(Y\) is distributed uniformly.
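The theorem above can be checked numerically. The sketch below (an illustration, not part of the original notes) draws from an exponential distribution, whose CDF \(F(x) = 1 - e^{-x}\) is known in closed form, applies the transform \(Y = F_X(X)\), and verifies that the transformed samples have the mean and variance of \(\mathcal{U}(0,1)\):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw from an Exponential(1) distribution; its CDF is F(x) = 1 - exp(-x).
x = rng.exponential(scale=1.0, size=100_000)

# Probability integral transform: Y = F_X(X).
y = 1.0 - np.exp(-x)

# If the theorem holds, Y ~ U(0,1): mean 1/2, variance 1/12 (about 0.083).
print(y.mean())  # close to 0.5
print(y.var())   # close to 1/12
```

This is also the basis of inverse transform sampling: applying \(F_X^{-1}\) to uniform samples produces samples from \(F_X\).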

Notions of Convergence

A sequence of random variables \(X_1, X_2, \ldots\) converges in probability to \(X\) if, for every \(\epsilon > 0\),

\[\lim_{n \rightarrow \infty} P(\lvert X_n - X\rvert < \epsilon) = 1\]

The Weak Law of Large Numbers states that if the random variables \(\{X_i\}_{i=1}^n\) are i.i.d. with \(\mathbb{E}[X_i] = \mu < \infty\) and \(\mathbb{V}[X_i] = \sigma^2 < \infty\), then the sample mean \(\bar{X}_n = \frac{1}{n} \sum_{i=1}^n X_i\) converges in probability to \(\mu\).

Proof: Use [Chebyshev's Inequality](#chebychevs-inequality): $$ \begin{align*} P(\lvert\bar{X}_n - \mu\rvert \geq \epsilon ) &= P(\lvert\bar{X}_n - \mu\rvert^2 \geq \epsilon^2 )\\ &\leq \frac{\mathbb{E}[(\bar{X}_n - \mu)^2]}{\epsilon^2}\\ &= \mathbb{V}[\bar{X}_n] / \epsilon^2\\ &= \sigma^2 / n \epsilon^2 \end{align*} $$ Then, taking the limit as $$n \rightarrow \infty$$: $$ \lim_{n \rightarrow \infty} P(\lvert\bar{X}_n - \mu\rvert < \epsilon) \geq 1 - \lim_{n \rightarrow \infty} \frac{\sigma^2}{n \epsilon^2} = 1 $$
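The \(\sigma^2 / n \epsilon^2\) bound suggests the sample mean's error should shrink as \(n\) grows. A small numerical sketch (the Poisson distribution and its mean here are illustrative choices, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(1)
mu = 3.0  # true mean; Poisson(mu) also has variance sigma^2 = mu

# As n grows, the sample mean concentrates around mu,
# consistent with the Chebyshev bound sigma^2 / (n * eps^2).
for n in [10, 1_000, 100_000]:
    xbar = rng.poisson(lam=mu, size=n).mean()
    print(f"n={n:>6}  |xbar - mu| = {abs(xbar - mu):.4f}")
```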
A sequence of random variables \(X_1, X_2, \ldots\) converges almost surely to \(X\) if

\[P(\lim_{n \rightarrow \infty} X_n = X) = 1\]

Convergence almost surely implies convergence in probability.
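One standard proof sketch of this implication (assuming the dominated convergence theorem): if \(X_n \rightarrow X\) almost surely, then for any \(\epsilon > 0\) the indicator \(\mathbb{1}[\lvert X_n - X\rvert \geq \epsilon] \rightarrow 0\) almost surely. Since indicators are bounded by the constant 1, dominated convergence lets us exchange the limit and the expectation:

$$\lim_{n \rightarrow \infty} P(\lvert X_n - X\rvert \geq \epsilon) = \lim_{n \rightarrow \infty} \mathbb{E}\big[\mathbb{1}[\lvert X_n - X\rvert \geq \epsilon]\big] = \mathbb{E}\big[\lim_{n \rightarrow \infty} \mathbb{1}[\lvert X_n - X\rvert \geq \epsilon]\big] = 0$$

which is exactly convergence in probability. The converse does not hold in general.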