Chernoff Bounds

Let \(X\) be a random variable. Because \(x \mapsto e^{tx}\) is increasing for every \(t > 0\), applying Markov’s Inequality to the moment generating function of \(X\) yields the Chernoff bound: for all \(t > 0\),

\[\mathbb{P}[X \geq a] = \mathbb{P}[e^{t X} \geq e^{t a}] \leq \frac{\mathbb{E}[e^{t X}]}{e^{t a}}\]
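
Since this holds for every \(t > 0\), we are free to choose the \(t\) that makes the right-hand side smallest. As a quick illustration (a standard worked example, not one from this note), if \(X \sim \mathcal{N}(0, 1)\) then \(\mathbb{E}[e^{t X}] = e^{t^2 / 2}\), and the exponent \(t^2/2 - ta\) is minimized at \(t = a\), giving, for \(a > 0\):

\[\mathbb{P}[X \geq a] \leq \min_{t > 0} e^{t^2/2 - t a} = e^{-a^2 / 2}\]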

Properties

If \(X = \sum_{i=1}^n X_i\) is a sum of independent Bernoulli random variables with \(\mu = \mathbb{E}[X]\), then optimizing the Chernoff bound over \(t\) yields the multiplicative forms: \(\forall \delta > 0\),

\[\mathbb{P}[X \geq (1 + \delta) \mu] \leq \Big( \frac{e^{\delta}}{(1 + \delta)^{1+\delta}} \Big)^{\mu}\]

and

\[\forall \delta \in (0, 1), \quad \mathbb{P}[X \leq (1-\delta) \mu] \leq \Big( \frac{e^{-\delta}}{(1 - \delta)^{1-\delta}} \Big)^{\mu}\]
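
As a quick numerical sanity check (my own sketch, not part of the original note; the function names are arbitrary), the upper-tail bound can be compared against the exact tail of a Binomial distribution, which is exactly such a sum of independent Bernoullis:

```python
import math

def chernoff_upper_tail(mu: float, delta: float) -> float:
    # Bound on P[X >= (1 + delta) * mu] for a sum of independent Bernoullis.
    return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

def chernoff_lower_tail(mu: float, delta: float) -> float:
    # Bound on P[X <= (1 - delta) * mu], valid for 0 < delta < 1.
    return (math.exp(-delta) / (1 - delta) ** (1 - delta)) ** mu

def binom_upper_tail(n: int, p: float, k: int) -> float:
    # Exact P[X >= k] for X ~ Binomial(n, p).
    return sum(math.comb(n, j) * p**j * (1 - p) ** (n - j) for j in range(k, n + 1))

n, p, delta = 100, 0.5, 0.2
mu = n * p
print(binom_upper_tail(n, p, math.ceil((1 + delta) * mu)))  # exact tail probability
print(chernoff_upper_tail(mu, delta))                       # Chernoff bound: larger, but valid
```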

Examples

Fair Coin Flips

Let \(X\) be the number of heads after \(n\) flips of a fair coin, and consider the probability that \(X \geq 3n/4\). Markov’s Inequality tells us that this probability is at most:

\[\mathbb{P}[X \geq 3n/4] \leq \frac{\mathbb{E}[X]}{3n/4} = \frac{n/2}{3n/4} = \frac{2}{3}\]

Intuitively, that feels loose, especially if \(n\) is large. What does Chebyshev’s Inequality tell us? Since \(\mathrm{Var}[X] = n/4\), bounding the deviation \(|X - n/2| \geq n/4\) gives:

\[\mathbb{P}[X \geq 3n/4] \leq \mathbb{P}[|X - n/2| \geq n/4] \leq \frac{n/4}{(n/4)^2} = \frac{4}{n}\]

That’s inversely proportional to \(n\), which is better! But intuitively, we expect that a bound of \(\exp(-\Omega(n))\) should be possible. Applying the multiplicative Chernoff bound with \(\mu = n/2\) and \(\delta = 1/2\) gives exactly this exponential decay in \(n\):

\[\mathbb{P}[X \geq 3n/4] \leq \Big( \frac{e^{1/2}}{(3/2)^{3/2}} \Big)^{n/2} \approx (0.898)^{n/2} = e^{-\Omega(n)}\]
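
To make the comparison concrete, here is a small sketch (my own, not from the original note) that evaluates the exact probability of \(X \geq 3n/4\) alongside the Markov, Chebyshev, and Chernoff bounds for a few values of \(n\):

```python
import math

def exact_tail(n: int) -> float:
    # Exact P[X >= 3n/4] for X ~ Binomial(n, 1/2).
    k = math.ceil(3 * n / 4)
    return sum(math.comb(n, j) for j in range(k, n + 1)) / 2**n

def chernoff(n: int) -> float:
    # Multiplicative Chernoff bound with mu = n/2 and delta = 1/2.
    mu, delta = n / 2, 0.5
    return (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

for n in (20, 100, 500):
    print(f"n={n}: exact={exact_tail(n):.3e}, Markov={2/3:.3f}, "
          f"Chebyshev={4/n:.3f}, Chernoff={chernoff(n):.3e}")
```

For small \(n\) the Chebyshev bound can actually be tighter, but the Chernoff bound decays exponentially and dominates once \(n\) grows.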