
Concentration inequalities: tail bounds without distribution

Tail bounds (Markov, Chebyshev, Hoeffding) tell you how unlikely a deviation is without knowing the distribution. Cheaper assumptions, looser bounds.

Intro

Markov, Chebyshev, Hoeffding: three tail bounds, three levels of structure. Markov needs only $E[X]$ and nonnegativity: $P(X \ge a) \le E[X]/a$. Chebyshev needs the variance: $P(|X - \mu| \ge t) \le \sigma^2/t^2$. Hoeffding needs independence plus bounded support: for i.i.d. $X_i \in [0,1]$ with sample mean $\bar{X}$, $P(\bar{X} - \mu \ge t) \le e^{-2nt^2}$. Each gives an upper bound on a tail probability without committing to a distribution. Used everywhere: PAC learning, A/B testing sample sizes, Monte Carlo error bars, finance risk floors.
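A minimal sketch comparing the three bounds on a concrete example. The setup (sample mean of $n$ i.i.d. Uniform(0, 1) draws, threshold $a = 0.65$) is an illustrative choice, not from the source; it satisfies all three sets of assumptions, so the bounds are directly comparable against a Monte Carlo estimate of the true tail.

```python
import math
import random

# Illustrative setup (assumed, not from the text): X is the mean of
# n i.i.d. Uniform(0, 1) draws, so E[X] = 0.5, Var(X) = 1/(12n),
# and each draw is bounded in [0, 1].
n = 100
mu = 0.5
var = 1.0 / (12 * n)
a = 0.65            # ask: how likely is P(X >= a)?
t = a - mu          # deviation from the mean

# Markov: needs only E[X] and X >= 0.
markov = mu / a

# Chebyshev: needs the variance.  P(|X - mu| >= t) <= var / t^2.
chebyshev = var / t**2

# Hoeffding: needs independence + bounded support.
# P(mean - mu >= t) <= exp(-2 n t^2) for draws in [0, 1].
hoeffding = math.exp(-2 * n * t**2)

# Monte Carlo estimate of the actual tail probability.
random.seed(0)
trials = 20_000
hits = sum(
    sum(random.random() for _ in range(n)) / n >= a
    for _ in range(trials)
)
empirical = hits / trials

print(f"Markov    <= {markov:.4f}")
print(f"Chebyshev <= {chebyshev:.4f}")
print(f"Hoeffding <= {hoeffding:.4f}")
print(f"empirical  ~ {empirical:.4f}")
```

Running this shows the trade-off from the paragraph above: each added assumption tightens the bound (Markov is loosest, Hoeffding tightest here), yet even Hoeffding is conservative next to the empirical tail, because these bounds must hold for *every* distribution meeting their assumptions.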
