Reference card · 30 facts

Quant Interview Cheat Sheet

The 30 facts every quant candidate should know cold — the ones that turn calculator-grade problems into 30-second mental arithmetic. Bookmark this; revisit the night before any quant interview.

3 worked examples — the cheat sheet in action

Birthday paradox in a 50-person room

#21 Birthday · #16 Pascal

Eyeball it: $n > 23 \Rightarrow P > 0.5$. For $n = 50$, the count of pairs is $\binom{50}{2} = 1225$.

$P(\text{some collision}) = 1 - \left(\frac{364}{365}\right)^{1225} \approx 1 - e^{-1225/365} \approx 1 - e^{-3.36} \approx 0.965$.

So about 97% (the exact answer is $0.970$). The cheat sheet lets you skip the calculator entirely: the small-$x$ exponential approximation from fact #25 plus the $1/e$ memo from fact #1.
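The eyeball estimate above can be checked in a few lines of Python (a quick sketch; the function names are illustrative):

```python
import math

def birthday_collision(n, days=365):
    """Exact P(at least one shared birthday) among n people."""
    p_unique = 1.0
    for k in range(n):
        p_unique *= (days - k) / days
    return 1.0 - p_unique

def pair_approx(n, days=365):
    """Cheat-sheet estimate: 1 - exp(-C(n,2)/days)."""
    return 1.0 - math.exp(-math.comb(n, 2) / days)

print(pair_approx(50))         # ~0.965, the mental-arithmetic answer
print(birthday_collision(50))  # ~0.970, the exact answer
```

The pair approximation undershoots slightly because it treats the 1225 pairs as independent, but it lands within half a percentage point of the exact product.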

Expected fixed points in a random permutation of 8

#15 Poisson · #1 1/e

Linearity: $E[X] = \sum_{i=1}^{n} P(\text{card } i \text{ fixed}) = n \cdot \frac{1}{n} = 1$.

Variance also equals 1; for any moderate $n$ the count converges fast to $\text{Poisson}(1)$. So $P(\text{no fixed points}) \approx 1/e \approx 0.368$ — which is the derangement formula from fact #23.

The cheat sheet collapses three identities (linearity, Poisson limit, $1/e$) into one second of recognition.
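A quick check confirms both the mean and the $1/e$ limit (a sketch; `derangement_count` is an illustrative helper, not a standard library function):

```python
import math
import random

def derangement_count(n):
    """!n = round(n!/e), exact for n >= 1 (fact #23)."""
    return round(math.factorial(n) / math.e)

n = 8
p_no_fixed = derangement_count(n) / math.factorial(n)
print(p_no_fixed, 1 / math.e)  # already indistinguishable at n = 8

# Monte Carlo check that E[#fixed points] = 1
random.seed(0)
trials = 100_000
fixed = sum(
    sum(i == v for i, v in enumerate(random.sample(range(n), n)))
    for _ in range(trials)
)
print(fixed / trials)  # close to 1
```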

Fair price for "$1 per head until first tail"

#5 Geometric series · #11 Binomial

The $k$-th dollar is paid iff the first $k$ flips are all heads, which happens with probability $(1/2)^k$. By linearity:

$E[\text{payoff}] = \sum_{k=1}^{\infty} \left(\tfrac{1}{2}\right)^k = \frac{1/2}{1 - 1/2} = 1.$

Fair price: $1. Fact #5 (geometric series sum) plus linearity over the per-head indicators (the same idea behind fact #11's binomial mean $np$) collapses an infinite-state EV (Expected Value) into one line.
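A sketch verifying the $1 price both ways, by truncated sum and by simulation:

```python
import random

# Exact: sum of (1/2)^k for k >= 1, truncated where terms vanish
exact = sum(0.5**k for k in range(1, 60))
print(exact)  # 1.0 to machine precision

# Simulation: $1 per head until the first tail
def payoff(rng):
    heads = 0
    while rng.random() < 0.5:
        heads += 1
    return heads

rng = random.Random(1)
trials = 200_000
est = sum(payoff(rng) for _ in range(trials)) / trials
print(est)  # ~1.0
```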

Constants

4 facts
  1. $e \approx 2.72,\ \ 1/e \approx 0.368,\ \ e^2 \approx 7.39$
    Anchors derangements, optimal stopping, and the limit $(1 + 1/n)^n$. Without this you can't eyeball $1/e$ problems.
    → Exponential
  2. $\pi \approx 3.14,\ \ \pi/4 \approx 0.785,\ \ \sqrt{\pi} \approx 1.77$
    Buffon's needle, Gaussian normalisation ($\int_{-\infty}^{\infty} e^{-x^2}\,dx = \sqrt{\pi}$), unit-circle Monte Carlo.
    → Geometric prob.
  3. $\ln 2 \approx 0.693,\ \ \log_{10} 2 \approx 0.301,\ \ \ln 10 \approx 2.30$
    Rule-of-70 doubling time, decibels, log-base conversions. Lets you turn $2^n$ questions into base-10.
    → Rule of 70
  4. $\sqrt{2} \approx 1.414,\ \ \sqrt{3} \approx 1.732,\ \ \sqrt{5} \approx 2.236$
    Golden ratio $\phi = (1+\sqrt{5})/2$, RMS (Root Mean Square) spreads, hypotenuse approximations, vol scaling $\sigma\sqrt{T}$.
    → Hypotenuse
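These anchors are easy to verify, and the rule-of-70 entry is worth one worked check (a sketch; the 5% rate is illustrative):

```python
import math

print(math.e, 1 / math.e, math.e**2)  # 2.718..., 0.3679..., 7.389...
print(math.log(2), math.log10(2))     # 0.6931..., 0.3010...

# Rule of 70: doubling time at r% growth ~ 70 / r,
# since ln 2 ~ 0.693 and ln(1 + r/100) ~ r/100 for small r
r = 5  # illustrative growth rate, in percent
doubling = math.log(2) / math.log(1 + r / 100)
print(doubling, 70 / r)  # ~14.2 vs 14.0
```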

Series identities

6 facts
  5. Geometric: $\displaystyle\sum_{n=0}^{\infty} r^n = \frac{1}{1-r}$ for $|r| < 1$
    Every "expected payoff under repeated trials" problem collapses to this. Coin-flip until tail = $1$ via $r = 1/2$.
    → Geometric dist.
  6. Weighted: $\displaystyle\sum_{n=1}^{\infty} n \cdot r^n = \frac{r}{(1-r)^2}$
    Mean of a geometric distribution drops out instantly; used in expected stopping times for renewal processes.
    → Linearity
  7. Basel: $\displaystyle\sum_{n=1}^{\infty} \frac{1}{n^2} = \frac{\pi^2}{6} \approx 1.645$
    Variance of long-tail distributions, $\zeta(2)$ in number theory, convergence checks for $1/n^p$.
  8. $\displaystyle\sum_{n=0}^{\infty} \frac{1}{n!} = e;\ \ \sum_{n=0}^{\infty} \frac{(-1)^n}{n!} = \frac{1}{e}$
    Derangement count $!n \approx n!/e$; the Poisson PMF (Probability Mass Function) normalises via this. A Rosetta stone for $e$ in combinatorics.
    → Inclusion-exclusion
  9. Harmonic: $H_n = \displaystyle\sum_{k=1}^{n} \frac{1}{k} \approx \ln n + 0.577$
    Coupon collector: $E[\text{trials}] = nH_n$. Expected number of records in a random permutation.
    → Linearity
  10. Telescoping: $\dfrac{1}{n(n+1)} = \dfrac{1}{n} - \dfrac{1}{n+1}$
    Closes infinite sums in one line. Often the trick for "expected absorbing time" on cleanly indexed chains.
    → Markov chains
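Each identity can be confirmed numerically with partial sums (a quick sketch; the cutoffs are chosen so the tails are negligible):

```python
import math

r = 0.5
geometric = sum(r**n for n in range(200))         # fact #5
weighted = sum(n * r**n for n in range(1, 200))   # fact #6
basel = sum(1 / n**2 for n in range(1, 100_000))  # fact #7
harmonic = sum(1 / k for k in range(1, 1001))     # fact #9

print(geometric, 1 / (1 - r))            # both 2.0
print(weighted, r / (1 - r) ** 2)        # both 2.0
print(basel, math.pi**2 / 6)             # ~1.6449
print(harmonic, math.log(1000) + 0.577)  # ~7.485 vs ~7.485
```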

Distributions

5 facts
  11. Binomial$(n, p)$: mean $np$, variance $np(1-p)$
    "How many heads in 100 flips?" Mean = 50, SD (Standard Deviation) = 5. Anchors every sample-proportion sanity check.
    → Binomial
  12. Geometric$(p)$: mean $1/p$, variance $(1-p)/p^2$
    "Trials until first success" = $1/p$. The $1/p$ form is the most-used identity in interview rooms.
    → Geometric
  13. Exponential rate $\lambda$: mean $1/\lambda$, variance $1/\lambda^2$
    Memoryless waiting times; the minimum of $n$ exponentials is exponential with rate $\sum \lambda_i$.
    → Exponential
  14. Normal 68-95-99.7; $\Phi(1.96) = 0.975$
    Two-sided 95% interval; the basis for every CI (Confidence Interval) and z-test you'll ever defend in an interview.
    → Normal
  15. Poisson$(\lambda)$: mean $\lambda$, variance $\lambda$
    Mean = variance is the diagnostic. Rare-event counts (defects, arrivals) are all well approximated by it.
    → Poisson approx.
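The min-of-exponentials fact is the one most worth internalising; a sketch checking it alongside $\Phi(1.96)$ (the rates 1, 2, 3 are illustrative):

```python
import math
import random

# Phi(1.96) via the error function (fact #14)
phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
print(phi(1.96))  # ~0.975

# Min of independent exponentials is exponential with the summed rate (fact #13)
random.seed(2)
rates = [1.0, 2.0, 3.0]
mins = [min(random.expovariate(lam) for lam in rates) for _ in range(100_000)]
print(sum(mins) / len(mins), 1 / sum(rates))  # both ~0.167
```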

Combinatorics

4 facts
  16. Pascal's triangle through row 7: row 7 = $1, 7, 21, 35, 35, 21, 7, 1$
    Recognise $\binom{n}{k}$ on sight up to $n=7$. Cuts setup time on path-counting and binomial-PMF questions.
    → Binomial
  17. $n!$ for $n \leq 6$ ends at $720$; recognise $5040 = 7!$
    $1, 2, 6, 24, 120, 720, 5040$. Spot factorial growth in counts before doing any arithmetic.
  18. Stirling: $n! \approx \left(\dfrac{n}{e}\right)^n \sqrt{2\pi n}$
    Gives an order of magnitude for factorials too big to compute directly. $20! \approx 2.4 \times 10^{18}$.
  19. $2^n$ through $n=20$: $2^{10} = 1024$, $2^{20} \approx 10^6$
    Doubling intuition for $n$-bit search spaces, binary counters, "how many subsets" questions.
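A sketch confirming the memorised rows and the Stirling order-of-magnitude claim:

```python
import math

# Row 7 of Pascal's triangle (fact #16)
print([math.comb(7, k) for k in range(8)])  # [1, 7, 21, 35, 35, 21, 7, 1]

# Stirling vs exact for 20! (fact #18)
def stirling(n):
    return (n / math.e) ** n * math.sqrt(2 * math.pi * n)

print(f"{math.factorial(20):.3e}")  # 2.433e+18
print(f"{stirling(20):.3e}")        # within ~0.4% of exact

print(2**10, 2**20)  # 1024 1048576 (fact #19)
```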

Probability staples

5 facts
  20. Bayes: $P(A \mid B) = \dfrac{P(B \mid A)\,P(A)}{P(B)}$
    "Test is positive — what's $P(\text{disease})$?" Must be reflexive. Rare-event diagnostics, signal detection.
    → Bayes
  21. Birthday paradox: $\sim 50\%$ collision at 23 people
    Lets you eyeball collision probabilities without computing $\binom{n}{2}/365$ by hand. Hash collisions, too.
    → Complement
  22. Secretary problem: optimal stop at $n/e$, success $P \approx 1/e$
    "Best of $n$ candidates seen sequentially" — skip first $\sim 37\%$, then take first record. Online-decision template.
    → Optimal stopping
  23. Derangement count: $!n = \mathrm{round}(n!/e)$
    Number of permutations with zero fixed points. "Hat-check" problem — $P(\text{no match}) \to 1/e$ fast.
    → Inclusion-exclusion
  24. Gambler's ruin: $P(\text{M wins}) = \dfrac{1 - (q/p)^M}{1 - (q/p)^{M+N}}$
    Asymmetric random walk on $[0, M+N]$; fair-coin limit is $M/(M+N)$. Drawdown / bankruptcy template.
    → Random walk
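The gambler's-ruin formula is the easiest of these to get wrong under pressure; a sketch checking it against simulation (the stakes $M=3$, $N=7$, $p=0.55$ are illustrative):

```python
import random

def ruin_win_prob(M, N, p):
    """P(reach M+N before 0) starting at M, up-step probability p (fact #24)."""
    q = 1 - p
    if p == q:
        return M / (M + N)  # fair-coin limit
    r = q / p
    return (1 - r**M) / (1 - r ** (M + N))

def simulate(M, N, p, trials=50_000, seed=3):
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        x = M
        while 0 < x < M + N:
            x += 1 if rng.random() < p else -1
        wins += x == M + N
    return wins / trials

print(ruin_win_prob(3, 7, 0.5))                         # 0.3, the fair limit M/(M+N)
print(ruin_win_prob(3, 7, 0.55), simulate(3, 7, 0.55))  # both ~0.52
```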

Calculus shortcuts

6 facts
  25. Taylor: $e^x \approx 1 + x + \dfrac{x^2}{2}$ for small $x$
    Linearises exponentials when $|x| \ll 1$. Continuous-compounding rate effects, small-vol limits.
  26. $\ln(1+x) \approx x - \dfrac{x^2}{2}$ for small $x$
    Log-returns $\approx$ simple returns at small scales. Bridges the $\log$ and percentage worlds in finance.
  27. $(1+x)^n \approx 1 + nx$ for small $x$
    Bond-price duration, small-perturbation analysis, first-order Greeks. Always your first-order check.
  28. Integration by parts: $\displaystyle\int u\,dv = uv - \int v\,du$
    Closes most "$\int x \cdot f(x)\,dx$" integrals that show up in expectation-tail computations.
  29. Z-score: $z = \dfrac{X - \mu}{\sigma}$ standardises any $X$
    Translates raw values to standard-normal quantiles — the bridge from data to $\Phi(z)$ tables.
    → Normal
  30. Binomial $\to$ Poisson limit: $n \to \infty,\ p \to 0,\ np = \lambda$
    Replaces the awkward $\binom{n}{k} p^k (1-p)^{n-k}$ with $\lambda^k e^{-\lambda}/k!$. Rare-event collapse.
    → Poisson approx.
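The three expansions and the Poisson limit can be checked side by side (a sketch; $x = 0.05$ and $n = 1000$, $p = 0.003$ are illustrative):

```python
import math

x = 0.05
print(math.exp(x), 1 + x + x**2 / 2)  # fact #25: agree to ~2e-5
print(math.log(1 + x), x - x**2 / 2)  # fact #26: agree to ~4e-5
print((1 + x) ** 10, 1 + 10 * x)      # fact #27: first order only

# Fact #30: Binomial(n, p) -> Poisson(np) for large n, small p
n, p, k = 1000, 0.003, 5
lam = n * p
binom = math.comb(n, k) * p**k * (1 - p) ** (n - k)
poisson = lam**k * math.exp(-lam) / math.factorial(k)
print(binom, poisson)  # both ~0.101
```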