$E[X+Y] = E[X] + E[Y]$ holds even when $X$ and $Y$ are dependent. This one identity collapses most interview expectation problems.
Method · Expectation Linearity
Intro
Linearity of expectation says $E[X + Y] = E[X] + E[Y]$ for any two random variables — independent or not, correlated or not. That’s the most-leveraged identity in interview probability, because it lets you trade a complicated joint distribution for a sum of trivial marginals. The recipe: decompose the quantity you want into a sum of simple pieces (often 0/1 indicators), compute each piece’s expectation in isolation, add. Independence is never checked because it’s never needed.
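The "independent or not" claim is easy to check numerically. A minimal sketch (my own example, not from the tutorial): let $Y = 1 - X$, so $Y$ is perfectly anticorrelated with $X$, yet $E[X+Y] = E[X] + E[Y] = 0.5 + 0.5 = 1$ still holds.

```python
import random

# Sketch: E[X + Y] = E[X] + E[Y] even under total dependence.
# X is a fair 0/1 coin and Y = 1 - X, so Y is determined by X.
random.seed(0)
trials = 100_000
total = 0
for _ in range(trials):
    x = random.randint(0, 1)
    y = 1 - x              # fully dependent on x
    total += x + y         # x + y == 1 on every trial
print(total / trials)      # exactly 1.0 = E[X] + E[Y]
```

Here the sum is deterministic even though neither summand is, which is the point: linearity only looks at the marginals.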
Try first (productive failure)
Before the worked example: spend 60 seconds taking your best shot at this.
A guess is fine; being briefly wrong about a problem makes the explanation
land harder when you read it. This appears once per tutorial; skip
if you already know the trick.
Worked example
A permutation of $\{1, 2, 3, 4, 5\}$ is generated uniformly at random. A fixed point is an index $i$ such that the $i$-th element of the permutation equals $i$. What is the expected number of fixed points?
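Following the recipe from the intro: let $I_i$ indicate that index $i$ is a fixed point. Each $I_i$ has $E[I_i] = P(\text{perm}[i] = i) = 1/5$, so by linearity $E[\text{count}] = 5 \cdot \tfrac{1}{5} = 1$, even though the indicators are dependent. A Monte Carlo check (a sketch, not part of the original tutorial):

```python
import random

# Monte Carlo check of the worked example: expected number of
# fixed points of a uniform random permutation of {1,...,5}.
# By linearity, E = sum_i P(perm[i] == i) = 5 * (1/5) = 1.
random.seed(1)
trials = 200_000
count = 0
for _ in range(trials):
    perm = list(range(1, 6))
    random.shuffle(perm)
    count += sum(1 for i, v in enumerate(perm, start=1) if v == i)
print(count / trials)   # close to 1.0
```

Note that the answer is 1 for a permutation of any size $n$: $n$ indicators, each with expectation $1/n$.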
Practice 1 of 3
Type a fraction, decimal, or expression; mathjs parses it.
Reflection
Linearity works even when the indicators are dependent — in the HH-pairs problem, adjacent indicators share a coin flip and are clearly correlated. In your own words, why does that not break the argument? And what’s the cue in a problem statement that tells you to decompose into indicators rather than try to compute the full distribution of the count?
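To make the HH-pairs claim concrete: in $n$ fair flips, let $I_k$ indicate that flips $k$ and $k{+}1$ are both heads. $I_k$ and $I_{k+1}$ share flip $k{+}1$ and are correlated, yet $E[\text{count}] = (n-1) \cdot \tfrac{1}{4}$ by linearity. A quick simulation sketch (illustrative, with $n = 10$ chosen arbitrarily):

```python
import random

# HH-pairs count in n fair coin flips (1 = heads). Adjacent
# indicators overlap on a flip, but linearity still gives
# E[count] = (n - 1) / 4.
random.seed(2)
n, trials = 10, 200_000
count = 0
for _ in range(trials):
    flips = [random.randint(0, 1) for _ in range(n)]
    count += sum(1 for k in range(n - 1)
                 if flips[k] == flips[k + 1] == 1)
print(count / trials)   # close to (10 - 1) / 4 = 2.25
```

The dependence shows up in the variance of the count, not in its expectation; that asymmetry is exactly what the reflection question is probing.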