Maximum likelihood: write it, log it, differentiate, solve

Write the likelihood, take the log, differentiate, solve. The estimator that maximizes the data's plausibility is consistent and asymptotically efficient under standard regularity conditions.

Method · MLE
Prereqs: Bayes Rule
Intro

Maximum likelihood is the workhorse estimator: given an iid sample from a parametric family $f(x; \theta)$, the MLE is the parameter value that makes the observed data most probable. The recipe is always the same: write the likelihood, take logs, differentiate, set the derivative to zero, solve. The common families collapse to one-line estimators: Bernoulli gives the sample proportion, Poisson gives the total count over total exposure, Exponential gives the reciprocal of the sample mean. Recognize the family, write down the answer.
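
As a worked instance of the recipe, here is a minimal sketch of the Exponential case, assuming an iid sample $x_1, \dots, x_n$ with the rate parameterization $f(x; \lambda) = \lambda e^{-\lambda x}$; the other families follow the same four steps.

$$
L(\lambda) = \prod_{i=1}^{n} \lambda e^{-\lambda x_i} = \lambda^{n} e^{-\lambda \sum_{i} x_i},
\qquad
\ell(\lambda) = \log L(\lambda) = n \log \lambda - \lambda \sum_{i=1}^{n} x_i .
$$

$$
\frac{d\ell}{d\lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n} x_i = 0
\quad\Longrightarrow\quad
\hat{\lambda} = \frac{n}{\sum_{i=1}^{n} x_i} = \frac{1}{\bar{x}} ,
$$

the reciprocal of the sample mean, matching the closed form quoted above.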
