In statistics, the Gauss–Markov theorem, named after Carl Friedrich Gauss and Andrey Markov, states that in a linear regression model in which the errors have expectation zero, are uncorrelated, and have equal variances, the best linear unbiased estimator (BLUE) of the coefficients is the ordinary least squares (OLS) estimator. Here "best" means having the lowest variance among all linear unbiased estimators. The errors need not be normal, nor independent and identically distributed; they need only be uncorrelated, with mean zero and homoscedastic with finite variance. The requirement that the estimator be unbiased cannot be dropped, since biased estimators with lower variance exist; see, for example, the James–Stein estimator (which also drops linearity) or ridge regression.
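
Stated compactly, under standard notation (the symbols below are conventions of this sketch, not drawn from the passage above): the model is

\[ y = X\beta + \varepsilon, \qquad \operatorname{E}[\varepsilon] = 0, \qquad \operatorname{Var}(\varepsilon) = \sigma^2 I_n, \]

where \(X\) is an \(n \times k\) design matrix of full column rank. The OLS estimator is

\[ \hat{\beta} = (X^\top X)^{-1} X^\top y, \]

and the theorem asserts that for any other linear unbiased estimator \(\tilde{\beta} = Cy\), the matrix \(\operatorname{Var}(\tilde{\beta}) - \operatorname{Var}(\hat{\beta})\) is positive semidefinite.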
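
As a brief illustration of the last point (standard material on ridge regression, given here as a sketch in the notation above): the ridge estimator

\[ \hat{\beta}_\lambda = (X^\top X + \lambda I)^{-1} X^\top y, \qquad \lambda > 0, \]

is biased for \(\beta\) whenever \(\lambda > 0\), yet its covariance matrix \(\sigma^2 (X^\top X + \lambda I)^{-1} X^\top X \, (X^\top X + \lambda I)^{-1}\) is dominated by the OLS covariance \(\sigma^2 (X^\top X)^{-1}\), so giving up unbiasedness admits linear estimators with strictly lower variance.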