# Explained sum of squares


In statistics, the explained sum of squares (ESS) is the sum of the squared deviations of the predicted values from the mean of the response variable in a standard regression model (for example $y_{i}=a+bx_{i}+\epsilon_{i}$), where $y_{i}$ is the response variable, $x_{i}$ is the explanatory variable, $a$ and $b$ are coefficients, $i$ indexes the observations from $1$ to $n$, and $\epsilon_{i}$ is the error term.

If $\hat{a}$ and $\hat{b}$ are the estimated coefficients, then

$\hat{y}_{i}=\hat{a}+\hat{b}x_{i}$

is the $i$-th predicted (fitted) value. The ESS is the sum of the squares of the differences between the predicted values and the grand mean $\bar{y}$ of the observed responses:

$\text{ESS}=\sum_{i=1}^{n}\left(\hat{y}_{i}-\bar{y}\right)^2$

For ordinary least squares regression with an intercept term, the total sum of squares decomposes as: total sum of squares = explained sum of squares + residual sum of squares. The decomposition holds because the cross term $\sum_{i=1}^{n}\left(\hat{y}_{i}-\bar{y}\right)\left(y_{i}-\hat{y}_{i}\right)$ vanishes under those conditions.
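
The following is a minimal sketch, assuming NumPy and synthetic data (the variable names and the seeded example are illustrative, not part of the original article), of fitting a simple least-squares line, computing the ESS, and checking the decomposition numerically:

```python
import numpy as np

# Hypothetical synthetic data: x is the explanatory variable, y the response.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(scale=1.0, size=x.size)

# Ordinary least squares estimates of the coefficients a and b:
# b_hat = cov(x, y) / var(x), a_hat chosen so the line passes through the means.
b_hat = np.cov(x, y, bias=True)[0, 1] / np.var(x)
a_hat = y.mean() - b_hat * x.mean()

y_hat = a_hat + b_hat * x  # predicted values
y_bar = y.mean()           # grand mean of the observed responses

ess = np.sum((y_hat - y_bar) ** 2)  # explained sum of squares
rss = np.sum((y - y_hat) ** 2)      # residual sum of squares
tss = np.sum((y - y_bar) ** 2)      # total sum of squares

# For OLS with an intercept, TSS = ESS + RSS (up to floating-point rounding).
assert np.isclose(tss, ess + rss)
print(f"TSS = {tss:.3f}, ESS + RSS = {ess + rss:.3f}")
```

The assertion passes because the fitted residuals of an intercept-containing OLS model are orthogonal to the fitted values, which is exactly the vanishing cross term noted above.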