# Explained sum of squares



In statistics, the explained sum of squares (ESS) is the sum of the squared deviations of the predicted values from the mean of the response variable in a standard regression model (for example $y_{i}=a+bx_{i}+\epsilon_{i}$), where $y_{i}$ is the response variable, $x_{i}$ is the explanatory variable, $a$ and $b$ are coefficients, $i$ indexes the observations from $1$ to $n$, and $\epsilon_{i}$ is the error term.

If $\hat{a}$ and $\hat{b}$ are the estimated coefficients, then

$\hat{y}_{i}=\hat{a}+\hat{b}x_{i}$

is the predicted value of the response. The ESS is the sum of the squares of the differences between the predicted values and the mean of the observed response values:

$\sum_{i=1}^{n}\left(\hat{y}_{i}-\bar{y}\right)^2$
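As a concrete sketch, the estimates $\hat{a}$ and $\hat{b}$ for a simple regression can be computed from their closed-form least-squares formulas, and the ESS evaluated directly from the definition above. The data here are hypothetical, chosen only for illustration:

```python
import numpy as np

# Hypothetical data: x is the explanatory variable, y the response.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form OLS estimates for the model y_i = a + b*x_i + e_i.
b_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a_hat = y.mean() - b_hat * x.mean()

# Predicted values and the explained sum of squares.
y_hat = a_hat + b_hat * x
ess = np.sum((y_hat - y.mean()) ** 2)
print(a_hat, b_hat, ess)
```

Note that $\bar{y}$ appears here as `y.mean()`: for an OLS fit with an intercept, the mean of the fitted values equals the mean of the observed values.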

For ordinary least squares regression with an intercept term, the total sum of squares decomposes as: total sum of squares = explained sum of squares + residual sum of squares.
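The decomposition can be checked numerically. The sketch below (with hypothetical data) fits the regression with numpy's least-squares routine and confirms that TSS equals ESS plus RSS up to floating-point error:

```python
import numpy as np

# Hypothetical data for illustration.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# OLS fit with an intercept column, via numpy's least-squares solver.
X = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ coef

tss = np.sum((y - y.mean()) ** 2)      # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
rss = np.sum((y - y_hat) ** 2)         # residual sum of squares

# For OLS with an intercept, TSS = ESS + RSS holds exactly
# (up to floating-point rounding).
print(tss, ess + rss)
```

The identity relies on the residuals being orthogonal to the fitted values, which holds for OLS with an intercept; for other estimators the cross term need not vanish.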

 This page uses Creative Commons Licensed content from Wikipedia (view authors).