The sum of squared errors is a measure of the variance of the measured data about the true mean of the data. The sum of the errors themselves averages to zero, since errors are equally likely to be positive or negative.
The residual sum of squares is a measure of the amount of error remaining between the regression function and the data set. A smaller residual sum of squares indicates a regression function that explains more of the variation in the data.
In statistics, the explained sum of squares (ESS), alternatively known as the model sum of squares or sum of squares due to regression ("SSR" – not to be confused with the residual sum of squares RSS), is a quantity used in describing how well a model, often a regression model, represents the data being modelled.
In particular, the explained sum of squares measures the variation in the modelled values. It is compared with the total sum of squares, which measures the variation in the observed data, and with the residual sum of squares, which measures the variation in the modelling errors.
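The three quantities above can be illustrated with a small numerical sketch. The data values below are invented for illustration; the fit uses ordinary least squares via NumPy. For a linear model with an intercept, the total sum of squares decomposes as TSS = ESS + RSS.

```python
import numpy as np

# Toy data: a noisy linear relationship (illustrative values only).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Ordinary least-squares fit y ≈ a + b*x; polyfit returns [slope, intercept].
b, a = np.polyfit(x, y, 1)
y_hat = a + b * x

tss = np.sum((y - y.mean()) ** 2)      # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares (model)
rss = np.sum((y - y_hat) ** 2)         # residual sum of squares (error)

print(f"TSS = {tss:.4f}, ESS = {ess:.4f}, RSS = {rss:.4f}")
```

The ratio ESS/TSS is the familiar coefficient of determination R², so a regression that leaves a small RSS necessarily has an R² close to 1.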