
Model Evaluation Measures (Explained Variance Score, MAE, MSE)

In this lesson we will look at different evaluation measures for Regression Models.

Regression Models Evaluation Metrics

Once we have built a model on the training dataset, it is time to evaluate it on the test dataset to check how well it performs. As the short sketch after this list illustrates, evaluation also helps us determine:

  • Whether the model is overfitting
  • Whether the model is underfitting
  • Whether we need to revise our Feature Engineering or Feature Selection techniques
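As a rough illustration of the train-versus-test comparison described above, here is a minimal sketch. It assumes scikit-learn is installed; the synthetic dataset and the choice of model are purely illustrative, not part of this lesson's material.

```python
# Minimal sketch: compare training vs. test error to spot over/underfitting.
# The dataset and model below are illustrative assumptions.
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_squared_error

# Synthetic regression data (illustrative only)
X, y = make_regression(n_samples=500, n_features=10, noise=15.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# A deep, unconstrained tree tends to memorize the training data
model = DecisionTreeRegressor(random_state=42)
model.fit(X_train, y_train)

train_mse = mean_squared_error(y_train, model.predict(X_train))
test_mse = mean_squared_error(y_test, model.predict(X_test))

print(f"Training MSE: {train_mse:.2f}")
print(f"Test MSE:     {test_mse:.2f}")
# A test error much larger than the training error suggests overfitting;
# high error on both sets suggests underfitting.
```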

We use the following measures to assess the performance of a Regression Model.

Explained Variance Score

Explained Variance is one of the key measures for evaluating Regression Models. In statistics, explained variation measures the proportion of the variation (dispersion) in a given data set that a regression model accounts for.

Formula

If $\hat{y}$ is the predicted real-valued target output, $y$ is the corresponding (correct) real-valued target output, and $Var$ denotes variance, then the explained variance is estimated as follows:

$$\mathrm{explained\_variance}(\hat{y}, y) = 1 - \frac{Var(y - \hat{y})}{Var(y)}$$
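As a minimal sketch, the formula above can be applied directly with NumPy and cross-checked against scikit-learn's explained_variance_score. The sample values below are made up for illustration.

```python
# Minimal sketch: compute explained variance from the formula and
# cross-check with scikit-learn. The sample values are illustrative.
import numpy as np
from sklearn.metrics import explained_variance_score

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

# Direct application of the formula: 1 - Var(y - y_hat) / Var(y)
manual = 1 - np.var(y_true - y_pred) / np.var(y_true)

print(manual)                                    # ~0.9572
print(explained_variance_score(y_true, y_pred))  # ~0.9572
```

A score of 1.0 means the model explains all of the variance in the targets; values near 0 (or negative) indicate the predictions explain little of it.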
