When you compare test results for two or more regression models, each model has a color associated with it; this color is used to display results for that model. For example, if model M1 has purple associated with it, the bar graphs on the Performance tab for M1 are displayed in purple.
By default, test results for all models in the node are compared. If you do not want to compare all test results, click the edit icon. The Edit Test Results Selection dialog opens. Deselect the results that you do not want to see, and click OK when done.
Compare Test Results opens in a new tab. Results are displayed in two tabs:
Performance tab
The following metrics are compared on the Performance tab:
Predictive Confidence, as described in Predictive Confidence for Classification Models
Mean Absolute Error, as described in Regression Statistics
Mean Predicted Value
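Mean Absolute Error and Mean Predicted Value are computed from the test set's actual and predicted target values. The following is a minimal sketch of those two calculations in Python; the arrays and values are hypothetical illustrations, not output from the product:

```python
# Minimal sketch: two of the metrics compared on the Performance tab,
# computed from a held-out test set. The data below is hypothetical.

actuals = [3.2, 4.8, 5.1, 2.9, 6.0]      # observed target values
predictions = [3.0, 5.0, 4.7, 3.1, 5.5]  # model's predicted values

# Mean Absolute Error: average magnitude of the prediction errors.
mae = sum(abs(a - p) for a, p in zip(actuals, predictions)) / len(actuals)

# Mean Predicted Value: average of the model's predictions.
mean_predicted = sum(predictions) / len(predictions)

print(f"Mean Absolute Error:  {mae:.3f}")   # 0.300 for this data
print(f"Mean Predicted Value: {mean_predicted:.3f}")  # 4.260
```

When comparing models, a lower Mean Absolute Error indicates predictions that are, on average, closer to the actual values; Mean Predicted Value shows whether a model tends to predict higher or lower overall.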
By default, test results for all models are compared. To edit the list of models, click the edit icon above the pane that lists the models to launch the Edit Test Selection (Classification and Regression) dialog.
Residual tab
Displays the residual plot for each of the models. You can compare two plots side by side.
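A residual is the difference between an actual target value and the model's prediction for it; a residual plot charts these differences against the predicted values. A minimal sketch of the underlying computation, again with hypothetical data:

```python
# Minimal sketch: the residuals shown on a residual plot are the
# differences between actual and predicted target values.
# The data below is hypothetical.

actuals = [3.2, 4.8, 5.1, 2.9, 6.0]
predictions = [3.0, 5.0, 4.7, 3.1, 5.5]

residuals = [a - p for a, p in zip(actuals, predictions)]

# On the plot, each point is (predicted value, residual). Residuals
# scattered evenly around zero suggest an unbiased model; a visible
# pattern suggests systematic error.
for p, r in zip(predictions, residuals):
    print(f"predicted={p:.1f}  residual={r:+.2f}")
```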
By default, test results for all models are compared. To edit the list of models, click the edit icon above the pane that lists the models to launch the Edit Test Selection (Classification and Regression) dialog.