Performance
The Performance tab shows test results for several common test metrics.
The Measure list allows you to select the measures to display; the default is to display all measures. Two Sort By lists specify the sort attribute and the sort order. The first list specifies the sort attribute: Measure, Creation Date, or Name (the default). The second list specifies the sort order: Ascending or Descending (the default).
The Performance tab displays the following:
All Measures (the default), which shows all of the measures
Predictive Confidence
Overall Accuracy
Average Accuracy
Cost, if you specified costs or if the system calculated costs
For brief descriptions of these measures, see Test Metrics for Classification Models.
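As a rough illustration of how two of these measures differ, the following Python sketch computes overall accuracy and average (per-class) accuracy from a confusion matrix using their standard definitions. It is not Oracle Data Miner code, and the confusion matrix values are hypothetical.

    # Standard definitions only; not the exact calculation performed by Data Miner.
    import numpy as np

    def overall_accuracy(cm):
        # Correct predictions (the diagonal) divided by all predictions.
        cm = np.asarray(cm, dtype=float)
        return np.trace(cm) / cm.sum()

    def average_accuracy(cm):
        # Mean of the per-class accuracies; rows are actual classes.
        cm = np.asarray(cm, dtype=float)
        per_class = np.diag(cm) / cm.sum(axis=1)
        return per_class.mean()

    # Hypothetical confusion matrix: rows = actual class, columns = predicted class.
    cm = [[90, 10],   # actual class 0: 90 correct, 10 misclassified
          [30, 20]]   # actual class 1: 20 correct, 30 misclassified
    print(overall_accuracy(cm))  # (90 + 20) / 150 = 0.733...
    print(average_accuracy(cm))  # mean(0.90, 0.40) = 0.65

A model can have high overall accuracy but low average accuracy when one class dominates the data, which is why both measures are reported.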
The selected measures are displayed as graphs. If you are comparing test results for two or more models, the graphs for each model are displayed in a different color.
Below the graphs, the Models table supplements the information presented in the graphs. You can minimize the table by dragging the splitter line.
The Models table summarizes the data in the histograms:
Name, the name of the model, along with the color that represents the model in the graphs
Predictive Confidence percent
Overall Accuracy percent
Average Accuracy percent
Cost, if you specified costs (costs are calculated by Data Miner for Decision Tree); a sketch of such a cost calculation follows this list
Algorithm (used to build the model)
Creation Date
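The following sketch illustrates, in the same spirit as the earlier example, how a single cost figure can be derived by weighting each cell of the confusion matrix with a user-specified cost matrix. The matrices and weights are hypothetical, and the exact calculation performed by Data Miner may differ.

    # Illustration only: weight each confusion-matrix cell by its assigned cost.
    import numpy as np

    def total_cost(cm, costs):
        # Sum over all cells of (count of outcome * cost assigned to that outcome).
        cm = np.asarray(cm, dtype=float)
        costs = np.asarray(costs, dtype=float)
        return float((cm * costs).sum())

    cm    = [[90, 10],    # rows = actual class, columns = predicted class
             [30, 20]]
    costs = [[0, 1],      # correct predictions cost 0; a missed class-1 case
             [5, 0]]      # (actual 1, predicted 0) costs 5 times a false alarm
    print(total_cost(cm, costs))  # 10*1 + 30*5 = 160.0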
By default, results for the selected models are displayed. To change the list of models, click the icon and deselect any models for which you do not want to see results. If you deselect a model, both its histogram and its summary information are removed.