Predictive confidence provides an estimate of the overall goodness of the model.
Predictive confidence is a number between 0 and 1. Data Miner displays predictive confidence as a percentage; for example, a predictive confidence of 59 means a predictive confidence of 59% (0.59).
Predictive confidence indicates how much better the predictions made by the tested model are than predictions made by a naive model. The naive model always predicts the mean for numerical targets and the mode for categorical targets.
The following formula defines Predictive Confidence:
Predictive Confidence = MAX((1-((error of model)/(error of naive model))), 0)
If predictive confidence is 0, the model's predictions are no better than predictions made using the naive model. If predictive confidence is 1, the predictions are perfect. If predictive confidence is 0.5, the model has reduced the error of the naive model by 50%.
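The formula above can be sketched in a few lines of Python. This is an illustrative example, not Data Miner's implementation: the function name, the sample data, and the choice of mean absolute error as the error measure are all assumptions made here for demonstration.

```python
def predictive_confidence(model_error, naive_error):
    """Predictive Confidence = MAX(1 - (model error / naive error), 0)."""
    return max(1 - (model_error / naive_error), 0)

# Hypothetical sample data for a numerical target.
actuals = [10.0, 12.0, 9.0, 11.0, 13.0]
predictions = [10.5, 11.5, 9.5, 11.0, 12.5]

# The naive model always predicts the mean of a numerical target.
naive_prediction = sum(actuals) / len(actuals)  # mean = 11.0

# Mean absolute error (an assumed error measure) for each model.
model_error = sum(abs(a - p) for a, p in zip(actuals, predictions)) / len(actuals)
naive_error = sum(abs(a - naive_prediction) for a in actuals) / len(actuals)

print(predictive_confidence(model_error, naive_error))  # about 0.67
```

Note that the MAX with 0 clamps the result: a model that performs worse than the naive model gets a predictive confidence of 0 rather than a negative value.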