Confusion matrix for CART® Classification

Find definitions and interpretations for every statistic in the Confusion matrix.
The Confusion matrix shows how well the tree separates the classes, using the following metrics (illustrated in the sketch after this list):
  • True positive rate (TPR) — the probability that an event case is predicted correctly
  • False positive rate (FPR) — the probability that a nonevent case is predicted incorrectly
  • False negative rate (FNR) — the probability that an event case is predicted incorrectly
  • True negative rate (TNR) — the probability that a nonevent case is predicted correctly
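As a concrete illustration of these four definitions (a minimal sketch, not Minitab output), the Python lines below compute each rate from the four cell counts of a two-class confusion matrix. The variable names TP, FN, FP, and TN are our own; the counts are taken from the training half of the example table below.

    # Cell counts of a two-class confusion matrix (training example below):
    # TP = events predicted correctly,   FN = events predicted incorrectly,
    # FP = nonevents predicted incorrectly, TN = nonevents predicted correctly.
    TP, FN = 117, 22
    FP, TN = 22, 142

    TPR = TP / (TP + FN)    # true positive rate (sensitivity)
    FPR = FP / (FP + TN)    # false positive rate (type I error)
    FNR = FN / (TP + FN)    # false negative rate (type II error)
    TNR = TN / (FP + TN)    # true negative rate (specificity)

    print(f"TPR={TPR:.1%}  FPR={FPR:.1%}  FNR={FNR:.1%}  TNR={TNR:.1%}")
    # TPR=84.2%  FPR=13.4%  FNR=15.8%  TNR=86.6%

Note that TPR and FNR sum to 1, as do FPR and TNR, because each pair partitions the same actual class.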

Interpretation

Confusion Matrix

                      Predicted Class (Training)   Predicted Class (Test)
Actual Class    Count    Yes     No   % Correct    Yes     No   % Correct
Yes (Event)       139    117     22        84.2    105     34        75.5
No                164     22    142        86.6     24    140        85.4
All               303    139    164        85.5    129    174        80.9
Statistics                                   Training (%)   Test (%)
True positive rate (sensitivity or power)            84.2       75.5
False positive rate (type I error)                   13.4       14.6
False negative rate (type II error)                  15.8       24.5
True negative rate (specificity)                     86.6       85.4

In this example, the total number of actual Yes (event) cases is 139, and the total number of actual No (nonevent) cases is 164.
  • In the training data, 117 of the 139 Yes cases are predicted correctly, so the % Correct is 84.2%.
  • In the training data, 142 of the 164 No cases are predicted correctly, so the % Correct is 86.6%.
  • In the test data, 105 of the 139 Yes cases are predicted correctly, so the % Correct is 75.5%.
  • In the test data, 140 of the 164 No cases are predicted correctly, so the % Correct is 85.4%.
Overall, the % Correct is 85.5% for the Training data and 80.9% for the Test data.
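To make the arithmetic behind these percentages explicit, here is a short Python sketch (ours, not Minitab's) that recomputes every % Correct value from the counts in the confusion matrix above.

    # (correctly predicted, total actual) for each class and data split,
    # taken from the confusion matrix above.
    splits = {
        "Training": {"Yes": (117, 139), "No": (142, 164)},
        "Test": {"Yes": (105, 139), "No": (140, 164)},
    }

    for split, classes in splits.items():
        correct = total = 0
        for label, (right, n) in classes.items():
            correct += right
            total += n
            print(f"{split} {label}: {right}/{n} = {right / n:.1%}")
        # Overall % Correct pools both classes, e.g. (117 + 142) / 303 = 85.5%.
        print(f"{split} overall: {correct}/{total} = {correct / total:.1%}")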
The rates in the Statistics table summarize the same counts:
  • True positive rate (TPR) — 84.2% for the Training data and 75.5% for the Test data.
  • False positive rate (FPR) — 13.4% for the Training data and 14.6% for the Test data.
  • False negative rate (FNR) — 15.8% for the Training data and 24.5% for the Test data.
  • True negative rate (TNR) — 86.6% for the Training data and 85.4% for the Test data.
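As a check, the four rates for both data splits can be recomputed directly from the matrix cell counts. A minimal Python sketch, again illustrative rather than Minitab code:

    # Cell counts from the confusion matrix above, per data split.
    counts = {
        "Training": {"TP": 117, "FN": 22, "FP": 22, "TN": 142},
        "Test": {"TP": 105, "FN": 34, "FP": 24, "TN": 140},
    }

    for split, c in counts.items():
        tpr = c["TP"] / (c["TP"] + c["FN"])    # sensitivity
        fpr = c["FP"] / (c["FP"] + c["TN"])    # type I error
        fnr = c["FN"] / (c["TP"] + c["FN"])    # type II error
        tnr = c["TN"] / (c["FP"] + c["TN"])    # specificity
        print(f"{split}: TPR={tpr:.1%}  FPR={fpr:.1%}  "
              f"FNR={fnr:.1%}  TNR={tnr:.1%}")

The printed values match the Statistics table for both the Training and Test data.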