Find definitions and interpretations for every statistic in the
Confusion matrix.

The Confusion matrix shows how well the tree correctly separates the classes, using the following metrics:

- True positive rate (TPR) — the probability that an event case is predicted correctly
- False positive rate (FPR) — the probability that a nonevent case is predicted incorrectly
- False negative rate (FNR) — the probability that an event case is predicted incorrectly
- True negative rate (TNR) — the probability that a nonevent case is predicted correctly
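Each of these rates can be computed directly from the four cell counts of a two-by-two confusion matrix. A minimal sketch in Python (the function name and argument order here are illustrative, not from any particular library):

```python
def confusion_rates(tp, fn, fp, tn):
    """Derive the four rates from raw confusion-matrix cell counts.

    tp, fn -- event cases predicted correctly / incorrectly
    fp, tn -- nonevent cases predicted incorrectly / correctly
    """
    events = tp + fn       # total actual event cases
    nonevents = fp + tn    # total actual nonevent cases
    return {
        "TPR": tp / events,     # sensitivity, or power
        "FPR": fp / nonevents,  # type I error
        "FNR": fn / events,     # type II error
        "TNR": tn / nonevents,  # specificity
    }
```

Note that TPR + FNR = 1 and FPR + TNR = 1, because each pair partitions the same set of actual cases.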

Optimal Tree: 7 terminal nodes, 6 internal nodes
Max Tree: 21 terminal nodes, 20 internal nodes
Confusion Matrix

| Actual Class | Count | Training: Yes | Training: No | Training: %Correct | Test: Yes | Test: No | Test: %Correct |
|--------------|-------|---------------|--------------|--------------------|-----------|----------|----------------|
| Yes (Event)  | 139   | 117           | 22           | 84.2               | 105       | 34       | 75.5           |
| No           | 164   | 22            | 142          | 86.6               | 24        | 140      | 85.4           |
| All          | 303   | 139           | 164          | 85.5               | 129       | 174      | 80.9           |

| Statistics | Training (%) | Test (%) |
|------------|--------------|----------|
| True positive rate (sensitivity or power) | 84.2 | 75.5 |
| False positive rate (type I error)        | 13.4 | 14.6 |
| False negative rate (type II error)       | 15.8 | 24.5 |
| True negative rate (specificity)          | 86.6 | 85.4 |
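The rates in the statistics table follow directly from the cell counts in the confusion matrix. A quick check in Python, reading the counts from the table above:

```python
# (tp, fn, fp, tn) cell counts read from the confusion matrix above
train = (117, 22, 22, 142)
test = (105, 34, 24, 140)

for label, (tp, fn, fp, tn) in [("Training", train), ("Test", test)]:
    tpr = 100 * tp / (tp + fn)  # true positive rate: correct Yes / actual Yes
    fpr = 100 * fp / (fp + tn)  # false positive rate: wrong No / actual No
    fnr = 100 * fn / (tp + fn)  # false negative rate: wrong Yes / actual Yes
    tnr = 100 * tn / (fp + tn)  # true negative rate: correct No / actual No
    print(label, round(tpr, 1), round(fpr, 1), round(fnr, 1), round(tnr, 1))
```

This reproduces all eight percentages in the statistics table (for example, the test TPR is 100 × 105 / 139 = 75.5).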

In this example, there are 139 actual Yes (event) cases and 164 actual No cases.

- In the training data, 117 of the 139 Yes cases are predicted correctly, which is 84.2%.
- In the training data, 142 of the 164 No cases are predicted correctly, which is 86.6%.
- In the test data, 105 of the 139 Yes cases are predicted correctly, which is 75.5%.
- In the test data, 140 of the 164 No cases are predicted correctly, which is 85.4%.

- True positive rate (TPR) — 84.2% for the Training data and 75.5% for the Test data.
- False positive rate (FPR) — 13.4% for the Training data and 14.6% for the Test data.
- False negative rate (FNR) — 15.8% for the Training data and 24.5% for the Test data.
- True negative rate (TNR) — 86.6% for the Training data and 85.4% for the Test data.