Find definitions and interpretations for every statistic in the
Confusion matrix.
The Confusion matrix shows how well the tree correctly separates the classes,
using these metrics:
True positive rate (TPR) — the probability that an event case is predicted correctly
False positive rate (FPR) — the probability that a nonevent case is predicted incorrectly
False negative rate (FNR) — the probability that an event case is predicted incorrectly
True negative rate (TNR) — the probability that a nonevent case is predicted correctly
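As a minimal sketch, the four rates above can be computed directly from the counts in a two-by-two confusion matrix. The function and variable names (tp, fn, fp, tn) are illustrative, not part of any particular software's API:

```python
def confusion_rates(tp, fn, fp, tn):
    """Compute TPR, FNR, FPR, TNR from confusion-matrix counts.

    tp and fn count actual event cases; fp and tn count actual
    nonevent cases.
    """
    events = tp + fn        # total actual event cases
    nonevents = fp + tn     # total actual nonevent cases
    return {
        "TPR": tp / events,     # event case predicted correctly
        "FNR": fn / events,     # event case predicted incorrectly
        "FPR": fp / nonevents,  # nonevent case predicted incorrectly
        "TNR": tn / nonevents,  # nonevent case predicted correctly
    }

# Hypothetical counts: 100 event cases, 100 nonevent cases
rates = confusion_rates(tp=80, fn=20, fp=10, tn=90)
print(rates)  # TPR=0.8, FNR=0.2, FPR=0.1, TNR=0.9
```

Note that TPR + FNR = 1 for the event class and TNR + FPR = 1 for the nonevent class, so each pair summarizes the same row of the matrix from opposite directions.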
Interpretation
A low value for % Correct usually indicates a poorly fitted model,
which can have several causes. If % Correct is very low,
consider whether class weights may help. Class weights can produce a more
accurate model when observations from one class carry more weight than
observations from another class. You can also change the probability
threshold that is required for a case to be classified as the event.
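For illustration, changing the probability required to classify a case as the event can be sketched as a threshold on predicted event probabilities. The probabilities, labels, and default threshold of 0.5 here are assumptions for the example, not values taken from any specific tool:

```python
def classify(event_probs, threshold=0.5):
    """Label a case as the event when its predicted event
    probability meets or exceeds the threshold."""
    return ["event" if p >= threshold else "nonevent" for p in event_probs]

# Hypothetical predicted event probabilities for four cases
probs = [0.15, 0.40, 0.55, 0.90]

default_labels = classify(probs)                 # threshold 0.5
lowered_labels = classify(probs, threshold=0.3)  # lower threshold
print(default_labels)
print(lowered_labels)
```

Lowering the threshold classifies more cases as the event, which tends to raise the true positive rate at the cost of a higher false positive rate; raising it has the opposite effect.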