Confusion Matrix

                         Predicted Class (Training)         Predicted Class (Test)
Actual Class    Count     Yes     No    % Correct            Yes     No    % Correct
Yes (Event)       139     124     15        89.21            110     29        79.14
No                164       8    156        95.12             24    140        85.37
All               303     132    171        92.41            134    169        82.51
Assign a row to the event class if the event probability for the row exceeds 0.5.
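This rule can be expressed directly in code. The following Python sketch uses made-up probabilities, not values from this analysis, and assigns each row to Yes or No by comparing its event probability to 0.5.

```python
import numpy as np

# Hypothetical event probabilities for five rows (illustration only).
event_prob = np.array([0.91, 0.48, 0.73, 0.05, 0.62])

# Assign a row to the event class (Yes) when its event probability exceeds 0.5.
predicted_class = np.where(event_prob > 0.5, "Yes", "No")
print(predicted_class)  # ['Yes' 'No' 'Yes' 'No' 'Yes']
```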
Statistics                                   Training (%)    Test (%)
True positive rate (sensitivity or power)           89.21       79.14
False positive rate (type I error)                    4.88      14.63
False negative rate (type II error)                  10.79      20.86
True negative rate (specificity)                     95.12      85.37
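As a check, these rates can be recomputed from the confusion matrix counts. The following Python sketch uses the training counts from the table above; the variable names are my own.

```python
# Training half of the confusion matrix (counts from the table above).
tp, fn = 124, 15    # actual Yes rows predicted Yes / predicted No
fp, tn = 8, 156     # actual No rows predicted Yes / predicted No

true_positive_rate = tp / (tp + fn)   # sensitivity (power)   -> 89.21%
false_positive_rate = fp / (fp + tn)  # type I error rate     ->  4.88%
false_negative_rate = fn / (tp + fn)  # type II error rate    -> 10.79%
true_negative_rate = tn / (fp + tn)   # specificity           -> 95.12%

for name, value in [
    ("True positive rate", true_positive_rate),
    ("False positive rate", false_positive_rate),
    ("False negative rate", false_negative_rate),
    ("True negative rate", true_negative_rate),
]:
    print(f"{name}: {value:.2%}")
```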
In this example, the total number of actual Yes events is 139, and the total number of actual No events (nonevents) is 164.
In the training data, 124 of the events are correctly predicted as Yes, which is 89.21% correct.
In the training data, 156 of the nonevents are correctly predicted as No, which is 95.12% correct.
In the test data, 110 of the events are correctly predicted as Yes, which is 79.14% correct.
In the test data, 140 of the nonevents are correctly predicted as No, which is 85.37% correct.
Overall, the % Correct is 92.41% for the training data and 82.51% for the test data. Use the results for the test data to evaluate how accurately the model predicts new observations.
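The overall % Correct values follow from the same table: divide the correctly classified rows (the diagonal of each confusion matrix) by the total number of rows. A quick Python check:

```python
# Overall % Correct: correctly classified rows divided by all 303 rows.
train_correct = (124 + 156) / 303   # -> 92.41%
test_correct = (110 + 140) / 303    # -> 82.51%
print(f"Training: {train_correct:.2%}, Test: {test_correct:.2%}")
```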
A low value for % Correct usually indicates a deficient fitted model, which can have several different causes. If the % Correct is very low, consider whether class weights may help. Class weights can produce a more accurate model when observations from one class carry more weight than observations from another class. You can also change the probability threshold that is required for a row to be classified as the event.
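As a rough illustration of both remedies, the sketch below uses scikit-learn (not the software that produced the output above) with a synthetic data set: it assigns a larger class weight to the event class when fitting, then classifies rows with a cutoff other than 0.5. The data, weights, and cutoff are assumptions for demonstration only.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic two-class data standing in for the 303 rows in this example.
X, y = make_classification(n_samples=303, weights=[0.54, 0.46], random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Weigh the event class (1) more heavily than the nonevent class (0).
model = LogisticRegression(class_weight={0: 1.0, 1: 2.0}, max_iter=1000)
model.fit(X_train, y_train)

# Classify a row as the event when its event probability exceeds a chosen cutoff.
cutoff = 0.4  # example value; a cutoff below 0.5 classifies more rows as events
event_prob = model.predict_proba(X_test)[:, 1]
predicted = (event_prob > cutoff).astype(int)
```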