Complete the following steps to interpret an ordinal logistic regression model. Key output includes the p-value, the coefficients, the log-likelihood, and the measures of association.

To determine whether the association between the response and each term in the model is statistically significant, compare the p-value for the term to your significance level to assess the null hypothesis. The null hypothesis is that there is no association between the term and the response. Usually, a significance level (denoted as α or alpha) of 0.05 works well. A significance level of 0.05 indicates a 5% risk of concluding that an association exists when there is no actual association.

- P-value ≤ α: The association is statistically significant. If the p-value is less than or equal to the significance level, you can conclude that there is a statistically significant association between the response variable and the term.
- P-value > α: The association is not statistically significant. If the p-value is greater than the significance level, you cannot conclude that there is a statistically significant association between the response variable and the term. You may want to refit the model without the term.
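As a minimal sketch, this decision rule can be expressed in a few lines of Python, using the term p-values from the logistic regression table in this example as inputs:

```python
# Compare each term's p-value to the significance level (alpha).
alpha = 0.05

# P-values for the predictor terms, taken from the output in this example.
p_values = {
    "Distance": 0.017,
    "Distance*Distance": 0.021,
}

for term, p in p_values.items():
    if p <= alpha:
        print(f"{term}: p = {p} <= {alpha}, statistically significant")
    else:
        print(f"{term}: p = {p} > {alpha}, not significant; consider refitting without it")
```

Here both terms have p-values below 0.05, so both associations are statistically significant at the 0.05 level.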

For a categorical factor with more than 2 levels, the hypothesis for the coefficient is about whether that level of the factor is different from the reference level for the factor. To assess the statistical significance of the factor, use the test for terms with more than 1 degree of freedom. For more information on how to display this test, go to Select the results to display for Ordinal Logistic Regression.

To determine how well the model fits the data, examine the log-likelihood and the measures of association. Larger values of the log-likelihood indicate a better fit to the data. Because log-likelihood values are negative, values closer to 0 are larger. The log-likelihood depends on the sample data, so you cannot use the log-likelihood to compare models that are fit to different data sets.

The log-likelihood cannot decrease when you add terms to a model. For example, a model with 5 terms has higher log-likelihood than any of the 4-term models you can make with the same terms. Therefore, log-likelihood is most useful when you compare models of the same size. To make decisions about individual terms, you usually look at the p-values for the term in the different logits.
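Because the log-likelihood always increases when a term is added, a common way to judge whether one extra term is worthwhile is a likelihood-ratio test on the change in log-likelihood. Below is a sketch that uses hypothetical log-likelihood values (not from the output in this topic) for two nested models that differ by one term, so the test has 1 degree of freedom:

```python
import math

# Hypothetical log-likelihoods for two nested models that differ by one term.
ll_reduced = -65.0  # hypothetical: model without the extra term
ll_full = -62.0     # hypothetical: model with the extra term

# Likelihood-ratio statistic; chi-square distributed with df = 1 here.
G = 2 * (ll_full - ll_reduced)

# For df = 1, the chi-square upper-tail probability is erfc(sqrt(x / 2)).
p_value = math.erfc(math.sqrt(G / 2))

print(f"G = {G:.3f}, p = {p_value:.4f}")
```

A small p-value suggests that the extra term improves the fit beyond what chance alone would explain.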

Larger values for Somers' D, Goodman-Kruskal gamma, and Kendall's tau-a indicate that the model has better predictive ability. Somers' D and Goodman-Kruskal gamma can be between -1 and 1. Kendall's tau-a can be between -2/3 and 2/3. Values close to the maximum indicate that the model has good predictive ability. Values close to 0 indicate that the model does not have a predictive relationship with the response. Negative values are rare in practice because they indicate that the model predicts worse than a model that is unrelated to the response.

In this second set of results, the distance and the square of the distance are both predictors. You cannot use the log-likelihood to compare these models because they have different numbers of terms. The measures of association are higher for the second model, which indicates that the second model performs better than the first model.

Response Information
Variable            Value            Count
Return Appointment  Very Likely         19
                    Somewhat Likely     43
                    Unlikely            11
                    Total               73

Logistic Regression Table
                                                       Odds      95% CI
Predictor             Coef    SE Coef      Z      P   Ratio  Lower  Upper
Const(1)           6.38671    3.06110   2.09  0.037
Const(2)           9.31883    3.15929   2.95  0.003
Distance          -1.25608   0.523879  -2.40  0.017    0.28   0.10   0.80
Distance*Distance  0.0495427 0.0214636  2.31  0.021    1.05   1.01   1.10
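The odds ratio and its 95% confidence interval follow directly from the coefficient and its standard error: exponentiate the coefficient for the odds ratio, and exponentiate the coefficient plus or minus 1.96 standard errors for the interval limits. A sketch using the Distance row from the table above:

```python
import math

# Coefficient and standard error for Distance, from the table above.
coef, se_coef = -1.25608, 0.523879
z = 1.96  # normal critical value for a 95% confidence interval

odds_ratio = math.exp(coef)
ci_lower = math.exp(coef - z * se_coef)
ci_upper = math.exp(coef + z * se_coef)

print(f"Odds ratio: {odds_ratio:.2f}")              # 0.28
print(f"95% CI: ({ci_lower:.2f}, {ci_upper:.2f})")  # (0.10, 0.80)
```

The computed values reproduce the Odds Ratio, Lower, and Upper columns in the Distance row.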

Test of All Slopes Equal to Zero
DF      G  P-Value
 2  6.066    0.048

Goodness-of-Fit Tests
Method    Chi-Square   DF      P
Pearson      114.903  100  0.146
Deviance      94.779  100  0.629
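The p-values for the slopes test and the goodness-of-fit tests are chi-square upper-tail probabilities. For even degrees of freedom the chi-square survival function has a closed form, so these p-values can be checked without a statistics library. A sketch applied to the statistics above:

```python
import math

def chi2_sf_even_df(x, df):
    """Chi-square upper-tail probability P(X > x) for even df (closed form)."""
    m = df // 2
    half = x / 2
    term, total = 1.0, 1.0  # k = 0 term of sum((x/2)^k / k!, k = 0..m-1)
    for k in range(1, m):
        term *= half / k
        total += term
    return math.exp(-half) * total

print(f"{chi2_sf_even_df(6.066, 2):.3f}")      # 0.048 (test of all slopes equal to zero)
print(f"{chi2_sf_even_df(114.903, 100):.3f}")  # 0.146 (Pearson)
print(f"{chi2_sf_even_df(94.779, 100):.3f}")   # 0.629 (deviance)
```

Each computed probability matches the P column of the corresponding output table.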

Measures of Association:
(Between the Response Variable and Predicted Probabilities)
Pairs       Number  Percent  Summary Measures
Concordant     938     62.6  Somers’ D              0.29
Discordant     505     33.7  Goodman-Kruskal Gamma  0.30
Ties            56      3.7  Kendall’s Tau-a        0.16
Total         1499    100.0
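The summary measures can be reproduced directly from the concordant, discordant, and tied pair counts in the table above, together with the number of observations (n = 73 from the response information). A sketch:

```python
# Pair counts from the measures-of-association table (n = 73 observations).
concordant, discordant, ties = 938, 505, 56
n = 73

pairs = concordant + discordant + ties  # pairs with different response values
all_pairs = n * (n - 1) // 2            # all possible pairs of observations

somers_d = (concordant - discordant) / pairs
gamma = (concordant - discordant) / (concordant + discordant)
tau_a = (concordant - discordant) / all_pairs

print(f"Somers' D:             {somers_d:.2f}")  # 0.29
print(f"Goodman-Kruskal gamma: {gamma:.2f}")     # 0.30
print(f"Kendall's tau-a:       {tau_a:.2f}")     # 0.16
```

Note that Somers' D and gamma use only the pairs with different response values, while tau-a divides by all possible pairs, which is why its range is narrower.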