# Example of Binary Logistic Regression

A marketing consultant for a cereal company investigates the effectiveness of a TV advertisement for a new cereal product. The consultant shows the advertisement in a specific community for one week. Then the consultant randomly samples adults as they leave a local supermarket and asks whether they saw the advertisement and whether they bought the new cereal. The consultant also records each adult's annual household income.

Because the response is binary, the consultant uses binary logistic regression to determine how the advertisement and income are related to whether or not the adults sampled bought the cereal.

1. Open the sample data, CerealPurchase.MTW.
2. Open the Binary Logistic Regression dialog box.
• Mac: Statistics > Regression > Binary Logistic Regression
• PC: STATISTICS > Binary Logistic > Binary Logistic Regression
3. From the drop-down list, select Response in binary response/frequency format.
4. In Response, enter Bought.
5. In Continuous predictors, enter Income.
6. In Categorical predictor, enter ViewAd.
7. Click Options. In Confidence level, enter 90.
8. Click OK.

## Interpret the results

The Deviance table shows which predictors have a statistically significant relationship with the response. The consultant uses a significance level of 0.10. The results indicate that ViewAd has a statistically significant relationship with the response (p = 0.0388). Income does not, because its p-value (0.1213) is greater than 0.10. The consultant may want to refit the model without the income variable.

The odds ratio for the categorical predictor indicates that the odds of purchasing the cereal are about 3 times higher for adults who saw the advertisement than for adults who did not see it.
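Both odds ratios reported in the output follow directly from the model coefficients: an odds ratio is exp(coefficient). A quick check in Python, using the values from the Coefficients table:

```python
import math

# Odds ratio = exp(coefficient); coefficients are taken from the
# Coefficients table in the output below.
odds_ratio_viewad = math.exp(1.1203)   # ViewAd Yes vs. No
odds_ratio_income = math.exp(0.02656)  # per one-unit increase in Income
print(round(odds_ratio_viewad, 4), round(odds_ratio_income, 5))
```

These reproduce the reported values 3.06566 and 1.02691 up to rounding of the coefficients.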

The p-values for all of the goodness-of-fit tests are greater than the significance level of 0.05, which indicates that there is not enough evidence to conclude that the model does not fit the data. The deviance R2 value indicates that the model explains approximately 8.8% of the total deviance in the response.
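The deviance R2 can be reproduced from the Error and Total rows of the Deviance table. The adjusted form sketched below (penalizing by the regression degrees of freedom) is an assumption about the formula, but it matches the reported 6.53%:

```python
# Deviance R-sq = 1 - (error deviance / total deviance),
# using the Error and Total rows of the Deviance table.
error_dev, total_dev = 80.1551, 87.8963
reg_df = 2  # regression degrees of freedom

r_sq = 1 - error_dev / total_dev
# Assumed adjusted form: also charge the regression DF against the fit.
r_sq_adj = 1 - (error_dev + reg_df) / total_dev
print(f"{r_sq:.2%} {r_sq_adj:.2%}")  # 8.81% 6.53%
```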

**Regression Equations**

P(1) = exp(Y′)/(1 + exp(Y′))

ViewAd

- No: Y′ = −2.4148 + 0.02656 Income
- Yes: Y′ = −1.2946 + 0.02656 Income
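The two equations can be combined into a single fitted-probability calculation. A minimal sketch in Python, using a hypothetical income value of 40 for illustration (the units of the sample data's Income column are not shown here):

```python
import math

# Fitted probability of purchase, P(1) = exp(Y')/(1 + exp(Y')),
# using the coefficients from the regression equations above.
def p_purchase(income, viewed_ad):
    intercept = -1.2946 if viewed_ad else -2.4148  # "Yes" vs. "No" equation
    y = intercept + 0.02656 * income
    return math.exp(y) / (1 + math.exp(y))

# Hypothetical income of 40, with and without seeing the ad.
p_no = p_purchase(40, viewed_ad=False)
p_yes = p_purchase(40, viewed_ad=True)
print(round(p_no, 3), round(p_yes, 3))  # approximately 0.205 and 0.442
```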
**Response Information**

| Variable | Value | Count |
|---|---|---|
| Bought | 1 (Event) | 22 |
| | 0 | 49 |
| | Total | 71 |
**Deviance Table**

| Source | DF | Adj Dev | Adj Mean | Chi-Square | P-Value |
|---|---|---|---|---|---|
| Regression | 2 | 7.7412 | 3.87058 | 7.74 | 0.0208 |
| Income | 1 | 2.4005 | 2.40054 | 2.40 | 0.1213 |
| ViewAd | 1 | 4.2686 | 4.26858 | 4.27 | 0.0388 |
| Error | 68 | 80.1551 | 1.17875 | | |
| Total | 70 | 87.8963 | | | |
**Model Summary**

| Deviance R-sq | Deviance R-sq(adj) | AIC |
|---|---|---|
| 8.81% | 6.53% | 86.16 |
**Coefficients**

| Term | Coef | SE Coef | 90% CI | Z-Value | P-Value |
|---|---|---|---|---|---|
| Constant | −2.4148 | 0.7874 | (−3.7099, −1.1197) | −3.07 | 0.0022 |
| Income | 0.02656 | 0.01751 | (−0.00224, 0.05535) | 1.52 | 0.1292 |
| ViewAd Yes | 1.1203 | 0.5543 | (0.2085, 2.0320) | 2.02 | 0.0433 |
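Each 90% confidence interval in the Coefficients table is the coefficient plus or minus z × SE, with z ≈ 1.645 for 90% confidence. A quick check on the Constant row (the small discrepancy in the last digit comes from rounding z):

```python
# 90% CI = Coef +/- z * (SE Coef); z is approximately 1.645 for 90%.
z_90 = 1.645
coef, se = -2.4148, 0.7874  # Constant row of the Coefficients table
lower, upper = coef - z_90 * se, coef + z_90 * se
print(round(lower, 4), round(upper, 4))  # close to the reported (-3.7099, -1.1197)
```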
**Odds Ratios for Continuous Predictor**

| | Odds Ratio | 90% CI |
|---|---|---|
| Income | 1.02691 | (0.998, 1.057) |
**Odds Ratios for Categorical Predictor**

| | Level A | Level B | Odds Ratio | 90% CI |
|---|---|---|---|---|
| ViewAd | Yes | No | 3.06566 | (1.232, 7.630) |

Odds ratio for level A relative to level B.
**Goodness-of-Fit Tests**

| Test | DF | Chi-Square | P-Value |
|---|---|---|---|
| Deviance | 68 | 80.16 | 0.1486 |
| Pearson | 68 | 71.96 | 0.3483 |
| Hosmer-Lemeshow | 8 | 9.71 | 0.2859 |
**Fits and Diagnostics for Unusual Observations**

| Obs | Observed Probability | Fit | Resid | Std Resid | |
|---|---|---|---|---|---|
| 48 | 1 | 0.106913 | 2.11459 | 2.16 | R |
| 50 | 1 | 0.138176 | 1.98959 | 2.02 | R |

R denotes an observation with a large residual.