The stepwise procedure removes and adds terms to the model to identify a useful subset of terms. If you choose a stepwise procedure, the terms that you specify in the Model dialog box are candidates for the final model. For more information, go to Using stepwise regression and best subsets regression.
With cross-validation, the procedure repeats forward selection on each fold. The procedure evaluates all the folds at each step and identifies the step with the best k-fold stepwise R² value. Finally, the procedure performs forward selection on the full data set, stopping at the best step identified from the folds.
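The following sketch illustrates the idea of forward selection guided by k-fold cross-validation. It is a simplified variant of the procedure described above, not Minitab's exact algorithm: instead of repeating forward selection separately on each fold, it scores each candidate step by its mean k-fold R² directly, then refits the model from the best step on the full data set. All variable names are illustrative, and the data are simulated.

```python
# Simplified sketch: forward selection with k-fold cross-validation.
# Not Minitab's exact algorithm; names and data are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

rng = np.random.default_rng(0)
n = 60
X = rng.normal(size=(n, 4))          # columns are candidate terms
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.5, size=n)

selected, remaining = [], list(range(X.shape[1]))
history = []                         # (terms at this step, k-fold R²)
kf = KFold(n_splits=5, shuffle=True, random_state=0)

while remaining:
    # Try adding each remaining term; keep the one with the best k-fold R².
    scores = {}
    for t in remaining:
        cols = selected + [t]
        scores[t] = cross_val_score(LinearRegression(), X[:, cols], y,
                                    cv=kf, scoring="r2").mean()
    best = max(scores, key=scores.get)
    selected.append(best)
    remaining.remove(best)
    history.append((list(selected), scores[best]))

# Stop at the step with the best cross-validated R², then refit on all data.
best_terms, best_r2 = max(history, key=lambda h: h[1])
final_model = LinearRegression().fit(X[:, best_terms], y)
```

Because the stopping rule uses held-out R² rather than the in-sample fit, steps that add noise terms tend not to improve the score, so the selection stops near the true model size.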
For both types of validation, the procedure stops under the same conditions as the forward information criteria procedure.
The terms that are included in the final model can depend on hierarchy restrictions for models. For more information, see the topic on Hierarchy below.
Specify which information criterion to use in forward selection.
Both AICc and BIC assess the likelihood of the model and then apply a penalty for adding terms to the model. The penalty reduces the tendency to overfit the model to the sample data. This reduction can yield a model that performs better at predicting new observations.
As a general guideline, when the number of parameters is small relative to the sample size, BIC has a larger penalty for the addition of each parameter than AICc. In these cases, the model that minimizes BIC tends to be smaller than the model that minimizes AICc.
In some common cases, such as screening designs, the number of parameters is usually large relative to the sample size. In these cases, the model that minimizes AICc tends to be smaller than the model that minimizes BIC. For example, for a 13-run definitive screening design, the model that minimizes AICc will tend to be smaller than the model that minimizes BIC among the set of models with 6 or more parameters.
For more information on AICc and BIC, see Burnham and Anderson.^{1}
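The contrast between the two penalties can be seen from the standard least-squares forms of the criteria. In the sketch below, `sse` is the residual sum of squares, `n` the sample size, and `k` the number of estimated parameters; note that counting conventions for `k` vary (whether the intercept and error variance are included), so treat the exact values as illustrative. The constant term common to all models is dropped, which does not affect model comparisons.

```python
# Least-squares forms of AICc and BIC (Gaussian errors, additive
# constant dropped). Counting conventions for k vary; illustrative only.
import numpy as np

def aicc_bic(sse, n, k):
    """Return (AICc, BIC) for a least-squares fit.

    sse: residual sum of squares; n: sample size;
    k: number of estimated parameters.
    """
    aic = n * np.log(sse / n) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)   # small-sample correction
    bic = n * np.log(sse / n) + k * np.log(n)
    return aicc, bic

# When n is large relative to k, BIC's per-parameter penalty (ln n) exceeds
# AICc's (about 2), so BIC favors smaller models. When k is large relative
# to n, the AICc correction term dominates and the ordering reverses.
```

For example, at n = 100 the cost of one more parameter is ln(100) ≈ 4.6 under BIC but only about 2.1 under AICc, while at n = 13 with k going from 6 to 7 (as in a definitive screening design), the AICc correction makes the added parameter far more expensive under AICc than under BIC.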
Validation settings are also in the Validation subdialog box. If you change the settings, Minitab automatically updates the settings in both places.
When you select Forward selection with validation, choose the validation method to test your model. Usually, with smaller samples, the K-fold cross-validation method is appropriate. With larger samples, you can divide the data into a training data set and a test data set.
Complete the following steps to use K-fold cross-validation.
Complete the following steps to divide the data into a training data set and a test data set.
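The training/test approach can be sketched as follows, here using scikit-learn's `train_test_split` on simulated data rather than Minitab. The model is fit on the training portion only, and the test R² measures how well it predicts data it never saw; the split fraction and names are illustrative.

```python
# Sketch of a training/test split: fit on training data, score on test data.
# Data, split fraction, and names are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, 0.5, 0.0]) + rng.normal(scale=0.3, size=200)

# Hold out 30% of the rows as a test set.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1)

model = LinearRegression().fit(X_train, y_train)   # fit on training data only
test_r2 = model.score(X_test, y_test)              # R² on unseen test data
```

A large gap between the training R² and the test R² is a sign that the model is overfit to the training data.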
You can determine how Minitab enforces model hierarchy during a stepwise procedure. The Hierarchy button is disabled if you specify a non-hierarchical model in the Model dialog box.
In a hierarchical model, all lower-order terms that comprise the higher-order terms also appear in the model. For example, a model that includes the interaction term A*B*C is hierarchical if it includes these terms: A, B, C, A*B, A*C, and B*C.
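The hierarchy rule can be stated as a simple check: for every term in the model, every lower-order term formed from a subset of its factors must also be in the model. A minimal sketch, representing each term as a set of factor names (an assumption of this illustration, not Minitab's internal representation):

```python
# Sketch of a model-hierarchy check. Each term is a set of factor
# names, e.g. {'A', 'B'} for the A*B interaction.
from itertools import combinations

def is_hierarchical(terms):
    """Return True if every lower-order term composing a higher-order
    term is itself present in the model."""
    term_set = {frozenset(t) for t in terms}
    for term in term_set:
        for r in range(1, len(term)):
            for sub in combinations(term, r):
                if frozenset(sub) not in term_set:
                    return False
    return True
```

With this check, the example from the text holds: a model containing A*B*C together with A, B, C, A*B, A*C, and B*C is hierarchical, but dropping A*B breaks hierarchy.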
Models can be non-hierarchical. Generally, you can remove lower-order terms when they are insignificant, unless subject-area knowledge suggests that you include them. Models that contain too many terms can be relatively imprecise and can reduce the ability to predict the values of new observations.
When you choose Forward selection with validation, you can display a plot of the training and validation R² values for each step in the forward selection. Typically, you use the plot to determine whether simpler models have similar validation values.