A nonparametric test is a hypothesis test that does not require the population's distribution to be characterized by certain parameters. For example, many hypothesis tests rely on the assumption that the population follows a normal distribution with parameters μ and σ. Nonparametric tests do not have this assumption, so they are useful when your data are strongly nonnormal and resistant to transformation.
In parametric statistics, we assume that samples are drawn from fully specified distributions characterized by one or more unknown parameters about which we want to make inferences. In a nonparametric method, the parent distribution of the sample is left unspecified, and we are often interested in making inferences about the center of the distribution. For example, many tests in parametric statistics, such as the 1-sample t-test, are derived under the assumption that the data come from a normal population with unknown mean. In a nonparametric test, the normality assumption is removed.
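To make the contrast concrete, the sketch below runs a parametric 1-sample t-test and a nonparametric 1-sample Wilcoxon signed-rank test against the same hypothesized center, using SciPy. The skewed sample and the hypothesized center of 5.0 are made-up values chosen only for illustration, not data from any particular study.

```python
# A minimal sketch contrasting a parametric and a nonparametric 1-sample test.
# The data and the hypothesized center (5.0) are made up for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
data = rng.exponential(scale=5.0, size=30)   # strongly right-skewed sample
hypothesized_center = 5.0

# Parametric: the 1-sample t-test assumes the data come from a normal population.
t_stat, t_p = stats.ttest_1samp(data, popmean=hypothesized_center)

# Nonparametric: the Wilcoxon signed-rank test drops the normality assumption
# and tests whether the differences from the hypothesized center are
# symmetrically distributed about zero.
w_stat, w_p = stats.wilcoxon(data - hypothesized_center)

print(f"1-sample t-test:  statistic={t_stat:.3f}, p={t_p:.3f}")
print(f"Wilcoxon test:    statistic={w_stat:.3f}, p={w_p:.3f}")
```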
When a choice exists between a parametric and a nonparametric procedure, and you are relatively certain that the assumptions for the parametric procedure are satisfied, use the parametric procedure. You may also be able to use the parametric procedure when the population is not normally distributed, provided the sample size is adequately large, because the sampling distribution of the mean is then approximately normal.
The following table lists common nonparametric tests and their parametric alternatives.
| Nonparametric test | Alternative parametric test |
|---|---|
| 1-sample sign test | 1-sample Z-test, 1-sample t-test |
| 1-sample Wilcoxon test | 1-sample Z-test, 1-sample t-test |
| Mann-Whitney test | 2-sample t-test |
| Kruskal-Wallis test | One-way ANOVA |
| Mood's median test | One-way ANOVA |
| Friedman test | Two-way ANOVA |
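Most of the tests in the table have direct counterparts in SciPy, as sketched below on made-up group data generated only for illustration. The 1-sample sign test has no dedicated function in this sketch, so it is approximated with a binomial test on the count of observations above a hypothesized median, which is one common way to compute it; the hypothesized median of 3.0 is an arbitrary illustrative value.

```python
# A minimal sketch pairing nonparametric tests with their parametric
# alternatives using SciPy. All data are made up for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.lognormal(mean=1.0, sigma=0.6, size=25)
group_b = rng.lognormal(mean=1.3, sigma=0.6, size=25)
group_c = rng.lognormal(mean=1.5, sigma=0.6, size=25)

# Mann-Whitney test vs. 2-sample t-test (two independent groups)
print(stats.mannwhitneyu(group_a, group_b, alternative="two-sided"))
print(stats.ttest_ind(group_a, group_b))

# Kruskal-Wallis test and Mood's median test vs. one-way ANOVA (3+ groups)
print(stats.kruskal(group_a, group_b, group_c))
print(stats.median_test(group_a, group_b, group_c))
print(stats.f_oneway(group_a, group_b, group_c))

# 1-sample sign test: count observations above a hypothesized median and
# apply a binomial test with p = 0.5 (observations equal to the median are dropped).
hypothesized_median = 3.0
above = np.sum(group_a > hypothesized_median)
n = np.sum(group_a != hypothesized_median)
print(stats.binomtest(int(above), int(n), p=0.5))
```

The Friedman test is the exception here: it requires matched (repeated) measurements on the same subjects rather than independent groups, and SciPy provides it as `scipy.stats.friedmanchisquare`.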