What is Nonlinear Regression?

Nonlinear regression generates an equation to describe the nonlinear relationship between a continuous response variable and one or more predictor variables, and predicts new observations. Use nonlinear regression instead of ordinary least squares regression when you cannot adequately model the relationship with linear parameters. Parameters are linear when each term in the model is additive and contains only one parameter that multiplies the term.

Comparison of nonlinear and linear regression

For a basic understanding of nonlinear regression, it is important to understand how it is similar to, and different from, linear regression.

Similarities

Both analyses:
  • Mathematically describe the relationship between a response variable and one or more predictor variables.
  • Can model a curved relationship.
  • Minimize the sum of squares of the residual error (SSE).
  • Have the same assumptions that you can check using residual plots.

Differences

The fundamental difference between linear and nonlinear regression, and the basis for the analyses' names, is the acceptable functional form of the model. Specifically, linear regression requires linear parameters while nonlinear regression does not. Use nonlinear regression instead of linear regression when you cannot adequately model the relationship with linear parameters.

A linear regression function must be linear in the parameters, which constrains the equation to one basic form. Parameters are linear when each term in the model is additive and contains only one parameter that multiplies the term:

Response = constant + parameter * predictor + ... + parameter * predictor

or y = β0 + β1X1 + β2X2 + ... + βkXk
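
As a minimal illustration of why linear parameters are convenient (this is ordinary least squares in NumPy, not Minitab, and the data values below are invented), the coefficients can be estimated in a single closed-form solve:

  # Minimal sketch: with linear parameters, ordinary least squares has a
  # direct (non-iterative) solution. The data values are invented.
  import numpy as np

  X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
  X2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
  y  = np.array([6.1, 6.9, 11.2, 11.8, 16.0])

  # Design matrix: a column of ones for the constant plus one column per predictor
  X = np.column_stack([np.ones_like(X1), X1, X2])

  # One least squares solve returns (constant, beta1, beta2); no starting values needed
  beta, *_ = np.linalg.lstsq(X, y, rcond=None)
  print(beta)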

However, a nonlinear equation can take many different forms. In fact, because there are infinitely many possible forms, you must specify the expectation function that Minitab uses to perform nonlinear regression. These examples illustrate that variety (the θ's represent the parameters); a short fitting sketch follows the list:
  • y = θX (Convex 2, 1 parameter, 1 predictor)
  • y = θ1 * X1 / ( θ2 + X1 ) (Michaelis-Menten equation, 2 parameters, 1 predictor)
  • y = θ1 - θ2 * ( ln ( X1 + θ3 ) - ln ( X2 )) (Nernst equation, 3 parameters, 2 predictors)
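
As a rough illustration (this is not Minitab, and the data and starting values below are invented), the Michaelis-Menten form from the second bullet can be supplied to a general-purpose nonlinear least squares routine such as scipy.optimize.curve_fit:

  # Minimal sketch: fitting the Michaelis-Menten expectation function with
  # SciPy's general-purpose nonlinear least squares routine.
  # The data and the starting values p0 are illustrative assumptions only.
  import numpy as np
  from scipy.optimize import curve_fit

  def michaelis_menten(x, theta1, theta2):
      # y = theta1 * X1 / (theta2 + X1)
      return theta1 * x / (theta2 + x)

  x = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])   # predictor, e.g. substrate concentration
  y = np.array([0.9, 1.6, 2.4, 3.1, 3.6, 3.9])    # observed response

  # Starting values are required because the fit is found iteratively
  popt, pcov = curve_fit(michaelis_menten, x, y, p0=[4.0, 2.0])
  print(popt)   # estimated (theta1, theta2)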

Your choice for the expectation function often depends on previous knowledge about the response curve's shape or the behavior of physical and chemical properties in the system. Potential nonlinear shapes include concave, convex, exponential growth or decay, sigmoidal (S), and asymptotic curves. You must specify the function that satisfies both the requirements of your previous knowledge and the nonlinear regression assumptions.

While the flexibility to specify many different expectation functions is very powerful, finding the function that provides the optimal fit for your data can require considerable effort. It often takes additional research, subject-area knowledge, and trial-and-error analysis. In addition, for nonlinear equations, determining the effect each predictor has on the response can be less intuitive than it is for linear equations.

Nonlinear regression also uses a different estimation procedure than linear regression to minimize the sum of squares of the residual error (SSE). Because the parameters do not enter the model linearly, there is no closed-form solution; instead, the estimates are found with an iterative algorithm, such as Gauss-Newton or Levenberg-Marquardt, that starts from initial parameter values you supply and refines them until the SSE no longer decreases appreciably.
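
To make the idea of an iterative procedure concrete, here is a bare-bones Gauss-Newton sketch for the same Michaelis-Menten model and invented data used above. It only illustrates the general approach, not the algorithm Minitab implements:

  # Bare-bones Gauss-Newton iteration for nonlinear least squares.
  # Illustrative only; the data and starting values are assumptions.
  import numpy as np

  def model(x, theta1, theta2):
      return theta1 * x / (theta2 + x)

  def gauss_newton(x, y, theta, n_iter=20):
      theta = np.asarray(theta, dtype=float)
      for _ in range(n_iter):
          theta1, theta2 = theta
          resid = y - model(x, theta1, theta2)      # residuals whose SSE we minimize
          # Jacobian of the model with respect to (theta1, theta2)
          J = np.column_stack([
              x / (theta2 + x),                     # d f / d theta1
              -theta1 * x / (theta2 + x) ** 2,      # d f / d theta2
          ])
          # Least squares solve of J * delta ~= resid gives the update step
          delta, *_ = np.linalg.lstsq(J, resid, rcond=None)
          theta = theta + delta
      return theta

  x = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
  y = np.array([0.9, 1.6, 2.4, 3.1, 3.6, 3.9])
  print(gauss_newton(x, y, theta=[4.0, 2.0]))       # starting values are required

Statistical packages build on this basic step with convergence checks, damping of the update (as in Levenberg-Marquardt), and standard error calculations.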