Method table for Analyze Variability

The method table displays whether you used the least squares (LSE) or maximum likelihood (MLE) estimation method.
Least squares method
Least squares estimates are calculated by fitting the regression line that minimizes the sum of the squared deviations between the observed points and the line (the least squares error).
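As a minimal sketch of the idea (hypothetical data, not the software's implementation), the fitted line is the one that minimizes the sum of squared residuals:

```python
import numpy as np

# Hypothetical data points
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Fit y = b0 + b1*x by minimizing the sum of squared residuals
X = np.column_stack([np.ones_like(x), x])
b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]

# The quantity that least squares minimizes
residuals = y - (b0 + b1 * x)
sse = np.sum(residuals ** 2)
```

Any other choice of intercept and slope would give a larger value of `sse` for these points.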
Maximum likelihood method
The likelihood function indicates how likely the observed sample is as a function of the possible parameter values. Therefore, maximizing the likelihood function determines the parameter values that are most likely to have produced the observed data. From a statistical point of view, MLE is generally recommended for large samples because it is versatile, applicable to most models and different types of data, and produces the most precise estimates.
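As an illustration on simulated data (not the software's procedure), the MLE of a normal variance has a closed form that divides by n, which makes it biased for small samples; the bias shrinks as the sample grows:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=10.0, scale=2.0, size=5)  # small simulated sample

# For a normal sample, maximizing the likelihood has a closed-form
# solution: the sample mean and the n-divisor variance.
mu_mle = data.mean()
var_mle = np.mean((data - mu_mle) ** 2)   # divides by n -> biased low
var_unbiased = data.var(ddof=1)           # divides by n - 1

# For n = 5 the MLE variance is 4/5 of the unbiased estimate;
# the gap shrinks toward zero as the sample size increases.
```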

Comparison of methods

In many cases, the differences between the LSE and MLE results are minor, and the methods can be used interchangeably. You may want to run both methods and see whether the results confirm one another. If the results differ, try to determine why; otherwise, use the more conservative estimates, or weigh the advantages of each approach and choose the one best suited to your problem.
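One sketch of such a side-by-side check, using simulated data: with normally distributed errors, the ML coefficient estimates solve the same normal equations as least squares, so the coefficients agree while the error-variance estimates differ.

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(0.0, 1.0, 20)
y = 1.5 + 3.0 * x + rng.normal(scale=0.5, size=x.size)  # simulated data

X = np.column_stack([np.ones_like(x), x])

# Least squares coefficients
beta_ls = np.linalg.lstsq(X, y, rcond=None)[0]

# With normal errors, the ML coefficient estimates solve the same
# normal equations, so the two sets of coefficients coincide
beta_ml = np.linalg.solve(X.T @ X, X.T @ y)

# The error-variance estimates differ: MLE divides by n,
# while the LS-based estimate divides by n - p
resid = y - X @ beta_ml
sigma2_mle = np.mean(resid ** 2)
sigma2_ls = np.sum(resid ** 2) / (len(y) - X.shape[1])
```

Here the two methods confirm one another on the coefficients; the smaller MLE variance estimate matches the comparison table below.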

                    LSE                                         MLE
Biased              No                                          Yes for small samples, but bias decreases as sample size increases
Estimate variance   Larger                                      Smaller
P-values            More precise                                Less precise
Coefficients        Less precise                                More precise
Censored data       Less reliable; unusable in extreme cases    More reliable, even in extreme cases

Based on their relative strengths, LSE and MLE can be used together for different parts of the analysis. Use LSE's more precise p-values to select the terms to include in the model and use MLE to estimate the final coefficients.
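A rough sketch of this two-step workflow on simulated data (the variable names, the `ls_fit` helper, and the 0.05 cutoff are illustrative choices, not part of the software's output):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 30
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)                       # term unrelated to y
y = 2.0 + 1.5 * x1 + rng.normal(scale=0.4, size=n)

def ls_fit(X, y):
    """OLS coefficients and two-sided t-test p-values per term."""
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    s2 = resid @ resid / (n - p)
    se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
    pvals = 2 * stats.t.sf(np.abs(beta / se), df=n - p)
    return beta, pvals

# Step 1: screen terms using the LS p-values
X_full = np.column_stack([np.ones(n), x1, x2])
_, pvals = ls_fit(X_full, y)
keep = pvals < 0.05                           # illustrative cutoff

# Step 2: re-estimate coefficients for the kept terms; with normal
# errors the ML solution satisfies the same normal equations
X_final = X_full[:, keep]
beta_final = np.linalg.solve(X_final.T @ X_final, X_final.T @ y)
```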
