For a probability p in the closed interval [0, 1], the inverse cumulative distribution function (ICDF) of a random variable X determines, where possible, the smallest value x such that the probability of X ≤ x is greater than or equal to p.
The ICDF gives the value associated with a specified area under the probability density function. It is the inverse of the cumulative distribution function (CDF), which gives the area associated with a specified value.
For all continuous distributions, the ICDF exists and is unique if 0 < p < 1.
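As an illustration of the ICDF/CDF relationship, Python's standard-library `statistics.NormalDist` exposes both directions for the normal distribution (the parameter values below are arbitrary):

```python
# Illustrative sketch: the ICDF inverts the CDF of a continuous
# distribution, here the standard normal from Python's stdlib.
from statistics import NormalDist

dist = NormalDist(mu=0.0, sigma=1.0)

p = 0.975
x = dist.inv_cdf(p)   # ICDF: the value x with P(X <= x) = p
area = dist.cdf(x)    # CDF: the area to the left of x, recovering p
```

Because the normal CDF is continuous and strictly increasing, the round trip `cdf(inv_cdf(p))` returns p up to floating-point error.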
The binomial distribution is used to represent the number of events that occur in n independent trials, each with event probability p. Possible values are the integers from 0 to n.
mean = np
variance = np(1 – p)
The probability mass function (PMF) is:

P(X = x) = C(n, x) p^{x}(1 − p)^{n − x}, for x = 0, 1, …, n

where C(n, x) equals n! / (x!(n − x)!). In general, you can calculate k! as k! = k(k − 1)(k − 2) ⋯ (2)(1).
Term | Description |
---|---|
n | number of trials |
x | number of events |
p | event probability |
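The binomial PMF and moments can be checked with a few lines of Python (`binom_pmf` is an illustrative helper, not a library function):

```python
from math import comb

def binom_pmf(x, n, p):
    # P(X = x) = C(n, x) * p^x * (1 - p)^(n - x)
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

n, p = 10, 0.5
probs = [binom_pmf(x, n, p) for x in range(n + 1)]
mean = sum(x * q for x, q in zip(range(n + 1), probs))  # should equal n*p
```

The probabilities over all possible values 0..n sum to 1, and the computed mean matches np.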
If X has a standard normal distribution, then X^{2} has a chi-square distribution with one degree of freedom, which makes the chi-square a commonly used sampling distribution.
The sum of n independent X^{2} variables (where X has a standard normal distribution) has a chi-square distribution with n degrees of freedom. The shape of the chi-square distribution depends on the number of degrees of freedom.
The probability density function (PDF) is:

f(x) = x^{(ν/2) − 1} e^{−x/2} / (2^{ν/2} Γ(ν/2)), for x > 0
mean = ν
variance = 2ν
Term | Description |
---|---|
ν | degrees of freedom |
Γ | gamma function |
e | base of the natural logarithm |
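The chi-square PDF can be evaluated directly with the gamma function from Python's math module (`chi2_pdf` is an illustrative name):

```python
from math import gamma, exp

def chi2_pdf(x, v):
    # f(x) = x^(v/2 - 1) * e^(-x/2) / (2^(v/2) * Gamma(v/2)), for x > 0
    return x ** (v / 2 - 1) * exp(-x / 2) / (2 ** (v / 2) * gamma(v / 2))
```

For ν = 2 the PDF reduces to exp(−x/2)/2, which gives a quick hand check.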
A discrete distribution is one that you define yourself. For example, suppose you are interested in a distribution made up of three values, −1, 0, and 1, with probabilities 0.2, 0.5, and 0.3, respectively. If you enter the values into columns of a worksheet, then you can use these columns to generate random data or to calculate probabilities.
Value | Prob |
---|---|
−1 | 0.2 |
0 | 0.5 |
1 | 0.3 |
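The table above translates directly into code; the sketch below computes the theoretical mean and variance of the user-defined distribution and draws a random sample with Python's `random.choices`:

```python
import random

values = [-1, 0, 1]
probs = [0.2, 0.5, 0.3]

# Theoretical moments of the user-defined discrete distribution
mean = sum(v * p for v, p in zip(values, probs))               # 0.1
var = sum((v - mean) ** 2 * p for v, p in zip(values, probs))  # 0.49

# Generate random data from the distribution
random.seed(1)
sample = random.choices(values, weights=probs, k=10_000)
```

With 10,000 draws the sample mean lands close to the theoretical mean of 0.1.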
The exponential distribution can be used to model time between failures when units have a constant instantaneous rate of failure (hazard function). The exponential distribution is a special case of both the Weibull distribution and the gamma distribution.
The probability density function (PDF) is:

f(x) = (1/θ) exp(−(x − λ)/θ), for x ≥ λ

The cumulative distribution function (CDF) is:

F(x) = 1 − exp(−(x − λ)/θ), for x ≥ λ
mean = θ + λ
variance = θ^{2}
Term | Description |
---|---|
θ | scale parameter |
λ | threshold parameter |
exp | base of the natural logarithm |
Some references use the rate 1 / θ as the parameter instead of the scale θ.
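A small sketch of the two-parameter (scale θ, threshold λ) form above; the function names are illustrative:

```python
from math import exp

def expon_pdf(x, theta, lam=0.0):
    # f(x) = (1/theta) * exp(-(x - lam)/theta), for x >= lam
    return (1.0 / theta) * exp(-(x - lam) / theta)

def expon_cdf(x, theta, lam=0.0):
    # F(x) = 1 - exp(-(x - lam)/theta), for x >= lam
    return 1.0 - exp(-(x - lam) / theta)
```

With λ = 0 the mean is θ, so `expon_cdf(theta, theta)` evaluates to 1 − e^{−1} ≈ 0.632: about 63% of failures occur before the mean time between failures.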
The F-distribution is also known as the variance-ratio distribution and has two types of degrees of freedom: numerator degrees of freedom and denominator degrees of freedom. It is the distribution of the ratio of two independent random variables with chi-square distributions, each divided by its degrees of freedom.
The probability density function (PDF) is:

f(x) = [Γ((u + v)/2) / (Γ(u/2) Γ(v/2))] (u/v)^{u/2} x^{(u/2) − 1} (1 + ux/v)^{−(u + v)/2}, for x > 0
Term | Description |
---|---|
Γ | gamma function |
u | numerator degrees of freedom |
v | denominator degrees of freedom |
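The F density above can be evaluated with Python's math.gamma (`f_pdf` is an illustrative helper):

```python
from math import gamma

def f_pdf(x, u, v):
    # f(x) = Gamma((u+v)/2) / (Gamma(u/2) * Gamma(v/2))
    #        * (u/v)^(u/2) * x^(u/2 - 1) * (1 + u*x/v)^(-(u+v)/2)
    c = gamma((u + v) / 2) / (gamma(u / 2) * gamma(v / 2))
    return c * (u / v) ** (u / 2) * x ** (u / 2 - 1) * (1 + u * x / v) ** (-(u + v) / 2)
```

For u = v = 2 the density simplifies to (1 + x)^{−2}, a convenient hand check.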
The discrete geometric distribution applies to a sequence of independent Bernoulli experiments with an event of interest that has probability p.
If the random variable X is the total number of trials necessary to produce one event with probability p, then the probability mass function (PMF) of X is given by:

P(X = x) = p(1 − p)^{x − 1}, for x = 1, 2, …

and X exhibits the following properties:

mean = 1/p
variance = (1 − p)/p^{2}

If the random variable Y is the number of nonevents that occur before the first event (with probability p) is observed, then the probability mass function (PMF) of Y is given by:

P(Y = y) = p(1 − p)^{y}, for y = 0, 1, 2, …

and Y exhibits the following properties:

mean = (1 − p)/p
variance = (1 − p)/p^{2}
Term | Description |
---|---|
X | number of trials to produce one event, Y + 1 |
Y | number of nonevents that occur before the first event |
p | probability that an event occurs on each trial |
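Both parameterizations of the geometric distribution can be checked numerically (the helper names are illustrative; the truncated sums converge rapidly):

```python
def geom_pmf_trials(x, p):
    # P(X = x) = p * (1 - p)^(x - 1), for x = 1, 2, ...
    return p * (1 - p) ** (x - 1)

def geom_pmf_nonevents(y, p):
    # P(Y = y) = p * (1 - p)^y, for y = 0, 1, 2, ...
    return p * (1 - p) ** y

p = 0.25
mean_X = sum(x * geom_pmf_trials(x, p) for x in range(1, 5000))  # -> 1/p
mean_Y = sum(y * geom_pmf_nonevents(y, p) for y in range(5000))  # -> (1-p)/p
```

The two means differ by exactly 1, matching the relationship X = Y + 1 from the table above.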
The integer distribution is a discrete uniform distribution on a set of integers. Each integer in the set has equal probability of occurring; if the set contains m integers, each occurs with probability 1/m.
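A minimal sketch, assuming the set runs over consecutive integers a through b (`integer_pmf` is an illustrative helper name):

```python
import random

def integer_pmf(x, a, b):
    # Discrete uniform on the integers a, a+1, ..., b:
    # each of the (b - a + 1) values has probability 1/(b - a + 1)
    return 1.0 / (b - a + 1) if a <= x <= b else 0.0

random.seed(7)
roll = random.randint(1, 6)  # one draw from the integers 1..6 (a fair die)
```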
A variable x has a lognormal distribution if ln(x − λ) has a normal distribution.
The probability density function (PDF) is:

f(x) = exp(−(ln(x − λ) − μ)^{2} / (2σ^{2})) / ((x − λ) σ √(2π)), for x > λ

The cumulative distribution function (CDF) is:

F(x) = Φ((ln(x − λ) − μ) / σ), for x > λ, where Φ is the CDF of the standard normal distribution
Term | Description |
---|---|
μ | location parameter |
σ | scale parameter |
λ | threshold parameter |
π | Pi (~3.142) |
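The lognormal PDF above can be written out directly (`lognorm_pdf` is an illustrative name):

```python
from math import exp, log, pi, sqrt

def lognorm_pdf(x, mu, sigma, lam=0.0):
    # f(x) = exp(-(ln(x - lam) - mu)^2 / (2*sigma^2))
    #        / ((x - lam) * sigma * sqrt(2*pi)), for x > lam
    z = (log(x - lam) - mu) / sigma
    return exp(-z * z / 2) / ((x - lam) * sigma * sqrt(2 * pi))
```

At x − λ = e^{μ} the exponent vanishes, so the density equals 1 / ((x − λ) σ √(2π)), an easy sanity check.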
The normal distribution (also called the Gaussian distribution) is the most widely used statistical distribution because of the many physical, biological, and social processes that it can model.
The probability density function (PDF) is:

f(x) = exp(−(x − μ)^{2} / (2σ^{2})) / (σ √(2π))

The cumulative distribution function (CDF) is:

F(x) = ∫_{−∞}^{x} exp(−(t − μ)^{2} / (2σ^{2})) / (σ √(2π)) dt
mean = μ
variance = σ ^{2}
standard deviation = σ
Term | Description |
---|---|
exp | base of the natural logarithm |
π | Pi (~3.142) |
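The normal PDF is short enough to write by hand and compare against Python's stdlib `statistics.NormalDist` (the helper name is illustrative):

```python
from math import exp, pi, sqrt
from statistics import NormalDist

def normal_pdf(x, mu=0.0, sigma=1.0):
    # f(x) = exp(-(x - mu)^2 / (2*sigma^2)) / (sigma * sqrt(2*pi))
    z = (x - mu) / sigma
    return exp(-z * z / 2) / (sigma * sqrt(2 * pi))

# Cross-check against the standard library's implementation
ref = NormalDist(2.0, 3.0).pdf(1.5)
```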
The Poisson distribution is a discrete distribution that models the number of events based on a constant rate of occurrence. The Poisson distribution can be used as an approximation to the binomial when the number of independent trials is large and the probability of success is small.
The probability mass function (PMF) is:

P(X = x) = e^{−λ} λ^{x} / x!, for x = 0, 1, 2, …
mean = λ
variance = λ
Term | Description |
---|---|
e | base of the natural logarithm |
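The Poisson PMF can be checked numerically; the truncated sum below converges to the mean λ (`poisson_pmf` is an illustrative name):

```python
from math import exp, factorial

def poisson_pmf(x, lam):
    # P(X = x) = e^(-lam) * lam^x / x!
    return exp(-lam) * lam ** x / factorial(x)

lam = 3.0
mean = sum(x * poisson_pmf(x, lam) for x in range(100))  # converges to lam
```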
The t-distribution is a symmetric, bell-shaped distribution with heavier tails than the normal distribution; as the degrees of freedom ν increase, it approaches the standard normal distribution.

The probability density function (PDF) is:

f(x) = [Γ((ν + 1)/2) / (√(νπ) Γ(ν/2))] (1 + x^{2}/ν)^{−(ν + 1)/2}

mean = 0, when ν > 1
variance = ν / (ν − 2), when ν > 2
Term | Description |
---|---|
Γ | gamma function |
ν | degrees of freedom |
π | Pi (~3.142) |
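The t density can be evaluated with math.gamma (`t_pdf` is an illustrative name):

```python
from math import gamma, pi, sqrt

def t_pdf(x, v):
    # f(x) = Gamma((v+1)/2) / (sqrt(v*pi) * Gamma(v/2))
    #        * (1 + x^2/v)^(-(v+1)/2)
    c = gamma((v + 1) / 2) / (sqrt(v * pi) * gamma(v / 2))
    return c * (1 + x * x / v) ** (-(v + 1) / 2)
```

With ν = 1 this is the Cauchy density, so f(0) = 1/π; the density is symmetric about 0 for every ν.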
The uniform distribution characterizes data over an interval uniformly, with a as the smallest value and b as the largest value.
The probability density function (PDF) is:

f(x) = 1 / (b − a), for a ≤ x ≤ b
Term | Description |
---|---|
a | lower endpoint |
b | upper endpoint |
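A minimal sketch of the uniform PDF, together with the standard mean and variance results (a + b)/2 and (b − a)^{2}/12 (helper names are illustrative):

```python
def uniform_pdf(x, a, b):
    # f(x) = 1 / (b - a) for a <= x <= b, and 0 elsewhere
    return 1.0 / (b - a) if a <= x <= b else 0.0

def uniform_mean(a, b):
    # Standard result: midpoint of the interval
    return (a + b) / 2

def uniform_var(a, b):
    # Standard result: (b - a)^2 / 12
    return (b - a) ** 2 / 12
```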
The Weibull distribution is useful for modeling product failure times.
The probability density function (PDF) is:

f(x) = (β/α) ((x − λ)/α)^{β − 1} exp(−((x − λ)/α)^{β}), for x > λ

The cumulative distribution function (CDF) is:

F(x) = 1 − exp(−((x − λ)/α)^{β}), for x ≥ λ
Term | Description |
---|---|
α | scale parameter |
β | shape parameter (when β = 1, the Weibull PDF is the same as the exponential PDF) |
λ | threshold parameter |
Γ | gamma function |
exp | base of the natural logarithm |
Some references use 1/α as a parameter.
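The three-parameter Weibull PDF and CDF above can be sketched as follows (function names are illustrative):

```python
from math import exp

def weibull_pdf(x, alpha, beta, lam=0.0):
    # f(x) = (beta/alpha) * ((x - lam)/alpha)^(beta - 1)
    #        * exp(-((x - lam)/alpha)^beta), for x > lam
    z = (x - lam) / alpha
    return (beta / alpha) * z ** (beta - 1) * exp(-z ** beta)

def weibull_cdf(x, alpha, beta, lam=0.0):
    # F(x) = 1 - exp(-((x - lam)/alpha)^beta), for x >= lam
    return 1.0 - exp(-((x - lam) / alpha) ** beta)
```

Setting β = 1 recovers the exponential distribution with scale θ = α, matching the note in the table above.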