Distribution Fitting Startup Panel and Quick Tab
Ribbon bar. Select the Statistics tab. In the Base group, click Distribution Fitting to display the Distribution Fitting Startup Panel.
Classic menus. From the Statistics menu, select Distribution Fitting to display the Distribution Fitting Startup Panel.
The Startup Panel contains one tab: Quick. Use this module to fit various distributions to your data.
See also the Distribution Fitting Index, Overviews, and Example.
A variable can be expected to follow the normal distribution when:
- There is a strong tendency for the variable to take a central value;
- Positive and negative deviations from this central value are equally likely;
- The frequency of deviations falls off rapidly as the deviations become larger.
As an underlying mechanism that produces the normal distribution, one may think of an infinite number of independent random (binomial) events that bring about the values of a particular variable. For example, there are probably a nearly infinite number of factors that determine a person's height (thousands of genes, nutrition, diseases, etc.). Thus, height can be expected to be normally distributed in the population.
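The following sketch (not part of the module; it assumes Python with NumPy available) illustrates this mechanism by summing many independent binary "factors" and checking that the totals cluster around a central value:

import numpy as np

rng = np.random.default_rng(0)
# 10,000 simulated cases, each the sum of 1,000 independent 0/1 "factors"
# (a purely illustrative stand-in for the many factors mentioned above).
totals = rng.binomial(n=1, p=0.5, size=(10_000, 1_000)).sum(axis=1)
print(totals.mean())  # close to 1000 * 0.5 = 500
print(totals.std())   # close to sqrt(1000 * 0.5 * 0.5), about 15.8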
The normal distribution function is determined by the following formula:
f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^{2}}, \quad -\infty < x < \infty
where μ is the mean, σ is the standard deviation, e is the base of the natural logarithm, sometimes called Euler's e (2.71...), and π is the constant Pi (3.14...).
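As an illustration only (assuming Python with NumPy and SciPy; the values μ = 10, σ = 2, and x = 12.5 are arbitrary), the formula above can be written out and checked against a library implementation:

import numpy as np
from scipy import stats

mu, sigma, x = 10.0, 2.0, 12.5
# The density formula above, written out directly.
by_hand = 1.0 / (np.sqrt(2.0 * np.pi) * sigma) * np.exp(-0.5 * ((x - mu) / sigma) ** 2)
print(np.isclose(by_hand, stats.norm.pdf(x, loc=mu, scale=sigma)))  # True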
The rectangular (uniform) distribution is defined as:
f(x) = \frac{1}{b-a}, \quad a < x < b
f(x) = 0 \quad \text{elsewhere}
where a < b are constants.
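For illustration (assuming Python with SciPy; the endpoints a = 2 and b = 5 are arbitrary), note that scipy.stats.uniform is parameterized by loc = a and scale = b − a:

from scipy import stats

a, b = 2.0, 5.0
dist = stats.uniform(loc=a, scale=b - a)  # uniform on [a, b]
print(dist.pdf(3.0))  # 1/(b - a) = 1/3 inside the interval
print(dist.pdf(6.0))  # 0 outside the interval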
The exponential distribution function is defined as:
f(x) = \lambda e^{-\lambda x}, \quad 0 \le x < \infty, \quad \lambda > 0
where λ (lambda) is the rate parameter of the exponential distribution (an alternative parameterization uses the scale parameter b = 1/λ), and e is the base of the natural logarithm, sometimes called Euler's e (2.71...).
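A brief sketch (assuming Python with NumPy and SciPy; λ = 0.5 and x = 3 are arbitrary) comparing the rate form above with SciPy's scale parameterization b = 1/λ:

import numpy as np
from scipy import stats

lam, x = 0.5, 3.0
by_hand = lam * np.exp(-lam * x)               # rate form, as above
library = stats.expon.pdf(x, scale=1.0 / lam)  # scale form, b = 1/lambda
print(np.isclose(by_hand, library))  # True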
Gamma. Select Gamma to fit the continuous distributions to your data as described here. The probability density function of the exponential distribution has a mode of zero. In many instances, it is known a priori that the mode of the distribution of a particular random variable of interest is not equal to zero (e.g., when modeling the distribution of the lifetimes of a product such as an electric light bulb, or the serving time taken at a ticket booth at a baseball game). In those cases, the Gamma distribution is more appropriate for describing the underlying distribution.
The Gamma distribution is defined as:
f(x) = \frac{1}{b\,\Gamma(c)} \left(\frac{x}{b}\right)^{c-1} e^{-x/b}, \quad 0 \le x, \quad c > 0
where Γ (Gamma) is the Gamma function, c is the shape parameter, b is the scale parameter, and e is the base of the natural logarithm, sometimes called Euler's e (2.71...).
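For illustration (assuming Python with NumPy and SciPy; c = 2.5, b = 1.5, and x = 3 are arbitrary), the density above can be checked against scipy.stats.gamma, whose shape argument is named a:

import numpy as np
from scipy import stats
from scipy.special import gamma as gamma_function

c, b, x = 2.5, 1.5, 3.0
by_hand = (1.0 / (b * gamma_function(c))) * (x / b) ** (c - 1) * np.exp(-x / b)
print(np.isclose(by_hand, stats.gamma.pdf(x, a=c, scale=b)))  # True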
The lognormal distribution is defined as:
f(x) = \frac{1}{x\,\sigma\sqrt{2\pi}}\, e^{-\frac{(\ln x - \mu)^{2}}{2\sigma^{2}}}, \quad 0 < x < \infty, \quad \sigma > 0
where μ is the location parameter, σ is the scale parameter, and e is the base of the natural logarithm, sometimes called Euler's e (2.71...).
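A small sketch (assuming Python with NumPy and SciPy; μ = 0.5, σ = 0.75, and x = 2 are arbitrary); scipy.stats.lognorm expects the shape s = σ and scale = exp(μ):

import numpy as np
from scipy import stats

mu, sigma, x = 0.5, 0.75, 2.0
by_hand = 1.0 / (x * sigma * np.sqrt(2.0 * np.pi)) * np.exp(-((np.log(x) - mu) ** 2) / (2.0 * sigma ** 2))
print(np.isclose(by_hand, stats.lognorm.pdf(x, s=sigma, scale=np.exp(mu))))  # True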
Chi-square. Select Chi-square to fit the continuous distributions to your data as described here. The sum of ν independent squared random variables, each following the standard normal distribution, is distributed as Chi-square with ν degrees of freedom. This distribution is most frequently used in modeling random variables (e.g., representing frequencies) in statistical applications.
The Chi-square distribution is defined by:
f(x) = \frac{1}{2^{\nu/2}\,\Gamma(\nu/2)}\, x^{(\nu/2)-1} e^{-x/2}, \quad \nu = 1, 2, \ldots, \quad 0 < x
where ν is the degrees of freedom, e is the base of the natural logarithm, sometimes called Euler's e (2.71...), and Γ (Gamma) is the Gamma function.
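As an illustration of the defining property above (assuming Python with NumPy and SciPy; ν = 5 and the sample size are arbitrary), a simulation of sums of squared standard normal variables:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
nu = 5
# Sum nu squared standard normal variables for each of 100,000 simulated cases.
sums = (rng.standard_normal(size=(100_000, nu)) ** 2).sum(axis=1)
print(sums.mean())             # close to nu
print(stats.chi2.mean(df=nu))  # exactly nu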
Discrete Distributions. Select this option button in order to fit a discrete distribution to your data. Select from the list of distributions, which are described in detail below.
The binomial distribution is defined as:
f(x) = \frac{n!}{x!\,(n-x)!}\, p^{x} q^{n-x}, \quad x = 0, 1, 2, \ldots, n
where p is the probability that the respective event will occur, q is equal to 1-p, and n is the maximum number of independent trials.
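For illustration (assuming Python with SciPy; n = 10, p = 0.3, and x = 4 are arbitrary), the binomial probability can be computed directly and compared with scipy.stats.binom:

from math import comb
from scipy import stats

n, p, x = 10, 0.3, 4
q = 1.0 - p
by_hand = comb(n, x) * p ** x * q ** (n - x)
print(abs(by_hand - stats.binom.pmf(x, n, p)) < 1e-12)  # True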
The Poisson distribution is defined as:
f(x) = \frac{\lambda^{x} e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots, \quad \lambda > 0
where λ (lambda) is the expected value of x (the mean), and e is the base of the natural logarithm, sometimes called Euler's e (2.71...).
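A short check (assuming Python with SciPy; λ = 3.2 and x = 5 are arbitrary) of the Poisson probability against scipy.stats.poisson:

import math
from scipy import stats

lam, x = 3.2, 5
by_hand = lam ** x * math.exp(-lam) / math.factorial(x)
print(abs(by_hand - stats.poisson.pmf(x, lam)) < 1e-12)  # True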
The geometric distribution is defined as:
f(x) = p\,(1-p)^{x-1}, \quad x = 1, 2, \ldots
where p is the probability that a particular event (e.g., success) will occur and x is the trial on which the first such event occurs.
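For illustration (assuming Python with SciPy; p = 0.25 is arbitrary), the probabilities above match scipy.stats.geom, which uses the same x = 1, 2, ... support:

from scipy import stats

p = 0.25
for x in (1, 2, 3):
    by_hand = p * (1.0 - p) ** (x - 1)
    print(abs(by_hand - stats.geom.pmf(x, p)) < 1e-12)  # True for each x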
The Bernoulli distribution is defined as:
f(x) = p^{x}\,(1-p)^{1-x}, \quad x \in \{0, 1\}
where p is the probability that a particular event (e.g., success) will occur.
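A minimal sketch (assuming Python with SciPy; p = 0.6 is arbitrary) of the two Bernoulli probabilities:

from scipy import stats

p = 0.6
print(stats.bernoulli.pmf(1, p))  # p, the probability of the event
print(stats.bernoulli.pmf(0, p))  # 1 - p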