Probability density function
A random variable X that is gamma-distributed with shape k and scale θ is denoted X ~ Gamma(k, θ), and its probability density function is

f(x; k, \theta) = x^{k-1} \, \frac{e^{-x/\theta}}{\theta^{k} \, \Gamma(k)} \quad \text{for } x > 0,

where k > 0 is the shape parameter and θ > 0 is the scale parameter of the gamma distribution.
Alternatively, the gamma distribution can be parameterized in terms of a shape parameter α = k and an inverse scale parameter β = 1/θ, called a rate parameter:

g(x; \alpha, \beta) = \beta^{\alpha} \, \frac{x^{\alpha-1} e^{-\beta x}}{\Gamma(\alpha)} \quad \text{for } x > 0.
Both parameterizations are common because they are convenient to use in certain situations and fields.
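To make the correspondence concrete, here is a minimal sketch (plain Python; the helper names are made up) that evaluates the density in both parameterizations and shows they agree when β = 1/θ:

```python
from math import gamma as gamma_fn, exp

def gamma_pdf_scale(x, k, theta):
    """Density in the shape/scale (k, theta) parameterization."""
    return x ** (k - 1) * exp(-x / theta) / (theta ** k * gamma_fn(k))

def gamma_pdf_rate(x, alpha, beta):
    """Density in the shape/rate (alpha, beta) parameterization, beta = 1/theta."""
    return beta ** alpha * x ** (alpha - 1) * exp(-beta * x) / gamma_fn(alpha)

# The two forms agree when beta = 1/theta:
print(gamma_pdf_scale(2.5, k=3.0, theta=2.0))
print(gamma_pdf_rate(2.5, alpha=3.0, beta=0.5))
```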
The information entropy is given by

H(X) = k + \ln\theta + \ln\Gamma(k) + (1 - k)\,\psi(k),

where ψ(k) is the digamma function.
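As a sanity check on the entropy formula, the sketch below (stdlib Python; variable names are made up) compares the closed form against direct numerical integration of −f ln f for k = 3, θ = 2, using the fact that at integer k the digamma function reduces to ψ(k) = −γ + Σ_{j=1}^{k−1} 1/j:

```python
from math import exp, log, gamma as gamma_fn

EULER_GAMMA = 0.5772156649015329
k, theta = 3.0, 2.0

# At integer k the digamma function has the closed form
# psi(k) = -EULER_GAMMA + sum_{j=1}^{k-1} 1/j.
psi_k = -EULER_GAMMA + sum(1.0 / j for j in range(1, int(k)))
entropy_closed = k + log(theta) + log(gamma_fn(k)) + (1.0 - k) * psi_k

def pdf(x):
    return x ** (k - 1) * exp(-x / theta) / (theta ** k * gamma_fn(k))

# Numerically integrate -f(x) ln f(x) over (0, 80] with the trapezoid rule;
# the integrand vanishes at both ends, so the truncation error is negligible.
h = 1e-3
vals = [-pdf(h * i) * log(pdf(h * i)) for i in range(1, 80001)]
entropy_numeric = h * (sum(vals) - 0.5 * vals[-1])
print(entropy_closed, entropy_numeric)
```

The two values should match to several decimal places.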
If X_i ~ Gamma(k_i, θ) for i = 1, …, N and the X_i are independent, then

\sum_{i=1}^{N} X_i \sim \mathrm{Gamma}\!\left(\sum_{i=1}^{N} k_i,\; \theta\right).
If X ~ Gamma(k, θ), then for any c > 0 it holds that cX ~ Gamma(k, cθ). That is the meaning of θ (or β) being the scale parameter.
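The scale property can be verified directly at the density level: by change of variables, cX has density (1/c)·f_X(y/c), which should coincide with the Gamma(k, cθ) density. A minimal sketch (arbitrary parameter values; the helper name is made up):

```python
from math import exp, gamma as gamma_fn

def gamma_pdf(x, k, theta):
    """Gamma density in the shape/scale parameterization."""
    return x ** (k - 1) * exp(-x / theta) / (theta ** k * gamma_fn(k))

k, theta, c = 2.5, 1.5, 3.0
for y in (0.5, 2.0, 7.5):
    lhs = gamma_pdf(y, k, c * theta)      # claimed Gamma(k, c*theta) density
    rhs = gamma_pdf(y / c, k, theta) / c  # density of cX by change of variables
    print(lhs, rhs)
```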
Parameter estimation
Given N independent and identically distributed observations (x_1, …, x_N), the likelihood function is

L(k, \theta) = \prod_{i=1}^{N} f(x_i; k, \theta),

from which we calculate the log-likelihood function

\ell(k, \theta) = (k-1) \sum_{i=1}^{N} \ln x_i - \sum_{i=1}^{N} \frac{x_i}{\theta} - Nk \ln\theta - N \ln\Gamma(k).
Finding the maximum with respect to θ by taking the derivative and setting it equal to zero yields the maximum likelihood estimate of the θ parameter:

\hat{\theta} = \frac{1}{kN} \sum_{i=1}^{N} x_i.
Substituting this into the log-likelihood function gives

\ell = (k-1) \sum_{i=1}^{N} \ln x_i - Nk - Nk \ln\!\left(\frac{\sum_{i} x_i}{kN}\right) - N \ln\Gamma(k).
Finding the maximum with respect to k by taking the derivative and setting it equal to zero yields

\ln k - \psi(k) = \ln\!\left(\frac{1}{N}\sum_{i=1}^{N} x_i\right) - \frac{1}{N}\sum_{i=1}^{N} \ln x_i,

where ψ(k) is the digamma function.
There is no closed-form solution for k. The function is numerically very well behaved, so if a numerical solution is desired, it can be found using Newton's method. An initial value of k can be found either using the method of moments, or using the approximation

\ln k - \psi(k) \approx \frac{1}{2k}\left(1 + \frac{1}{6k+1}\right).

If we let

s = \ln\!\left(\frac{1}{N}\sum_{i=1}^{N} x_i\right) - \frac{1}{N}\sum_{i=1}^{N} \ln x_i,

then k is approximately

k \approx \frac{3 - s + \sqrt{(s-3)^2 + 24s}}{12s},

which is within 1.5% of the correct value.
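Putting these pieces together gives a simple estimator: compute s from the data, apply the closed-form approximation for k, then recover θ from the first maximization. A minimal sketch (stdlib Python; `fit_gamma_mle` is a hypothetical name, the Newton refinement is omitted, and the test data are simulated with `random.gammavariate`):

```python
import random
from math import log, sqrt

def fit_gamma_mle(xs):
    """Approximate gamma MLE: the closed-form initializer for k from the
    approximation above, then theta = mean(xs) / k."""
    n = len(xs)
    s = log(sum(xs) / n) - sum(log(x) for x in xs) / n
    k = (3.0 - s + sqrt((s - 3.0) ** 2 + 24.0 * s)) / (12.0 * s)
    theta = sum(xs) / (k * n)
    return k, theta

random.seed(0)
data = [random.gammavariate(2.0, 3.0) for _ in range(20000)]  # true k=2, theta=3
print(fit_gamma_mle(data))
```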
Generating Gamma random variables
Given the scaling property above, it is enough to generate gamma variables with β = 1, as we can later convert to any value of β with simple division.
Using the fact that a Gamma(1, 1) distribution is the same as an Exp(1) distribution, and the method of generating exponential variables, we conclude that if U is uniformly distributed on (0, 1], then −ln(U) is distributed Gamma(1, 1). Now, using the "α-addition" property of the gamma distribution, we expand this result:

\sum_{i=1}^{n} -\ln U_i \sim \mathrm{Gamma}(n, 1),

where U_i are all uniformly distributed on (0, 1] and independent.
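For integer shape n this already gives a complete sampler. A minimal sketch (stdlib Python; the function name is made up; `1 - random()` maps the half-open [0, 1) of `random.random` onto (0, 1]):

```python
import random
from math import log

def gamma_integer_shape(n, rng=random):
    """Gamma(n, 1) for integer n >= 1 as a sum of n independent
    Exp(1) draws, each generated as -ln(U) with U uniform on (0, 1]."""
    return sum(-log(1.0 - rng.random()) for _ in range(n))

random.seed(1)
draws = [gamma_integer_shape(3) for _ in range(20000)]
print(sum(draws) / len(draws))  # sample mean should be close to n = 3
```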
All that is left now is to generate a variable distributed as Gamma(δ, 1) for 0 < δ < 1 and apply the "α-addition" property once more. This is the most difficult part, however.
We provide an algorithm without proof. It is an instance of the acceptance-rejection method:
1. Let m be 1.
2. Generate V_{3m−2}, V_{3m−1} and V_{3m} — independent uniformly distributed on (0, 1] variables.
3. If V_{3m−2} ≤ v_0, where v_0 = e/(e + δ), then go to step 4, else go to step 5.
4. Let ξ_m = V_{3m−1}^{1/δ} and η_m = V_{3m} ξ_m^{δ−1}. Go to step 6.
5. Let ξ_m = 1 − ln(V_{3m−1}) and η_m = V_{3m} e^{−ξ_m}.
6. If η_m > ξ_m^{δ−1} e^{−ξ_m}, then increment m and go to step 2.
7. Assume ξ = ξ_m to be the realization of Gamma(δ, 1).
Now, to summarize,

\theta \left( \xi - \sum_{i=1}^{\lfloor \alpha \rfloor} \ln U_i \right) \sim \mathrm{Gamma}(\alpha, \theta),

where ⌊α⌋ is the integral part of α, ξ has been generated using the algorithm above with δ = {α} (the fractional part of α), and U_i are distributed as explained above and are all independent.
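The whole recipe — rejection sampling for the fractional part, summed exponentials for the integer part, then scaling — can be sketched as follows (stdlib Python only; the function names are made up, and this follows the steps above rather than any production implementation):

```python
import random
from math import e, exp, log

def gamma_fractional(delta, rng=random):
    """Acceptance-rejection (steps 1-7 above) for Gamma(delta, 1), 0 < delta < 1."""
    v0 = e / (e + delta)
    while True:
        v1, v2, v3 = (1.0 - rng.random() for _ in range(3))  # uniforms on (0, 1]
        if v1 <= v0:
            xi = v2 ** (1.0 / delta)
            eta = v3 * xi ** (delta - 1.0)
        else:
            xi = 1.0 - log(v2)
            eta = v3 * exp(-xi)
        if eta <= xi ** (delta - 1.0) * exp(-xi):  # accept, else retry
            return xi

def gamma_variate(alpha, theta=1.0, rng=random):
    """Gamma(alpha, theta): fractional part of alpha by rejection, integer
    part as a sum of exponentials, then scale by theta."""
    n = int(alpha)
    delta = alpha - n
    xi = gamma_fractional(delta, rng) if delta > 0.0 else 0.0
    return theta * (xi + sum(-log(1.0 - rng.random()) for _ in range(n)))
```

For example, `gamma_variate(2.5, 2.0)` draws from a distribution with mean αθ = 5.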
Related distributions

- X ~ Exp(λ) is an exponential distribution if X ~ Gamma(1, 1/λ).
- cX ~ Gamma(k, cθ) if X ~ Gamma(k, θ), for any c > 0.
- Y = Σ_{i=1}^{N} X_i is a gamma distribution Gamma(Σ_i k_i, θ) if X_i ~ Gamma(k_i, θ) and the X_i are all independent and share the same scale parameter θ.
- X ~ χ²(ν) is a chi-square distribution if X ~ Gamma(ν/2, 2).
- If k is an integer, the gamma distribution is an Erlang distribution (so named in honor of A. K. Erlang) and is the probability distribution of the waiting time until the k-th "arrival" in a one-dimensional Poisson process with intensity 1/θ.
- If X ~ Gamma(k, θ), then 1/X ~ Inv-Gamma(k, 1/θ), where Inv-Gamma is the inverse-gamma distribution.
- X/(X + Y) ~ Beta(α, β) is a beta distribution if X ~ Gamma(α, θ) and Y ~ Gamma(β, θ), and X and Y are independent.
- If X ~ Maxwell-Boltzmann(a), then X² ~ Gamma(3/2, 2a²).
- Gamma(k, θ) approaches a normal distribution with mean μ = kθ and variance σ² = kθ² as k → ∞.
- The real vector (X_1/S, …, X_N/S), with S = Σ_{i=1}^{N} X_i, follows a Dirichlet distribution if the X_i ~ Gamma(α_i, θ) are independent.
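Several of these identities can be verified directly at the density level. For instance, the chi-square case amounts to the two densities below coinciding (a minimal sketch; the helper names are made up):

```python
from math import exp, gamma as gamma_fn

def gamma_pdf(x, k, theta):
    """Gamma density in the shape/scale parameterization."""
    return x ** (k - 1) * exp(-x / theta) / (theta ** k * gamma_fn(k))

def chi2_pdf(x, nu):
    """Chi-square density with nu degrees of freedom, written directly
    from its usual definition."""
    return x ** (nu / 2 - 1) * exp(-x / 2) / (2 ** (nu / 2) * gamma_fn(nu / 2))

# A chi-square with nu degrees of freedom is Gamma(nu/2, 2):
for x in (0.5, 2.0, 5.0):
    print(chi2_pdf(x, nu=4), gamma_pdf(x, k=2.0, theta=2.0))
```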
This page uses Creative Commons Licensed content from Wikipedia (view authors).