# Logistic distribution

*(Plots: the standard logistic PDF and CDF.)*

| | |
|---|---|
| **Parameters** | $\mu$ location (real); $s > 0$ scale (real) |
| **Support** | $x \in (-\infty, +\infty)$ |
| **pdf** | $\frac{e^{-(x-\mu)/s}}{s\left(1+e^{-(x-\mu)/s}\right)^2}$ |
| **cdf** | $\frac{1}{1+e^{-(x-\mu)/s}}$ |
| **Mean** | $\mu$ |
| **Median** | $\mu$ |
| **Mode** | $\mu$ |
| **Variance** | $\frac{\pi^2}{3} s^2$ |
| **Skewness** | $0$ |
| **Excess kurtosis** | $6/5$ |
| **Entropy** | $\ln(s) + 2$ |
| **mgf** | $e^{\mu t}\,\mathrm{B}(1-st,\,1+st)$ for $\|st\|<1$, where $\mathrm{B}$ is the Beta function |
| **Char. func.** | $e^{it\mu}\,\frac{\pi s t}{\sinh(\pi s t)}$ |

In probability theory and statistics, the logistic distribution is a continuous probability distribution. Its cumulative distribution function is the logistic function, which appears in logistic regression and feedforward neural networks. It resembles the normal distribution in shape but has heavier tails (higher kurtosis).

## Specification

### Cumulative distribution function

The logistic distribution receives its name from its cumulative distribution function (cdf), which is an instance of the family of logistic functions:

$F(x; \mu,s) = \frac{1}{1+e^{-(x-\mu)/s}} \!$
$= \frac12 + \frac12 \;\operatorname{tanh}\!\left(\frac{x-\mu}{2\,s}\right).$

In this equation, x is the value of the random variable, μ is the mean, and s > 0 is a scale parameter proportional to the standard deviation ($\sigma = \pi s/\sqrt{3}$).
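As an illustrative sketch (the function names below are ours, not from any particular library), the cdf and its equivalent tanh form can be evaluated and checked against each other:

```python
import math

def logistic_cdf(x, mu=0.0, s=1.0):
    """F(x; mu, s) = 1 / (1 + exp(-(x - mu)/s))."""
    return 1.0 / (1.0 + math.exp(-(x - mu) / s))

def logistic_cdf_tanh(x, mu=0.0, s=1.0):
    """Equivalent form: 1/2 + 1/2 * tanh((x - mu) / (2 s))."""
    return 0.5 + 0.5 * math.tanh((x - mu) / (2.0 * s))
```

At $x = \mu$ both forms give exactly 1/2, as the symmetry of the distribution requires.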

### Probability density function

The probability density function (pdf) of the logistic distribution is given by:

$f(x; \mu,s) = \frac{e^{-(x-\mu)/s}} {s\left(1+e^{-(x-\mu)/s}\right)^2} \!$
$=\frac{1}{4\,s} \;\operatorname{sech}^2\!\left(\frac{x-\mu}{2\,s}\right).$

Because the pdf can be expressed in terms of the square of the hyperbolic secant function "sech", it is sometimes referred to as the sech-square(d) distribution.[1]
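The two pdf forms above are algebraically identical, which a small sketch (our own helper functions, not a library API) can confirm numerically:

```python
import math

def logistic_pdf(x, mu=0.0, s=1.0):
    """Exponential form of the logistic pdf."""
    z = math.exp(-(x - mu) / s)
    return z / (s * (1.0 + z) ** 2)

def logistic_pdf_sech(x, mu=0.0, s=1.0):
    """Equivalent sech-squared form: (1/(4 s)) * sech((x - mu)/(2 s))**2."""
    sech = 1.0 / math.cosh((x - mu) / (2.0 * s))
    return sech ** 2 / (4.0 * s)
```

At the mode $x = \mu$ the standard ($\mu = 0$, $s = 1$) density equals $1/4$.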

### Quantile function

The inverse cumulative distribution function (quantile function) of the logistic distribution, $F^{-1}$, is a generalization of the logit function, defined as follows:

$F^{-1}(p; \mu,s) = \mu + s\,\ln\left(\frac{p}{1-p}\right).$
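Because the quantile function has a closed form, inverse-transform sampling is straightforward. A minimal sketch (function names are ours):

```python
import math
import random

def logistic_quantile(p, mu=0.0, s=1.0):
    """F^{-1}(p; mu, s) = mu + s * ln(p / (1 - p)) for 0 < p < 1."""
    return mu + s * math.log(p / (1.0 - p))

def sample_logistic(mu=0.0, s=1.0, rng=random):
    """Draw one Logistic(mu, s) variate by inverting the cdf at a uniform draw."""
    return logistic_quantile(rng.random(), mu, s)
```

The median is recovered at $p = 1/2$, and applying the cdf to a quantile returns the original probability.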

## Alternative parameterization

An alternative parameterization of the logistic distribution, in terms of the variance $\sigma^2$, can be derived using the substitution $\sigma^2 = \pi^2\,s^2/3$. This yields the following density function:

$g(x;\mu,\sigma) = f(x;\mu,\sigma\sqrt{3}/\pi) = \frac{\pi}{\sigma\,4\sqrt{3}} \,\operatorname{sech}^2\!\left(\frac{\pi}{2 \sqrt{3}} \,\frac{x-\mu}{\sigma}\right).$
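A short sketch (our own function names) of this variance parameterization, checked against the standard pdf with $s = \sigma\sqrt{3}/\pi$:

```python
import math

def logistic_pdf(x, mu=0.0, s=1.0):
    """Standard parameterization f(x; mu, s)."""
    z = math.exp(-(x - mu) / s)
    return z / (s * (1.0 + z) ** 2)

def logistic_pdf_sigma(x, mu=0.0, sigma=1.0):
    """Variance parameterization g(x; mu, sigma) = f(x; mu, sigma*sqrt(3)/pi)."""
    c = math.pi / (2.0 * math.sqrt(3.0))
    sech2 = 1.0 / math.cosh(c * (x - mu) / sigma) ** 2
    return math.pi / (4.0 * math.sqrt(3.0) * sigma) * sech2
```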

## Applications

The logistic distribution and the S-shaped pattern that results from it have been extensively used in many different areas, including:

• Biology – to describe how species populations grow in competition[2][3]
• Epidemiology – to describe the spreading of epidemics[4]
• Psychology – to describe learning[5]
• Technology – to describe how new technologies diffuse and substitute for each other[6]
• Marketing – the diffusion of new-product sales[7]
• Energy – the diffusion and substitution of primary energy sources,[8] as in the Hubbert curve
• Hydrology – the distribution of long-duration river discharge and rainfall (e.g. monthly and yearly totals, consisting of sums of roughly 30 and 360 daily values respectively) is often assumed to be almost normal, by appeal to the central limit theorem.[9] The normal cdf, however, requires numerical approximation, whereas the logistic cdf can be evaluated analytically; because the two distributions are similar, the logistic distribution is sometimes used instead. One example fits the logistic distribution to ranked October rainfalls, which are almost normally distributed, with a 90% confidence belt based on the binomial distribution; the rainfall data are represented by plotting positions as part of a cumulative frequency analysis.
• Physics – the cdf of this distribution describes a Fermi gas; more specifically, the expected number of electrons in a metal occupying a given quantum state. Its range is between 0 and 1, reflecting the Pauli exclusion principle. The value is a function of the kinetic energy of that state and is parametrized by the Fermi energy and the temperature (via the Boltzmann constant). Changing the sign in front of the "1" in the denominator takes one from Fermi–Dirac statistics to Bose–Einstein statistics; in that case the expected number of particles (bosons) in a given state can exceed unity, as in systems such as lasers.

Both the United States Chess Federation and FIDE have switched their formulas for calculating chess ratings from the normal distribution to the logistic distribution; see Elo rating system.

## Related distributions

• If $X \sim \textrm{Logistic}(\mu,\beta)\,$ then $k X + c \sim \textrm{Logistic}(k \mu + c,\,k \beta)\,$ for $k > 0$ and real $c$; the family is closed under positive scaling and shifts
• The logistic distribution resembles the hyperbolic secant distribution in shape (its density is a squared hyperbolic secant)
• If $X \sim U(0,1)\,$ (Uniform distribution (continuous)) then $\mu + \beta \left(\log{(X)} - \log{(1-X)} \right) \sim \textrm{Logistic}(\mu,\beta)\,$
• If $X \sim \mathrm{Exponential}(1)\,$ (Exponential distribution) then $\mu-\beta\log{\tfrac{e^{-X}}{1-e^{-X}}} \sim \mathrm{Logistic}(\mu,\beta)$
• If $X \sim \mathrm{Exponential}(1)\,$ and $Y \sim \mathrm{Exponential}(1)\,$ then $\mu-\beta\log{\tfrac{X}{Y}} \sim \mathrm{Logistic}(\mu,\beta)$
• If $X \sim \mathrm{Gumbel}(\alpha,\beta)\,$ and $Y \sim \mathrm{Gumbel}(\alpha,\beta)\,$ (Gumbel distribution) then $X-Y \sim \mathrm{Logistic}(0,\beta) \,$
• If $X \sim \mathrm{GEV}(\alpha,\beta,0)\,$ and $Y \sim \mathrm{GEV}(\alpha,\beta,0)\,$ (Generalized extreme value distribution) then $X-Y \sim \mathrm{Logistic}(0,\beta) \,$
• If $X \sim \mathrm{Gumbel}(\alpha,\beta)\,$ and $Y \sim \mathrm{GEV}(\alpha,\beta,0)\,$ then $X+Y \sim \mathrm{Logistic}(2 \alpha,\beta) \,$
• If $\log{X} \sim \textrm{Logistic}\,$ then $X \sim \textrm{LogLogistic}\,$ (log-logistic distribution) and $X-a \sim \textrm{ShiftedLogLogistic}\,$ (shifted log-logistic distribution)
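The uniform-to-logistic relation in the list above is exactly the quantile transform, so it can be verified deterministically: applying the logistic cdf to the transformed value must return the original uniform draw. A sketch with our own function names:

```python
import math

def from_uniform(u, mu=0.0, beta=1.0):
    """Map a Uniform(0,1) value u to a Logistic(mu, beta) value."""
    return mu + beta * (math.log(u) - math.log(1.0 - u))

def logistic_cdf(x, mu=0.0, beta=1.0):
    """Logistic cdf, used here to invert the transform."""
    return 1.0 / (1.0 + math.exp(-(x - mu) / beta))
```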

## Derivations

### Higher order moments

The n-th order central moment can be expressed in terms of the quantile function:

\begin{align} \operatorname{E}[(X-\mu)^n] &= \int_{-\infty}^\infty (x-\mu)^n dF(x) = \int_0^1 \big(F^{-1}(p)-\mu\big)^n dp \\ &= s^n \int_0^1 \Big[ \ln\!\Big(\frac{p}{1-p}\Big) \Big]^n \, dp. \end{align}

This integral is well-known[10] and can be expressed in terms of Bernoulli numbers:

$\operatorname{E}[(X-\mu)^n] = s^n\pi^n(2^n-2)\cdot|B_n|.$
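This closed form can be checked against the known variance and excess kurtosis. The sketch below (our own helper, with the two needed Bernoulli numbers hard-coded) returns zero for odd $n$, since $2^1 - 2 = 0$ and $B_n = 0$ for odd $n \ge 3$:

```python
import math
from fractions import Fraction

# Known Bernoulli numbers |B_n| for the even orders used here.
BERNOULLI_ABS = {2: Fraction(1, 6), 4: Fraction(1, 30)}

def central_moment(n, s=1.0):
    """E[(X - mu)^n] = s^n * pi^n * (2^n - 2) * |B_n|; zero for odd n."""
    if n % 2 == 1:
        return 0.0
    return (s ** n) * (math.pi ** n) * (2 ** n - 2) * float(BERNOULLI_ABS[n])
```

For $n = 2$ this recovers the variance $\pi^2 s^2 / 3$, and the ratio $\mu_4/\mu_2^2 - 3$ recovers the excess kurtosis $6/5$.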

## Notes

1. Johnson, Kotz & Balakrishnan (1995, p.116).
2. P. F. Verhulst (1845) "Recherches mathématiques sur la loi d'accroissement de la population", Nouveaux Mémoires de l'Académie Royale des Sciences et des Belles-Lettres de Bruxelles, vol. 18.
3. Lotka, Alfred J. (1925) Elements of Physical Biology, Baltimore, MD: Williams & Wilkins Co..
4. Modis (1992, pp. 97–105)
5. Modis (1992, Chapter 2)
6. J. C. Fisher and R. H. Pry (1971) "A Simple Substitution Model of Technological Change", Technological Forecasting & Social Change, vol. 3, no. 1.
7. Modis, Theodore (1998), Conquering Uncertainty, McGraw-Hill, New York (Chapter 1)
8. Cesare Marchetti (1977) "Primary Energy Substitution Models: On the Interaction between Energy and Society", Technological Forecasting & Social Change, vol. 10.
9. Ritzema, H.P. (ed.) (1994). Frequency and Regression Analysis, pp. 175–224, Chapter 6 in: Drainage Principles and Applications, Publication 16, International Institute for Land Reclamation and Improvement (ILRI), Wageningen, The Netherlands.
10. [1]

## References

• deCani, John S. and Stine, Robert A. (1986). "A note on deriving the information matrix for a logistic distribution". The American Statistician 40: 220–222.
• Balakrishnan, N. (1992). Handbook of the Logistic Distribution. Marcel Dekker, New York.
• Johnson, N. L., Kotz, S., Balakrishnan, N. (1995). Continuous Univariate Distributions, Vol. 2, 2nd ed.
• Modis, Theodore (1992). Predictions: Society's Telltale Signature Reveals the Past and Forecasts the Future. Simon & Schuster, New York. ISBN 0671759175.