

{{StatsPsy}}

{{Distinguish2|[[F-statistics]] as used in [[population genetics]]}}

{{Probability distribution
 | name       = Fisher–Snedecor
 | type       = density
 | pdf_image  = [[Image:F distributionPDF.png|325px]]
 | cdf_image  = [[Image:F distributionCDF.png|325px]]
 | parameters = ''d''<sub>1</sub>, ''d''<sub>2</sub> > 0 deg. of freedom
 | support    = ''x'' ∈ [0, +∞)
 | pdf        = <math>\frac{\sqrt{\frac{(d_1\,x)^{d_1}\,\,d_2^{d_2}}{(d_1\,x+d_2)^{d_1+d_2}}}}{x\,\mathrm{B}\!\left(\frac{d_1}{2},\frac{d_2}{2}\right)}\!</math>
 | cdf        = <math>I_{\frac{d_1 x}{d_1 x + d_2}}\left(\tfrac{d_1}{2}, \tfrac{d_2}{2}\right)</math>
 | mean       = <math>\frac{d_2}{d_2-2}\!</math><br /> for ''d''<sub>2</sub> > 2
 | median     =
 | mode       = <math>\frac{d_1-2}{d_1}\;\frac{d_2}{d_2+2}\!</math><br /> for ''d''<sub>1</sub> > 2
 | variance   = <math>\frac{2\,d_2^2\,(d_1+d_2-2)}{d_1 (d_2-2)^2 (d_2-4)}\!</math><br /> for ''d''<sub>2</sub> > 4
 | skewness   = <math>\frac{(2 d_1 + d_2 - 2) \sqrt{8 (d_2-4)}}{(d_2-6) \sqrt{d_1 (d_1 + d_2 - 2)}}\!</math><br /> for ''d''<sub>2</sub> > 6
 | kurtosis   = ''see text''
 | entropy    =
 | mgf        = ''does not exist, raw moments defined in text and in <ref name=johnson /><ref name=abramowitz />''
 | char       = ''see text''
}}
In [[probability theory]] and [[statistics]], the '''''F''-distribution''' is a [[continuous probability distribution|continuous]] [[probability distribution]].<ref name=johnson>{{cite book | last = Johnson | first = Norman Lloyd | coauthors = Samuel Kotz, N. Balakrishnan | title = Continuous Univariate Distributions, Volume 2 (Second Edition, Section 27) | publisher = Wiley | year = 1995 | isbn = 0471584940}}</ref><ref name=abramowitz>{{Abramowitz_Stegun_ref|26|946}}</ref><ref>NIST (2006). [http://www.itl.nist.gov/div898/handbook/eda/section3/eda3665.htm Engineering Statistics Handbook – F Distribution]</ref><ref>{{cite book | last = Mood | first = Alexander | coauthors = Franklin A. Graybill, Duane C. Boes | title = Introduction to the Theory of Statistics (Third Edition, pp. 246–249) | publisher = McGraw-Hill | year = 1974 | isbn = 0070428646}}</ref> It is also known as '''Snedecor's ''F'' distribution''' or the '''Fisher–Snedecor distribution''' (after [[Ronald Fisher|R.A. Fisher]] and [[George W. Snedecor]]). The ''F''-distribution arises frequently as the [[null distribution]] of a [[test statistic]], most notably in the [[analysis of variance]]; see [[F-test]].




==Definition==

If a [[random variable]] ''X'' has an ''F''-distribution with parameters ''d''<sub>1</sub> and ''d''<sub>2</sub>, we write ''X'' ~ F(''d''<sub>1</sub>, ''d''<sub>2</sub>). Then the [[probability density function]] for ''X'' is given by




:<math>\begin{align}
f(x; d_1,d_2) &= \frac{\sqrt{\frac{(d_1\,x)^{d_1}\,\,d_2^{d_2}}{(d_1\,x+d_2)^{d_1+d_2}}}}{x\,\mathrm{B}\!\left(\frac{d_1}{2},\frac{d_2}{2}\right)} \\
&= \frac{1}{\mathrm{B}\!\left(\frac{d_1}{2},\frac{d_2}{2}\right)} \left(\frac{d_1}{d_2}\right)^{\frac{d_1}{2}} x^{\frac{d_1}{2} - 1} \left(1+\frac{d_1}{d_2}\,x\right)^{-\frac{d_1+d_2}{2}}
\end{align}</math>




for [[real number|real]] ''x'' ≥ 0. Here <math>\mathrm{B}</math> is the [[beta function]]. In many applications, the parameters ''d''<sub>1</sub> and ''d''<sub>2</sub> are [[positive integer]]s, but the distribution is well-defined for positive real values of these parameters.
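As a minimal illustrative sketch (the helper name <code>f_pdf</code> and the parameter values are our own, not part of any standard library), the second form of the density above can be evaluated with Python's standard library, using <code>math.lgamma</code> for a numerically stable beta function; for ''d''<sub>1</sub> = ''d''<sub>2</sub> = 1 the density reduces to the elementary form 1/(π√''x''(1+''x'')), which provides a check:

```python
from math import exp, lgamma, log, pi, sqrt

def f_pdf(x, d1, d2):
    """Density of F(d1, d2), using the second form of f(x; d1, d2) above,
    evaluated in log space via lgamma for numerical stability."""
    if x <= 0:
        return 0.0
    # log B(d1/2, d2/2) = lgamma(d1/2) + lgamma(d2/2) - lgamma((d1+d2)/2)
    log_beta = lgamma(d1 / 2) + lgamma(d2 / 2) - lgamma((d1 + d2) / 2)
    return exp((d1 / 2) * log(d1 / d2) + (d1 / 2 - 1) * log(x)
               - ((d1 + d2) / 2) * log(1 + d1 * x / d2) - log_beta)

# For d1 = d2 = 1 the density has the elementary form 1/(pi*sqrt(x)*(1+x)):
print(f_pdf(1.0, 1, 1))                 # ≈ 0.15915 = 1/(2*pi)
print(1 / (pi * sqrt(1.0) * (1 + 1)))   # same value from the closed form
```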




The [[cumulative distribution function]] is




:<math>F(x; d_1,d_2)=I_{\frac{d_1 x}{d_1 x + d_2}}\left(\tfrac{d_1}{2}, \tfrac{d_2}{2}\right),</math>




where ''I'' is the [[regularized incomplete beta function]].
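A hedged numerical sketch (helper names are ours; the degrees of freedom are chosen so the incomplete beta has an elementary value): for ''d''<sub>1</sub> = ''d''<sub>2</sub> = 2 the CDF formula reduces to ''I''<sub>''x''/(1+''x'')</sub>(1, 1) = ''x''/(1+''x''), so numerically integrating the density should reproduce exactly that:

```python
from math import exp, lgamma, log

def f_pdf(x, d1, d2):
    """Density of F(d1, d2) (see the Definition section)."""
    if x <= 0:
        return 0.0
    log_beta = lgamma(d1 / 2) + lgamma(d2 / 2) - lgamma((d1 + d2) / 2)
    return exp((d1 / 2) * log(d1 / d2) + (d1 / 2 - 1) * log(x)
               - ((d1 + d2) / 2) * log(1 + d1 * x / d2) - log_beta)

def f_cdf_numeric(x, d1, d2, n=20_000):
    """Approximate the CDF by midpoint-rule integration of the density."""
    h = x / n
    return h * sum(f_pdf((i + 0.5) * h, d1, d2) for i in range(n))

# For d1 = d2 = 2 the regularized incomplete beta is elementary:
# I_{x/(1+x)}(1, 1) = x/(1+x), so the CDF is x/(1+x) exactly.
x = 3.0
print(f_cdf_numeric(x, 2, 2))  # ≈ 0.75
print(x / (1 + x))             # 0.75
```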




The expectation, variance, and other details about the F(''d''<sub>1</sub>, ''d''<sub>2</sub>) distribution are given in the sidebox; for ''d''<sub>2</sub> > 8, the [[excess kurtosis]] is




:<math>\gamma_2 = 12\frac{d_1(5d_2-22)(d_1+d_2-2)+(d_2-4)(d_2-2)^2}{d_1(d_2-6)(d_2-8)(d_1+d_2-2)}</math>.




The ''k''-th moment of an F(''d''<sub>1</sub>, ''d''<sub>2</sub>) distribution exists and is finite only when 2''k'' < ''d''<sub>2</sub> and it is equal to<ref name=taboga>{{cite web | last1 = Taboga | first1 = Marco | url = http://www.statlect.com/F_distribution.htm | title = The F distribution}}</ref>




:<math>\mu_X(k) = \left(\frac{d_2}{d_1}\right)^k \frac{\Gamma\left(\tfrac{d_1}{2}+k\right)}{\Gamma\left(\tfrac{d_1}{2}\right)}\, \frac{\Gamma\left(\tfrac{d_2}{2}-k\right)}{\Gamma\left(\tfrac{d_2}{2}\right)}</math>
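This Gamma-ratio formula can be cross-checked against the mean and variance in the sidebox; a minimal Python sketch (the helper name <code>f_moment</code> and the parameter values are illustrative choices of ours):

```python
from math import exp, lgamma, log

def f_moment(k, d1, d2):
    """k-th raw moment of F(d1, d2) from the Gamma-ratio formula above;
    it exists only when 2k < d2."""
    if 2 * k >= d2:
        raise ValueError("k-th moment exists only when 2k < d2")
    return exp(k * log(d2 / d1)
               + lgamma(d1 / 2 + k) - lgamma(d1 / 2)
               + lgamma(d2 / 2 - k) - lgamma(d2 / 2))

# Cross-check against the sidebox for d1 = 5, d2 = 10:
# mean = d2/(d2-2), variance = 2*d2^2*(d1+d2-2) / (d1*(d2-2)^2*(d2-4)).
d1, d2 = 5, 10
print(f_moment(1, d1, d2))                            # 1.25 = 10/8
print(f_moment(2, d1, d2) - f_moment(1, d1, d2)**2)   # ≈ 1.3542 = 2600/1920
```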




The ''F''-distribution is a particular parametrization of the [[beta prime distribution]], which is also called the beta distribution of the second kind.




The [[Characteristic function (probability theory)|characteristic function]] is listed incorrectly in many standard references (e.g., <ref name=abramowitz />). The correct expression<ref>Phillips, P. C. B. (1982) "The true characteristic function of the F distribution," ''[[Biometrika]]'', 69: 261–264 {{jstor|2335882}}</ref> is




:<math>\varphi^F_{d_1, d_2}(s) = \frac{\Gamma\left(\frac{d_1+d_2}{2}\right)}{\Gamma\left(\tfrac{d_2}{2}\right)} U\!\left(\frac{d_1}{2}, 1-\frac{d_2}{2}, -\frac{d_2}{d_1} \imath s\right)</math>

where ''U''(''a'', ''b'', ''z'') is the [[confluent hypergeometric function]] of the second kind.

==Characterization==

A [[random variate]] of the ''F''-distribution with parameters ''d''<sub>1</sub> and ''d''<sub>2</sub> arises as the ratio of two appropriately scaled [[chi-squared distribution|chi-squared]] variates:<ref>M.H. DeGroot (1986), ''Probability and Statistics'' (2nd Ed), Addison-Wesley. ISBN 020111366X, p. 500</ref>

:<math>X = \frac{U_1/d_1}{U_2/d_2}</math>

where

*''U''<sub>1</sub> and ''U''<sub>2</sub> have [[chi-squared distribution]]s with ''d''<sub>1</sub> and ''d''<sub>2</sub> [[Degrees of freedom (statistics)|degrees of freedom]] respectively, and
*''U''<sub>1</sub> and ''U''<sub>2</sub> are [[statistical independence|independent]].

In instances where the ''F''-distribution is used, for example in the [[analysis of variance]], independence of ''U''<sub>1</sub> and ''U''<sub>2</sub> might be demonstrated by applying [[Cochran's theorem]].
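The characterization above lends itself to a small simulation; the sketch below (standard library only; the seed, sample size, and degrees of freedom are arbitrary illustrative choices) builds chi-squared variates as sums of squared standard normals and checks the sample mean against the sidebox value ''d''<sub>2</sub>/(''d''<sub>2</sub> − 2):

```python
import random
from math import fsum

random.seed(42)

def chi2_variate(d):
    """One chi-squared variate with d degrees of freedom,
    drawn as a sum of d squared standard normals."""
    return fsum(random.gauss(0.0, 1.0) ** 2 for _ in range(d))

d1, d2, n = 5, 10, 50_000
samples = [(chi2_variate(d1) / d1) / (chi2_variate(d2) / d2) for _ in range(n)]

mean = fsum(samples) / n
print(mean)  # close to d2 / (d2 - 2) = 1.25, since d2 > 2
```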

Equivalently, the random variable of the ''F''-distribution may also be written

:<math>X = \frac{s_1^2}{\sigma_1^2} \;/\; \frac{s_2^2}{\sigma_2^2}</math>

where ''s''<sub>1</sub><sup>2</sup> and ''s''<sub>2</sub><sup>2</sup> are the sums of squares ''S''<sub>1</sub><sup>2</sup> and ''S''<sub>2</sub><sup>2</sup> from two normal processes with variances σ<sub>1</sub><sup>2</sup> and σ<sub>2</sub><sup>2</sup>, divided by the corresponding number of χ<sup>2</sup> degrees of freedom, ''d''<sub>1</sub> and ''d''<sub>2</sub> respectively.

In a frequentist context, a scaled ''F''-distribution therefore gives the probability ''p''(''s''<sub>1</sub><sup>2</sup>/''s''<sub>2</sub><sup>2</sup> | σ<sub>1</sub><sup>2</sup>, σ<sub>2</sub><sup>2</sup>), with the ''F''-distribution itself, without any scaling, applying where σ<sub>1</sub><sup>2</sup> is being taken equal to σ<sub>2</sub><sup>2</sup>. This is the context in which the ''F''-distribution most generally appears in [[F-test]]s: where the null hypothesis is that two independent normal variances are equal, and the observed sums of some appropriately selected squares are then examined to see whether their ratio is significantly incompatible with this null hypothesis.

The quantity ''X'' has the same distribution in Bayesian statistics, if an uninformative rescaling-invariant [[Jeffreys prior]] is taken for the [[prior probability|prior probabilities]] of σ<sub>1</sub><sup>2</sup> and σ<sub>2</sub><sup>2</sup>.<ref>G.E.P. Box and G.C. Tiao (1973), ''Bayesian Inference in Statistical Analysis'', Addison-Wesley. p. 110</ref> In this context, a scaled ''F''-distribution thus gives the posterior probability ''p''(σ<sub>2</sub><sup>2</sup>/σ<sub>1</sub><sup>2</sup> | ''s''<sub>1</sub><sup>2</sup>, ''s''<sub>2</sub><sup>2</sup>), where now the observed sums ''s''<sub>1</sub><sup>2</sup> and ''s''<sub>2</sub><sup>2</sup> are what are taken as known.

== Generalization ==

A generalization of the (central) ''F''-distribution is the [[noncentral F-distribution]].




== Related distributions and properties ==

*If <math>X \sim \chi^2_{d_1}</math> and <math>Y \sim \chi^2_{d_2}</math> are [[independence (probability theory)|independent]], then <math>\frac{X/d_1}{Y/d_2} \sim \mathrm{F}(d_1, d_2)</math>
*If <math>X \sim \operatorname{Beta}(d_1/2,d_2/2)</math> ([[beta distribution]]) then <math>\frac{d_2 X}{d_1(1-X)} \sim \operatorname{F}(d_1,d_2)</math>
*Equivalently, if ''X'' ~ F(''d''<sub>1</sub>, ''d''<sub>2</sub>), then <math>\frac{d_1 X/d_2}{1+d_1 X/d_2} \sim \operatorname{Beta}(d_1/2,d_2/2)</math>.
*If ''X'' ~ F(''d''<sub>1</sub>, ''d''<sub>2</sub>) then <math>Y = \lim_{d_2 \to \infty} d_1 X</math> has the [[chi-squared distribution]] <math>\chi^2_{d_1}</math>
*F(''d''<sub>1</sub>, ''d''<sub>2</sub>) is equivalent to the scaled [[Hotelling's T-squared distribution]] <math>\frac{d_2}{d_1(d_1+d_2-1)} \operatorname{T}^2(d_1, d_1+d_2-1)</math>.
*If ''X'' ~ F(''d''<sub>1</sub>, ''d''<sub>2</sub>) then ''X''<sup>−1</sup> ~ F(''d''<sub>2</sub>, ''d''<sub>1</sub>).
*If ''X'' ~ [[Student's t-distribution|t(''n'')]] then
::<math>X^2 \sim \operatorname{F}(1, n)</math>
::<math>X^{-2} \sim \operatorname{F}(n, 1)</math>
*The ''F''-distribution is a special case of the type 6 [[Pearson distribution]].
*If ''X'' and ''Y'' are independent, with ''X'', ''Y'' ~ [[Laplace distribution|Laplace(μ, ''b'')]] then
::<math>\tfrac{X-\mu}{Y-\mu} \sim \operatorname{F}(2,2)</math>
*If ''X'' ~ F(''n'', ''m'') then <math>\tfrac{\log{X}}{2} \sim \operatorname{FisherZ}(n,m)</math> ([[Fisher's z-distribution]])
*The [[noncentral F-distribution]] simplifies to the ''F''-distribution if λ = 0.
*The doubly [[noncentral F-distribution]] simplifies to the ''F''-distribution if <math>\lambda_1 = \lambda_2 = 0</math>
*If <math>\operatorname{Q}_X(p)</math> is the quantile ''p'' for ''X'' ~ F(''d''<sub>1</sub>, ''d''<sub>2</sub>) and <math>\operatorname{Q}_Y(1-p)</math> is the quantile 1−''p'' for ''Y'' ~ F(''d''<sub>2</sub>, ''d''<sub>1</sub>), then
::<math>\operatorname{Q}_X(p)=\frac{1}{\operatorname{Q}_Y(1-p)}</math>.
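The reciprocal property ''X''<sup>−1</sup> ~ F(''d''<sub>2</sub>, ''d''<sub>1</sub>) underlying this quantile identity can be checked numerically through the corresponding density identity obtained by a change of variables; a short sketch (the helper name <code>f_pdf</code> and the test points are our own choices):

```python
from math import exp, lgamma, log

def f_pdf(x, d1, d2):
    """Density of F(d1, d2) (see the Definition section)."""
    if x <= 0:
        return 0.0
    log_beta = lgamma(d1 / 2) + lgamma(d2 / 2) - lgamma((d1 + d2) / 2)
    return exp((d1 / 2) * log(d1 / d2) + (d1 / 2 - 1) * log(x)
               - ((d1 + d2) / 2) * log(1 + d1 * x / d2) - log_beta)

# If X ~ F(d1, d2) then 1/X ~ F(d2, d1); by a change of variables the
# densities must satisfy f(y; d2, d1) = f(1/y; d1, d2) / y^2 for y > 0.
d1, d2 = 4, 7
for y in (0.25, 1.0, 3.0):
    lhs = f_pdf(y, d2, d1)
    rhs = f_pdf(1 / y, d1, d2) / y ** 2
    print(abs(lhs - rhs) < 1e-12)  # True at each test point
```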

== See also ==

{{Colbegin}}
* [[Chi-squared distribution]]
* [[Chow test]]
* [[Gamma distribution]]
* [[Hotelling's T-squared distribution]]
* [[Student's t-distribution]]
* [[Wilks' lambda distribution]]
* [[Wishart distribution]]
{{Colend}}

== References ==

{{reflist}}




==Further reading==





==Key texts==

[[Category:Continuous distributions]]





<!--
[[de:F-Verteilung]]
[[es:Distribución F]]
[[it:Variabile casuale F di Snedecor]]
[[nl:F-verdeling]]
-->

{{enWP|F-distribution}}