F-distribution
 



Fisher-Snedecor

Probability density function
[image: F distributionPDF.png]
Cumulative distribution function
[image: F distributionCDF.png]

Parameters   d₁, d₂ > 0 deg. of freedom
Support      x ∈ [0, +∞)
pdf          \frac{\sqrt{\frac{(d_1\,x)^{d_1}\,d_2^{d_2}}{(d_1\,x+d_2)^{d_1+d_2}}}}{x\,\mathrm{B}\!\left(\frac{d_1}{2},\frac{d_2}{2}\right)}
cdf          I_{\frac{d_1 x}{d_1 x + d_2}}\left(\tfrac{d_1}{2}, \tfrac{d_2}{2}\right)
Mean         \frac{d_2}{d_2-2} for d₂ > 2
Median
Mode         \frac{d_1-2}{d_1}\;\frac{d_2}{d_2+2} for d₁ > 2
Variance     \frac{2\,d_2^2\,(d_1+d_2-2)}{d_1 (d_2-2)^2 (d_2-4)} for d₂ > 4
Skewness     \frac{(2 d_1 + d_2 - 2) \sqrt{8 (d_2-4)}}{(d_2-6) \sqrt{d_1 (d_1 + d_2 -2)}} for d₂ > 6
Kurtosis     see text
Entropy
mgf          does not exist; raw moments defined in text and in [1][2]
Char. func.  see text

In probability theory and statistics, the F-distribution is a continuous probability distribution.[1][2][3][4] It is also known as Snedecor's F distribution or the Fisher-Snedecor distribution (after R.A. Fisher and George W. Snedecor). The F-distribution arises frequently as the null distribution of a test statistic, most notably in the analysis of variance; see F-test.

Definition

If a random variable X has an F-distribution with parameters d₁ and d₂, we write X ~ F(d₁, d₂). Then the probability density function for X is given by

 \begin{align} 
f(x; d_1,d_2) &= \frac{\sqrt{\frac{(d_1\,x)^{d_1}\,\,d_2^{d_2}} {(d_1\,x+d_2)^{d_1+d_2}}}} {x\,\mathrm{B}\!\left(\frac{d_1}{2},\frac{d_2}{2}\right)} \\
&=\frac{1}{\mathrm{B}\!\left(\frac{d_1}{2},\frac{d_2}{2}\right)} \left(\frac{d_1}{d_2}\right)^{\frac{d_1}{2}} x^{\frac{d_1}{2} - 1} \left(1+\frac{d_1}{d_2}\,x\right)^{-\frac{d_1+d_2}{2}}
\end{align}

for real x ≥ 0. Here B is the beta function. In many applications, the parameters d₁ and d₂ are positive integers, but the distribution is well-defined for positive real values of these parameters.
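As a numerical cross-check of the density, the second form above can be coded directly from the beta function and compared with scipy's implementation; a minimal Python sketch (the degrees of freedom 5 and 7 are arbitrary choices):

import numpy as np
from scipy.special import beta
from scipy.stats import f

def f_pdf(x, d1, d2):
    # Second form of the density:
    # B(d1/2, d2/2)^-1 * (d1/d2)^(d1/2) * x^(d1/2 - 1) * (1 + d1 x/d2)^-((d1+d2)/2)
    return ((d1 / d2) ** (d1 / 2) * x ** (d1 / 2 - 1)
            * (1 + d1 * x / d2) ** (-(d1 + d2) / 2)
            / beta(d1 / 2, d2 / 2))

x = np.linspace(0.01, 5, 50)
assert np.allclose(f_pdf(x, 5, 7), f.pdf(x, 5, 7))  # matches scipy's F density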

The cumulative distribution function is

F(x; d_1,d_2)=I_{\frac{d_1 x}{d_1 x + d_2}}\left (\tfrac{d_1}{2}, \tfrac{d_2}{2} \right) ,

where I is the regularized incomplete beta function.
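scipy.special.betainc computes exactly this regularized incomplete beta function, with the convention betainc(a, b, x) = I_x(a, b), so the cdf can be written in one line; a small sketch under the same arbitrary parameters:

import numpy as np
from scipy.special import betainc
from scipy.stats import f

def f_cdf(x, d1, d2):
    # I_{d1 x / (d1 x + d2)}(d1/2, d2/2)
    return betainc(d1 / 2, d2 / 2, d1 * x / (d1 * x + d2))

x = np.linspace(0, 5, 50)
assert np.allclose(f_cdf(x, 5, 7), f.cdf(x, 5, 7))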

The expectation, variance, and other details of the F(d₁, d₂) distribution are given in the infobox above; for d₂ > 8, the excess kurtosis is

\gamma_2 = 12\frac{d_1(5d_2-22)(d_1+d_2-2)+(d_2-4)(d_2-2)^2}{d_1(d_2-6)(d_2-8)(d_1+d_2-2)}.
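This closed form can be checked against the excess kurtosis reported by scipy.stats.f.stats; a minimal sketch, with d₂ chosen greater than 8 so the formula applies:

import numpy as np
from scipy.stats import f

def excess_kurtosis(d1, d2):
    # gamma_2 from the closed form above; valid only for d2 > 8
    num = d1 * (5 * d2 - 22) * (d1 + d2 - 2) + (d2 - 4) * (d2 - 2) ** 2
    den = d1 * (d2 - 6) * (d2 - 8) * (d1 + d2 - 2)
    return 12 * num / den

d1, d2 = 5, 12
kurt = f.stats(d1, d2, moments='k')  # scipy reports excess kurtosis
assert np.isclose(excess_kurtosis(d1, d2), kurt)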

The k-th moment of an F(d₁, d₂) distribution exists and is finite only when 2k < d₂, and it is equal to[5]

\mu _{X}(k) =\left( \frac{d_{2}}{d_{1}}\right)^{k}\frac{\Gamma \left(\tfrac{d_1}{2}+k\right) }{\Gamma \left(\tfrac{d_1}{2}\right) }\frac{\Gamma \left(\tfrac{d_2}{2}-k\right) }{\Gamma \left( \tfrac{d_2}{2}\right) }
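The formula is conveniently evaluated in log space with scipy.special.gammaln to avoid overflow in the gamma functions; a sketch comparing it with scipy.stats.f.moment for a few orders (d₂ = 20 here, so all moments with k < 10 exist):

import numpy as np
from scipy.special import gammaln
from scipy.stats import f

def f_raw_moment(k, d1, d2):
    # (d2/d1)^k * Gamma(d1/2 + k) Gamma(d2/2 - k) / (Gamma(d1/2) Gamma(d2/2)),
    # finite only when 2k < d2
    if 2 * k >= d2:
        return np.inf
    log_m = (k * np.log(d2 / d1)
             + gammaln(d1 / 2 + k) - gammaln(d1 / 2)
             + gammaln(d2 / 2 - k) - gammaln(d2 / 2))
    return np.exp(log_m)

d1, d2 = 6, 20
for k in range(1, 5):
    assert np.isclose(f_raw_moment(k, d1, d2), f.moment(k, d1, d2))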

The F-distribution is a particular parametrization of the beta prime distribution, which is also called the beta distribution of the second kind.
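Concretely, if X ~ F(d₁, d₂) then (d₁/d₂)X follows the beta prime distribution with parameters (d₁/2, d₂/2); a short sketch verifying the corresponding change of variables on the densities (parameter values arbitrary):

import numpy as np
from scipy.stats import f, betaprime

d1, d2 = 5, 7
x = np.linspace(0.01, 5, 50)
# If X ~ F(d1, d2) then Y = (d1/d2) X ~ BetaPrime(d1/2, d2/2);
# by change of variables, f_F(x) = (d1/d2) * f_Y((d1/d2) x).
c = d1 / d2
assert np.allclose(f.pdf(x, d1, d2), c * betaprime.pdf(c * x, d1 / 2, d2 / 2))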

The characteristic function is listed incorrectly in many standard references (e.g., [2]). The correct expression [6] is

\varphi^F_{d_1, d_2}(s) = \frac{\Gamma(\frac{d_1+d_2}{2})}{\Gamma(\tfrac{d_2}{2})} U \! \left(\frac{d_1}{2},1-\frac{d_2}{2},-\frac{d_2}{d_1} \imath s \right)

where U(a, b, z) is the confluent hypergeometric function of the second kind.
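One way to spot-check Phillips's expression is to compare it with a Monte Carlo estimate of E[exp(isX)]; the sketch below uses mpmath.hyperu for U(a, b, z), which accepts complex arguments (the choice s = 0.3 and the sample size are arbitrary, and agreement is only expected to Monte Carlo accuracy):

import numpy as np
import mpmath

d1, d2 = 5, 7
s = 0.3

# Phillips (1982): Gamma((d1+d2)/2)/Gamma(d2/2) * U(d1/2, 1 - d2/2, -i s d2/d1)
phi = (mpmath.gamma((d1 + d2) / 2) / mpmath.gamma(d2 / 2)
       * mpmath.hyperu(d1 / 2, 1 - d2 / 2, -1j * s * d2 / d1))

# Monte Carlo estimate of E[exp(i s X)] for X ~ F(d1, d2)
rng = np.random.default_rng(0)
x = rng.f(d1, d2, size=1_000_000)
phi_mc = np.exp(1j * s * x).mean()

print(complex(phi), phi_mc)  # the two values should agree to roughly 1e-3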

Characterization

A random variate of the F-distribution with parameters d₁ and d₂ arises as the ratio of two appropriately scaled chi-squared variates:[7]

X = \frac{U_1/d_1}{U_2/d_2}

where

  • U₁ and U₂ have chi-squared distributions with d₁ and d₂ degrees of freedom respectively, and
  • U₁ and U₂ are independent.

In instances where the F-distribution is used, for example in the analysis of variance, independence of U₁ and U₂ might be demonstrated by applying Cochran's theorem.
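This characterization is straightforward to verify by simulation: draw independent chi-squared variates, form the scaled ratio, and compare the sample against the F(d₁, d₂) cdf with a Kolmogorov-Smirnov test. A minimal Python sketch (parameter values arbitrary):

import numpy as np
from scipy.stats import f, kstest

rng = np.random.default_rng(42)
d1, d2 = 5, 7
n = 100_000

u1 = rng.chisquare(d1, size=n)   # U1 ~ chi-squared(d1)
u2 = rng.chisquare(d2, size=n)   # U2 ~ chi-squared(d2), independent of U1
x = (u1 / d1) / (u2 / d2)        # X = (U1/d1) / (U2/d2)

stat, pvalue = kstest(x, f(d1, d2).cdf)
print(stat, pvalue)  # a large p-value is consistent with X ~ F(d1, d2)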

Equivalently, the random variable of the F-distribution may also be written

X = \frac{s_1^2}{\sigma_1^2} \;/\; \frac{s_2^2}{\sigma_2^2}

where s₁² and s₂² are the sums of squares S₁² and S₂² from two normal processes with variances σ₁² and σ₂², divided by the corresponding number of χ² degrees of freedom, d₁ and d₂ respectively.

In a frequentist context, a scaled F-distribution therefore gives the probability p(s₁²/s₂² | σ₁², σ₂²), with the F-distribution itself, without any scaling, applying where σ₁² is being taken equal to σ₂². This is the context in which the F-distribution most generally appears in F-tests: where the null hypothesis is that two independent normal variances are equal, and the observed sums of some appropriately selected squares are then examined to see whether their ratio is significantly incompatible with this null hypothesis.
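As a concrete illustration, such an F-test reduces to computing the ratio of sample variances and its tail probability under F(d₁, d₂); a sketch with simulated (hypothetical) data, using the common convention of doubling the smaller tail for a two-sided test:

import numpy as np
from scipy.stats import f

rng = np.random.default_rng(1)
# Hypothetical samples from two normal processes
a = rng.normal(loc=0.0, scale=1.0, size=25)
b = rng.normal(loc=0.0, scale=1.5, size=30)

F = a.var(ddof=1) / b.var(ddof=1)   # ratio of sample variances
d1, d2 = len(a) - 1, len(b) - 1     # degrees of freedom
# Two-sided p-value: double the smaller tail probability
p = 2 * min(f.cdf(F, d1, d2), f.sf(F, d1, d2))
print(F, p)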

The quantity X has the same distribution in Bayesian statistics, if an uninformative rescaling-invariant Jeffreys prior is taken for the prior probabilities of σ₁² and σ₂².[8] In this context, a scaled F-distribution thus gives the posterior probability p(σ₂²/σ₁² | s₁², s₂²), where now the observed sums s₁² and s₂² are what are taken as known.

Generalization

A generalization of the (central) F-distribution is the noncentral F-distribution.

Related distributions and properties

  • If X ~ χ²(d₁) and Y ~ χ²(d₂) are independent, then (X/d₁)/(Y/d₂) ~ F(d₁, d₂).
  • If X ~ Beta(d₁/2, d₂/2) (beta distribution), then d₂X/(d₁(1 − X)) ~ F(d₁, d₂).
  • Equivalently, if X ~ F(d₁, d₂), then (d₁X/d₂)/(1 + d₁X/d₂) ~ Beta(d₁/2, d₂/2).
  • If X ~ F(d₁, d₂), then d₁X converges in distribution to the chi-squared distribution χ²(d₁) as d₂ → ∞.
  • F(d₁, d₂) is equivalent to the scaled Hotelling's T-squared distribution (d₂/(d₁(d₁ + d₂ − 1))) T²(d₁, d₁ + d₂ − 1).
  • If X ~ F(d₁, d₂), then X⁻¹ ~ F(d₂, d₁).
  • If X ~ t(n) (Student's t-distribution), then X² ~ F(1, n) and X⁻² ~ F(n, 1).
  • The F-distribution is a special case of the type 6 Pearson distribution.
  • If X and Y are independent, with X, Y ~ Laplace(μ, b), then |X − μ|/|Y − μ| ~ F(2, 2).
  • If X ~ F(n, m), then (log X)/2 ~ FisherZ(n, m) (Fisher's z-distribution).
  • The noncentral F-distribution simplifies to the F-distribution if λ = 0.
  • The doubly noncentral F-distribution simplifies to the F-distribution if λ₁ = λ₂ = 0.
  • If Q_X(p) is the quantile p for X ~ F(d₁, d₂) and Q_Y(1 − p) is the quantile 1 − p for Y ~ F(d₂, d₁), then Q_X(p) = 1/Q_Y(1 − p) (checked numerically in the sketch below).
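The last property follows from the reciprocal relationship X⁻¹ ~ F(d₂, d₁) and can be confirmed with scipy's quantile function; a quick sketch:

import numpy as np
from scipy.stats import f

d1, d2 = 5, 7
p = np.linspace(0.05, 0.95, 19)
# Q_X(p) for X ~ F(d1, d2) equals 1 / Q_Y(1 - p) for Y ~ F(d2, d1)
assert np.allclose(f.ppf(p, d1, d2), 1 / f.ppf(1 - p, d2, d1))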

See also

  • Chi-squared distribution
  • Chow test
  • Gamma distribution
  • Hotelling's T-squared distribution
  • Student's t-distribution
  • Wilks' lambda distribution
  • Wishart distribution

References

  1. Johnson, Norman Lloyd; Kotz, Samuel; Balakrishnan, N. (1995). Continuous Univariate Distributions, Volume 2 (2nd ed., Section 27). Wiley. ISBN 0-471-58494-0.
  2. Abramowitz, Milton; Stegun, Irene A. (eds.) (1965). Handbook of Mathematical Functions. Chapter 26, p. 946.
  3. NIST (2006). Engineering Statistics Handbook - F Distribution. http://www.itl.nist.gov/div898/handbook/eda/section3/eda3665.htm
  4. Mood, Alexander; Graybill, Franklin A.; Boes, Duane C. (1974). Introduction to the Theory of Statistics (3rd ed., pp. 246-249). McGraw-Hill. ISBN 0-07-042864-6.
  5. Taboga, Marco. "The F distribution". http://www.statlect.com/F_distribution.htm
  6. Phillips, P. C. B. (1982). "The true characteristic function of the F distribution". Biometrika, 69: 261-264. JSTOR 2335882.
  7. DeGroot, M. H. (1986). Probability and Statistics (2nd ed.). Addison-Wesley. ISBN 0-201-11366-X. p. 500.
  8. Box, G. E. P.; Tiao, G. C. (1973). Bayesian Inference in Statistical Analysis. Addison-Wesley. p. 110.



Further reading

Key texts

Books

Papers

Additional material

Books

Papers

External links

This page uses Creative Commons Licensed content from Wikipedia.
