{{Otheruses4|the mathematics of the chi-squared distribution|its uses in statistics|chi-squared test|the music group|Chi2 (band)}}


{{Probability distribution
  | type      = density
  | pdf_image  = [[File:chi-square pdf.svg|325px]]
  | cdf_image  = [[File:chi-square distributionCDF.svg|325px]]
  | notation  = <math>\chi^2(k)\!</math> or <math>\chi^2_k\!</math>
  | parameters = <math>k \in \mathbb{N}~~</math>  (known as "degrees of freedom")
  | support    = ''x'' ∈ [0, +∞)
  | pdf        = <math>\frac{1}{2^{\frac{k}{2}}\Gamma\left(\frac{k}{2}\right)}\; x^{\frac{k}{2}-1} e^{-\frac{x}{2}}\,</math>
  | cdf        = <math>\frac{1}{\Gamma\left(\frac{k}{2}\right)}\;\gamma\left(\frac{k}{2},\,\frac{x}{2}\right)</math>
  | mean      = ''k''
  | median    = <math>\approx k\bigg(1-\frac{2}{9k}\bigg)^3</math>
  | mode       = max{&thinsp;''k'' − 2, 0&thinsp;}
  | variance  = 2''k''
  | skewness  = <math>\scriptstyle\sqrt{8/k}\,</math>
  | kurtosis  = 12&thinsp;/&thinsp;''k''
  | entropy    = <math>\frac{k}{2}\!+\!\ln(2\Gamma(k/2))\!+\!(1\!-\!k/2)\psi(k/2)</math>
  | mgf        = {{nowrap|(1 − 2&thinsp;''t'')<sup>−''k''/2</sup>}} &nbsp; for &thinsp;t&thinsp; < ½
  | char      = {{nowrap|(1 − 2&thinsp;''i''&thinsp;''t'')<sup>−''k''/2</sup>}}&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;<ref>{{cite web | url=http://www.planetmathematics.com/CentralChiDistr.pdf | title=Characteristic function of the central chi-squared distribution | author=M.A. Sanders | accessdate=2009-03-06}}</ref>
  }}


In [[probability theory]] and [[statistics]], the '''chi-squared distribution''' (also '''chi-square''' or {{nowrap|1='''[[chi (letter)|<span style="font-family:serif">''χ''</span>]]²-distribution'''}}) with ''k'' [[Degrees of freedom (statistics)|degrees of freedom]] is the distribution of a sum of the squares of ''k'' [[Independence (probability theory)|independent]] [[standard normal]] random variables. It is one of the most widely used [[probability distribution]]s in [[inferential statistics]], e.g., in [[hypothesis testing]] or in construction of [[confidence interval]]s.<ref name=abramowitz>{{Abramowitz_Stegun_ref|26|940}}</ref><ref>NIST (2006).  [http://www.itl.nist.gov/div898/handbook/eda/section3/eda3666.htm Engineering Statistics Handbook - Chi-Squared Distribution]</ref><ref>{{cite book
  | last = Johnson
  | first = N.L.
  | coauthors = S. Kotz, N. Balakrishnan
  | title = Continuous Univariate Distributions (Second Ed., Vol. 1, Chapter 18)
  | publisher = John Wiley & Sons
  | year = 1994
  | isbn = 0-471-58495-9
}}</ref><ref>{{cite book
  | last = Mood
  | first = Alexander
  | coauthors = Franklin A. Graybill, Duane C. Boes
  | title = Introduction to the Theory of Statistics (Third Edition, p. 241-246)
  | publisher = McGraw-Hill
  | year = 1974
  | isbn = 0-07-042864-6
}}</ref> When there is a need to contrast it with the [[noncentral chi-squared distribution]], this distribution is sometimes called the '''central chi-squared distribution'''.


The chi-squared distribution is used in the common [[chi-squared test]]s for [[goodness of fit]] of an observed distribution to a theoretical one, the [[statistical independence|independence]] of two criteria of classification of [[data analysis|qualitative data]], and in [[confidence interval]] estimation for a population [[standard deviation]] of a normal distribution from a sample standard deviation.  Many other statistical tests also use this distribution, like [[Friedman test|Friedman's analysis of variance by ranks]].


The chi-squared distribution is a special case of the [[gamma distribution]].


==Definition==
If ''Z''<sub>1</sub>, ..., ''Z''<sub>''k''</sub> are [[independence (probability theory)|independent]], [[standard normal]] random variables, then the sum of their squares,
: <math>
    Q\ = \sum_{i=1}^k Z_i^2 ,
  </math>
is distributed according to the '''chi-squared distribution''' with ''k'' degrees of freedom. This is usually denoted as
: <math>
    Q\ \sim\ \chi^2(k)\ \ \text{or}\ \ Q\ \sim\ \chi^2_k .
  </math>


The chi-squared distribution has a single parameter ''k'', a positive integer that specifies the number of [[degrees of freedom (statistics)|degrees of freedom]] (i.e. the number of ''Z''<sub>''i''</sub>’s).
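
The definition can be checked numerically. The following is a minimal Monte Carlo sketch (not part of the article; it assumes NumPy and SciPy are available) that sums the squares of ''k'' standard normal draws and compares the result with a chi-squared distribution with ''k'' degrees of freedom.

<syntaxhighlight lang="python">
# Hypothetical illustration: Q = Z_1^2 + ... + Z_k^2 should follow chi2(k).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
k, n_samples = 5, 100_000

Z = rng.standard_normal(size=(n_samples, k))   # n_samples draws of (Z_1, ..., Z_k)
Q = (Z ** 2).sum(axis=1)                       # Q = Z_1^2 + ... + Z_k^2

print(Q.mean(), Q.var())                       # close to k = 5 and 2k = 10
print(stats.kstest(Q, stats.chi2(df=k).cdf))   # Kolmogorov-Smirnov test should not reject
</syntaxhighlight>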


==Characteristics==
Further properties of the chi-squared distribution can be found in the box at the upper right corner of this article.


===Probability density function===
The [[probability density function]] (pdf) of the chi-squared distribution is
:<math>
f(x;\,k) =
\begin{cases}
  \frac{x^{(k/2)-1} e^{-x/2}}{2^{k/2} \Gamma\left(\frac{k}{2}\right)},  & x \geq 0; \\ 0, & \text{otherwise}.
\end{cases}
</math>
where Γ(''k''/2) denotes the [[Gamma function]], which has [[particular values of the Gamma function|closed-form values for odd ''k'']].


For derivations of the pdf in the cases of one and two degrees of freedom, see [[Proofs related to chi-squared distribution]].
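
As an illustrative sketch (not from the article; SciPy assumed), the pdf above can be written directly in terms of the Gamma function and compared against <code>scipy.stats.chi2.pdf</code>:

<syntaxhighlight lang="python">
# Hypothetical check of f(x; k) = x^{k/2-1} e^{-x/2} / (2^{k/2} Gamma(k/2)) for x > 0.
import numpy as np
from scipy import stats
from scipy.special import gamma

def chi2_pdf(x, k):
    x = np.asarray(x, dtype=float)
    return x ** (k / 2 - 1) * np.exp(-x / 2) / (2 ** (k / 2) * gamma(k / 2))

x = np.array([0.5, 1.0, 2.0, 5.0])
for k in (1, 2, 4, 10):
    assert np.allclose(chi2_pdf(x, k), stats.chi2(df=k).pdf(x))
</syntaxhighlight>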


===Cumulative distribution function===
Its [[cumulative distribution function]] is:
: <math>
    F(x;\,k) = \frac{\gamma(\frac{k}{2},\,\frac{x}{2})}{\Gamma(\frac{k}{2})} = P\left(\frac{k}{2},\,\frac{x}{2}\right),
  </math>
where γ(''k'',''z'') is the [[incomplete Gamma function|lower incomplete Gamma function]] and ''P''(''k'',''z'') is the [[regularized Gamma function]].


In the special case of ''k'' = 2 this function has the simple form:
: <math>
    F(x;\,2) = 1 - e^{-\frac{x}{2}}.
  </math>


For the cases when ''0'' < ''z'' < ''1'' (which include all of the cases when this CDF is less than half), the following [[Chernoff_bound#The_first_step_in_the_proof_of_Chernoff_bounds| Chernoff upper bound]] may be obtained:<ref>{{cite journal |last1=Dasgupta |first1=Sanjoy D. A. |last2=Gupta |first2=Anupam K. |year=2002 |title=An Elementary Proof of a Theorem of Johnson and Lindenstrauss |journal=Random Structures and Algorithms |volume=22 |issue= |pages=60-65 |publisher= |doi= |url=http://cseweb.ucsd.edu/~dasgupta/papers/jl.pdf |accessdate=2012-05-01 }}</ref>
: <math>
    F(z k;\,k) \leq (z e^{1-z})^{k/2}.
  </math>


The tail bound for the cases when ''z'' > 1 follows similarly:
: <math>
    1-F(z k;\,k) \leq (z e^{1-z})^{k/2}.
  </math>
 
 
Tables of this cumulative distribution function are widely available and the function is included in many [[spreadsheet]]s and all [[List of statistical packages|statistical packages]]. For another [[approximation]] for the CDF modeled after the cube of a Gaussian, see [[Noncentral_chi-squared_distribution#Approximation|under Noncentral chi-squared distribution]].
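
As an illustrative sketch (not part of the article; SciPy assumed), the CDF can be evaluated through the regularized lower incomplete gamma function ''P''(''k''/2,&nbsp;''x''/2), and the Chernoff-type bounds above can be verified numerically:

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats
from scipy.special import gammainc   # regularized lower incomplete gamma function P(a, x)

def chi2_cdf(x, k):
    return gammainc(k / 2, x / 2)    # F(x; k) = P(k/2, x/2)

k = 10
x = np.linspace(0.1, 40, 5)
assert np.allclose(chi2_cdf(x, k), stats.chi2(df=k).cdf(x))

# Chernoff bound for z < 1: F(z k; k) <= (z e^{1-z})^{k/2}
for z in (0.2, 0.5, 0.9):
    assert chi2_cdf(z * k, k) <= (z * np.exp(1 - z)) ** (k / 2)

# Tail bound for z > 1: 1 - F(z k; k) <= (z e^{1-z})^{k/2}
for z in (1.5, 2.0, 5.0):
    assert 1 - chi2_cdf(z * k, k) <= (z * np.exp(1 - z)) ** (k / 2)
</syntaxhighlight>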
 
===Additivity===
It follows from the definition of the chi-squared distribution that the sum of independent chi-squared variables is also chi-squared distributed. Specifically, if {''X<sub>i</sub>''}<sub>''i''=1</sub><sup>''n''</sup> are independent chi-squared variables with {''k<sub>i</sub>''}<sub>''i''=1</sub><sup>''n''</sup> degrees of freedom, respectively, then {{nowrap|''Y {{=}} X''<sub>1</sub> + ⋯ + ''X<sub>n</sub>''}} is chi-squared distributed with {{nowrap|''k''<sub>1</sub> + ⋯ + ''k<sub>n</sub>''}} degrees of freedom.
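
A quick Monte Carlo sketch of this property (an assumed example, not from the article; SciPy required):

<syntaxhighlight lang="python">
# Independent chi2(k1) and chi2(k2) variables should sum to a chi2(k1 + k2) variable.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
k1, k2, n = 3, 7, 200_000

Y = stats.chi2(df=k1).rvs(n, random_state=rng) + stats.chi2(df=k2).rvs(n, random_state=rng)
print(stats.kstest(Y, stats.chi2(df=k1 + k2).cdf))   # large p-value: consistent with chi2(10)
</syntaxhighlight>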
 
===Information entropy===
The [[information entropy]] is given by
: <math>
    H = -\int_{-\infty}^\infty f(x;\,k)\ln f(x;\,k) \, dx
      = \frac{k}{2} + \ln\left(2\Gamma\left(\frac{k}{2}\right)\right) + \left(1-\frac{k}{2}\right) \psi\left(\frac{k}{2}\right),
  </math>
where ''ψ''(''x'') is the [[Digamma function]].
 
The chi-squared distribution is the [[maximum entropy probability distribution]] for a random variate ''X'' for which <math>E(X)=\nu</math> and <math>E(\ln(X))=\psi\left(\frac{\nu}{2}\right)+\ln(2)</math> are fixed.<ref>{{cite journal |last1=Park |first1=Sung Y. |last2=Bera |first2=Anil K. |year=2009 |title=Maximum entropy autoregressive conditional heteroskedasticity model |journal=Journal of Econometrics |volume= |issue= |pages=219–230 |publisher=Elsevier |doi= |url=http://www.wise.xmu.edu.cn/Master/Download/..%5C..%5CUploadFiles%5Cpaper-masterdownload%5C2009519932327055475115776.pdf |accessdate=2011-06-02 }}</ref>
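
A short sketch (assumed, not from the article) checking the closed-form entropy expression against SciPy's value:

<syntaxhighlight lang="python">
from numpy import log
from scipy import stats
from scipy.special import gammaln, digamma

def chi2_entropy(k):
    # H = k/2 + ln(2 Gamma(k/2)) + (1 - k/2) psi(k/2)
    return k / 2 + log(2) + gammaln(k / 2) + (1 - k / 2) * digamma(k / 2)

for k in (1, 2, 5, 20):
    assert abs(chi2_entropy(k) - stats.chi2(df=k).entropy()) < 1e-9
</syntaxhighlight>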
 
===Noncentral moments===
The moments about zero of a chi-squared distribution with ''k'' degrees of freedom are given by<ref>[http://mathworld.wolfram.com/Chi-SquaredDistribution.html Chi-squared distribution], from [[MathWorld]], retrieved Feb. 11, 2009</ref><ref>M. K. Simon, ''Probability Distributions Involving Gaussian Random Variables'', New York: Springer, 2002, eq. (2.35), ISBN 978-0-387-34657-1</ref>
: <math>
    \operatorname{E}(X^m) = k (k+2) (k+4) \cdots (k+2m-2) = 2^m \frac{\Gamma(m+\frac{k}{2})}{\Gamma(\frac{k}{2})}.
  </math>
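
A small sketch (assumed) checking the raw-moment formula against SciPy:

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats
from scipy.special import gamma

def chi2_raw_moment(m, k):
    # E(X^m) = 2^m Gamma(m + k/2) / Gamma(k/2) = k (k+2) ... (k+2m-2)
    return 2 ** m * gamma(m + k / 2) / gamma(k / 2)

k = 5
for m in (1, 2, 3, 4):
    assert np.isclose(chi2_raw_moment(m, k), stats.chi2(df=k).moment(m))
</syntaxhighlight>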
 
===Cumulants===
The [[cumulant]]s are readily obtained by a (formal) power series expansion of the logarithm of the characteristic function:
: <math>
    \kappa_n = 2^{n-1}(n-1)!\,k
  </math>
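
As an assumed illustration, the cumulant formula reproduces the mean, variance, skewness and excess kurtosis listed in the box above:

<syntaxhighlight lang="python">
import math
from scipy import stats

def chi2_cumulant(n, k):
    return 2 ** (n - 1) * math.factorial(n - 1) * k   # kappa_n = 2^{n-1} (n-1)! k

k = 7
mean, var, skew, kurt = (float(v) for v in stats.chi2.stats(df=k, moments='mvsk'))
assert math.isclose(chi2_cumulant(1, k), mean)                # kappa_1 = k
assert math.isclose(chi2_cumulant(2, k), var)                 # kappa_2 = 2k
assert math.isclose(chi2_cumulant(3, k) / var ** 1.5, skew)   # skewness = sqrt(8/k)
assert math.isclose(chi2_cumulant(4, k) / var ** 2, kurt)     # excess kurtosis = 12/k
</syntaxhighlight>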
 
===Asymptotic properties===
By the [[central limit theorem]], because the chi-squared distribution is the sum of ''k'' independent random variables with finite mean and variance, it converges to a normal distribution for large ''k''. For many practical purposes, for ''k''&nbsp;>&nbsp;50 the distribution is sufficiently close to a [[normal distribution]] for the difference to be ignored.<ref>{{cite book|title=Statistics for experimenters|author=Box, Hunter and Hunter|publisher=Wiley|page=46}}</ref> Specifically, if ''X''&nbsp;~&nbsp;''χ''²(''k''), then as ''k'' tends to infinity, the distribution of <math>(X-k)/\sqrt{2k}</math> [[convergence of random variables#Convergence in distribution|tends]] to a standard normal distribution. However, convergence is slow as the [[skewness]] is <math>\sqrt{8/k}</math> and the [[excess kurtosis]] is 12/''k''. Other functions of the chi-squared distribution converge more rapidly to a normal distribution. Some examples are:
* If ''X'' ~ ''χ''²(''k'') then <math>\scriptstyle\sqrt{2X}</math> is approximately normally distributed with mean <math>\scriptstyle\sqrt{2k-1}</math> and unit variance (result credited to [[R. A. Fisher]]).
* If ''X'' ~ ''χ''²(''k'') then <math>\scriptstyle\sqrt[3]{X/k}</math> is approximately normally distributed with mean <math>\scriptstyle 1-2/(9k)</math> and variance <math>\scriptstyle 2/(9k) .</math><ref>Wilson, E.B.; Hilferty, M.M. (1931) "The distribution of chi-squared". ''Proceedings of the National Academy of Sciences, Washington'', 17, 684–688.
</ref> This is known as the Wilson-Hilferty transformation.
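
A sketch (assumed, not from the article) comparing the exact 95% quantile with the two normal approximations above, for ''k'' = 50:

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

k, p = 50, 0.95
z = stats.norm.ppf(p)

exact = stats.chi2(df=k).ppf(p)
fisher = 0.5 * (z + np.sqrt(2 * k - 1)) ** 2                         # sqrt(2X) ~ N(sqrt(2k-1), 1)
wilson_hilferty = k * (1 - 2 / (9 * k) + z * np.sqrt(2 / (9 * k))) ** 3

print(exact, fisher, wilson_hilferty)   # roughly 67.5, 67.2, 67.5
</syntaxhighlight>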
 
==Relation to other distributions==
{{Ref improve section|date=September 2011}}
 
[[File:Chi_on_SAS.png|thumb|right|400px|Approximate formula for median compared with numerical quantile (top) as presented in [[SAS (software) | SAS Software]]. Difference between numerical quantile and approximate formula (bottom).]]
* As <math>k\to\infty</math>, <math> (\chi^2_k-k)/\sqrt{2k}  \xrightarrow{d}\ N(0,1) \,</math> ([[normal distribution]])
 
*<math> \chi_k^2 \sim  {\chi'}^2_k(0)</math> ([[Noncentral chi-squared distribution]] with non-centrality parameter <math> \lambda = 0 </math>)
 
*If <math>X \sim \mathrm{F}(\nu_1, \nu_2)</math> then <math>Y = \lim_{\nu_2 \to \infty} \nu_1 X</math> has the [[chi-squared distribution]] <math>\chi^2_{\nu_{1}}</math>
 
*As a special case, if <math>X \sim \mathrm{F}(1, \nu_2)\,</math> then <math>Y = \lim_{\nu_2 \to \infty} X\,</math> has the [[chi-squared distribution]] <math>\chi^2_{1}</math>
 
*<math> \|\boldsymbol{N}_{i=1,...,k}{(0,1)}\|^2 \sim \chi^2_k </math> (The squared [[Norm (mathematics)|norm]] of ''k'' independent standard normally distributed variables is chi-squared distributed with ''k'' [[degrees of freedom (statistics)|degrees of freedom]])
 
*If <math>X \sim {\chi}^2(\nu)\,</math> and <math>c>0 \,</math>, then <math>cX \sim {\Gamma}(k=\nu/2, \theta=2c)\,</math>. ([[gamma distribution]])
 
*If <math>X \sim \chi^2_k</math> then <math>\sqrt{X} \sim \chi_k</math> ([[chi distribution]])
 
*If <math>X \sim \mathrm{Rayleigh}(1)\,</math> ([[Rayleigh distribution]]) then <math>X^2 \sim \chi^2(2)\,</math>
 
*If <math>X \sim \mathrm{Maxwell}(1)\,</math> ([[Maxwell distribution]])  then <math>X^2 \sim \chi^2(3)\,</math>
 
*If <math>X \sim \chi^2(\nu)</math> then <math>\tfrac{1}{X} \sim \mbox{Inv-}\chi^2(\nu)\, </math> ([[Inverse-chi-squared distribution]])
 
*The chi-squared distribution is a special case of type 3 [[Pearson distribution]]
 
* If <math>X \sim \chi^2(\nu_1)\,</math> and <math>Y \sim \chi^2(\nu_2)\,</math> are independent then <math>\tfrac{X}{X+Y} \sim {\rm Beta}(\tfrac{\nu_1}{2}, \tfrac{\nu_2}{2})\,</math> ([[beta distribution]])
 
*If <math> X \sim {\rm U}(0,1)\, </math> ([[Uniform distribution (continuous)|uniform distribution]]) then <math> -2\log{(X)} \sim \chi^2(2)\,</math>
 
* <math>\chi^2(6)\,</math> is a transformation of the [[Laplace distribution]]
 
*If <math>X_i \sim \mathrm{Laplace}(\mu,\beta)\,</math> then <math>\sum_{i=1}^n{\frac{2 |X_i-\mu|}{\beta}} \sim \chi^2(2n)\,</math>
 
* The chi-squared distribution is a transformation of the [[Pareto distribution]]

* [[Student's t-distribution]] is a transformation of the chi-squared distribution

* [[Student's t-distribution]] can be obtained from the chi-squared distribution and the [[normal distribution]]

* [[Noncentral beta distribution]] can be obtained as a transformation of the chi-squared distribution and the [[Noncentral chi-squared distribution]]

* [[Noncentral t-distribution]] can be obtained from the normal distribution and the chi-squared distribution
 
A chi-squared variable with ''k'' degrees of freedom is defined as the sum of the squares of ''k'' independent [[standard normal distribution|standard normal]] random variables.
 
If ''Y'' is a ''k''-dimensional Gaussian random vector with mean vector ''μ'' and rank ''k'' covariance matrix ''C'', then ''X''&nbsp;=&nbsp;(''Y''−''μ'')<sup>T</sup>''C''<sup>−1</sup>(''Y''−''μ'') is chi-squared distributed with ''k'' degrees of freedom.
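
A Monte Carlo sketch of this statement (an assumed example, not from the article; SciPy required):

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
k, n = 3, 100_000
mu = np.array([1.0, -2.0, 0.5])
A = rng.standard_normal((k, k))
C = A @ A.T + k * np.eye(k)                   # an arbitrary full-rank covariance matrix

Y = rng.multivariate_normal(mu, C, size=n)
D = Y - mu
X = np.einsum('ij,jk,ik->i', D, np.linalg.inv(C), D)   # (Y - mu)^T C^{-1} (Y - mu)

print(stats.kstest(X, stats.chi2(df=k).cdf))  # consistent with chi2(k)
</syntaxhighlight>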
 
The sum of squares of [[statistically independent]] unit-variance Gaussian variables which do ''not'' have mean zero yields a generalization of the chi-squared distribution called the [[noncentral chi-squared distribution]].
 
If ''Y'' is a vector of ''k'' [[i.i.d.]] standard normal random variables and ''A'' is a ''k×k'' [[idempotent matrix]] with [[rank (linear algebra)|rank]] ''k−n'' then the [[quadratic form]] ''Y<sup>T</sup>AY'' is chi-squared distributed with ''k−n'' degrees of freedom.
 
The chi-squared distribution is also naturally related to other distributions arising from the Gaussian. In particular,
 
* ''Y'' is [[F-distribution|F-distributed]], ''Y''&nbsp;~&nbsp;''F''(''k''<sub>1</sub>,''k''<sub>2</sub>) if <math>\scriptstyle Y = \frac{X_1 / k_1}{X_2 / k_2}</math> where ''X''<sub>1</sub>&nbsp;~&nbsp;''χ''²(''k''<sub>1</sub>) and ''X''<sub>2</sub> &nbsp;~&nbsp;''χ''²(''k''<sub>2</sub>) are statistically independent.
 
* If ''X'' is chi-squared distributed, then <math>\scriptstyle\sqrt{X}</math> is [[chi distribution|chi distributed]].
 
* If {{nowrap|''X''<sub>1</sub> &nbsp;~&nbsp; ''χ''<sup>2</sup><sub>''k''<sub>1</sub></sub>}} and {{nowrap|''X''<sub>2</sub> &nbsp;~&nbsp; ''χ''<sup>2</sup><sub>''k''<sub>2</sub></sub>}} are statistically independent, then {{nowrap|''X''<sub>1</sub> + ''X''<sub>2</sub> &nbsp;~&nbsp;''χ''<sup>2</sup><sub>''k''<sub>1</sub>+''k''<sub>2</sub></sub>}}. If ''X''<sub>1</sub> and ''X''<sub>2</sub> are not independent, then {{nowrap|''X''<sub>1</sub> + ''X''<sub>2</sub>}} is not chi-squared distributed.
 
==Generalizations==
The chi-squared distribution is obtained as the sum of the squares of ''k'' independent, zero-mean, unit-variance Gaussian random variables. Generalizations of this distribution can be obtained by summing the squares of other types of Gaussian random variables. Several such distributions are described below.
===Chi-squared distributions===
====Noncentral chi-squared distribution====
{{Main|Noncentral chi-squared distribution}}
The noncentral chi-squared distribution is obtained from the sum of the squares of independent Gaussian random variables having unit variance and ''nonzero'' means.
 
====Generalized chi-squared distribution====
{{Main|Generalized chi-squared distribution}}
The generalized chi-squared distribution is obtained from the quadratic form ''z′Az'' where ''z'' is a zero-mean Gaussian vector having an arbitrary covariance matrix, and ''A'' is an arbitrary matrix.
 
===Gamma, exponential, and related distributions===
The chi-squared distribution ''X''&nbsp;~&nbsp;''χ''²(''k'') is a special case of the [[gamma distribution]], in that ''X''&nbsp;~&nbsp;Γ(''k''/2,&nbsp;1/2) (using the shape–rate parameterization of the gamma distribution) where ''k'' is an integer.
 
Because the [[exponential distribution]] is also a special case of the gamma distribution, it follows that if ''X''&nbsp;~&nbsp;''χ''²(2), then ''X''&nbsp;~&nbsp;Exp(1/2), an exponential distribution with rate 1/2 (mean 2).
 
The [[Erlang distribution]] is also a special case of the gamma distribution, and thus we also have that if ''X''&nbsp;~&nbsp;''χ''²(''k'') with even ''k'', then ''X'' is Erlang distributed with shape parameter ''k''/2 and rate parameter 1/2 (equivalently, scale parameter 2).
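
These relations can be illustrated numerically (an assumed sketch; note that SciPy uses a shape–scale parameterization, so rate 1/2 corresponds to scale 2):

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

x = np.linspace(0.01, 30, 7)

# chi2(k) equals Gamma(shape=k/2, scale=2).
for k in (2, 4, 9):
    assert np.allclose(stats.chi2(df=k).pdf(x), stats.gamma(a=k / 2, scale=2).pdf(x))

# chi2(2) equals the exponential distribution with rate 1/2 (mean 2).
assert np.allclose(stats.chi2(df=2).pdf(x), stats.expon(scale=2).pdf(x))
</syntaxhighlight>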
 
==Applications==
The chi-squared distribution has numerous applications in inferential [[statistics]], for instance in [[chi-squared test]]s and in estimating [[variance]]s. It enters the problem of estimating the mean of a normally distributed population and the problem of estimating the slope of a [[linear regression|regression]] line via its role in [[Student’s t-distribution]]. It enters all [[analysis of variance]] problems via its role in the [[F-distribution]], which is the distribution of the ratio of two independent chi-squared [[random variable]]s, each divided by their respective degrees of freedom.
 
Following are some of the most common situations in which the chi-squared distribution arises from a Gaussian-distributed sample.
 
*If ''X''<sub>1</sub>, ..., ''X<sub>n</sub>'' are [[independent identically-distributed random variables|i.i.d.]] ''N''(''μ'', ''σ''<sup>2</sup>) [[random variable]]s, then <math>\sum_{i=1}^n(X_i - \bar X)^2 \sim \sigma^2 \chi^2_{n-1}</math> where <math>\bar X = \frac{1}{n} \sum_{i=1}^n X_i</math>. A numerical check of this fact is sketched after the table below.
 
*The box below shows probability distributions whose names start with '''chi''', for some [[statistic]]s based on {{nowrap|''X<sub>i</sub>'' ∼ Normal(''μ<sub>i</sub>'', ''σ''<sup>2</sup><sub>''i''</sub>), ''i'' {{=}} 1, ⋯, ''k'', }} independent random variables:
<center>
{| class="wikitable" align="center"
|-
! Name !! Statistic
|-
| chi-squared distribution || <math>\sum_{i=1}^k \left(\frac{X_i-\mu_i}{\sigma_i}\right)^2</math>
|-
| [[noncentral chi-squared distribution]] || <math>\sum_{i=1}^k \left(\frac{X_i}{\sigma_i}\right)^2</math>
|-
| [[chi distribution]] || <math>\sqrt{\sum_{i=1}^k \left(\frac{X_i-\mu_i}{\sigma_i}\right)^2}</math>
|-
| [[noncentral chi distribution]] || <math>\sqrt{\sum_{i=1}^k \left(\frac{X_i}{\sigma_i}\right)^2}</math>
|}
</center>
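
The first point above, the chi-squared distribution of the scaled sum of squared deviations from the sample mean, can be checked with a short Monte Carlo sketch (assumed, not from the article):

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
mu, sigma, n, reps = 2.0, 3.0, 8, 100_000

X = rng.normal(mu, sigma, size=(reps, n))
S = ((X - X.mean(axis=1, keepdims=True)) ** 2).sum(axis=1) / sigma ** 2

print(stats.kstest(S, stats.chi2(df=n - 1).cdf))   # consistent with chi2(n - 1)
</syntaxhighlight>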
 
==Table of ''χ''<sup>2</sup> value vs p-value==
The [[p-value]] is the probability of observing a test statistic ''at least'' as extreme in a chi-squared distribution.  Accordingly, since the [[cumulative distribution function]] (CDF) for the appropriate degrees of freedom ''(df)'' gives the probability of having obtained a value ''less extreme'' than this point, subtracting the CDF value from 1 gives the p-value.  The table below gives a number of p-values corresponding to ''χ''<sup>2</sup> values for the first 10 degrees of freedom.
 
A p-value of 0.05 or less is usually regarded as [[Statistical significance|statistically significant]], i.e. the observed deviation from the null hypothesis is significant.
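
As an assumed sketch, the table entries can be reproduced with the survival function (1&nbsp;−&nbsp;CDF) or the quantile function of the chi-squared distribution:

<syntaxhighlight lang="python">
from scipy import stats

print(stats.chi2(df=1).sf(3.84))        # ~0.05, matching the df = 1 row
print(stats.chi2(df=5).sf(11.07))       # ~0.05
print(stats.chi2(df=10).ppf(1 - 0.05))  # ~18.31, the tabulated chi-squared value
</syntaxhighlight>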
 
{| class="wikitable"
|-
! Degrees of freedom (df)
!colspan=11| ''χ''<sup>2</sup> value <ref>[http://www2.lv.psu.edu/jxm57/irp/chisquar.html Chi-Squared Test] Table B.2. Dr. Jacqueline S. McLaughlin at The Pennsylvania State University. In turn citing: R.A. Fisher and F. Yates, Statistical Tables for Biological Agricultural and Medical Research, 6th ed., Table IV</ref>
|-
| <div align="center"> 1
| 0.004
| 0.02
| 0.06
| 0.15
| 0.46
| 1.07
| 1.64
| 2.71
| 3.84
| 6.64
| 10.83
|-
| <div align="center"> 2
| 0.10
| 0.21
| 0.45
| 0.71
| 1.39
| 2.41
| 3.22
| 4.60
| 5.99
| 9.21
| 13.82
|-
| <div align="center"> 3
| 0.35
| 0.58
| 1.01
| 1.42
| 2.37
| 3.66
| 4.64
| 6.25
| 7.82
| 11.34
| 16.27
|-
| <div align="center"> 4
| 0.71
| 1.06
| 1.65
| 2.20
| 3.36
| 4.88
| 5.99
| 7.78
| 9.49
| 13.28
| 18.47
|-
| <div align="center"> 5
| 1.14
| 1.61
| 2.34
| 3.00
| 4.35
| 6.06
| 7.29
| 9.24
| 11.07
| 15.09
| 20.52
|-
| <div align="center"> 6
| 1.63
| 2.20
| 3.07
| 3.83
| 5.35
| 7.23
| 8.56
| 10.64
| 12.59
| 16.81
| 22.46
|-
| <div align="center"> 7
| 2.17
| 2.83
| 3.82
| 4.67
| 6.35
| 8.38
| 9.80
| 12.02
| 14.07
| 18.48
| 24.32
|-
| <div align="center"> 8
| 2.73
| 3.49
| 4.59
| 5.53
| 7.34
| 9.52
| 11.03
| 13.36
| 15.51
| 20.09
| 26.12
|-
| <div align="center"> 9
| 3.32
| 4.17
| 5.38
| 6.39
| 8.34
| 10.66
| 12.24
| 14.68
| 16.92
| 21.67
| 27.88
|-
| <div align="center"> 10
| 3.94
| 4.86
| 6.18
| 7.27
| 9.34
| 11.78
| 13.44
| 15.99
| 18.31
| 23.21
| 29.59
|-
! <div align="right"> P value (Probability)
| style="background: #ffa2aa" | 0.95
| style="background: #efaaaa" | 0.90
| style="background: #e8b2aa" | 0.80
| style="background: #dfbaaa" | 0.70
| style="background: #d8c2aa" | 0.50
| style="background: #cfcaaa" | 0.30
| style="background: #c8d2aa" | 0.20
| style="background: #bfdaaa" | 0.10
| style="background: #b8e2aa" | 0.05
| style="background: #afeaaa" | 0.01
| style="background: #a8faaa" | 0.001
|-
|
!colspan=8 style="background: #e8b2aa" | Nonsignificant
!colspan=3 style="background: #afeaaa" | Significant
|}
 
==History==
 
This distribution was first described by the German statistician [[Friedrich Robert Helmert]].
 
==See also==
{{Portal|Statistics}}
{{Colbegin}}
*[[Cochran's theorem]]
*[[Fisher's method]] for combining [[Statistical independence|independent]] tests of significance
* [[Pearson's chi-squared test]]
* [[F-distribution]]
* [[Generalized chi-squared distribution]]
* [[Gamma distribution]]
* [[Hotelling's T-squared distribution]]
* [[Student's t-distribution]]
* [[Wilks' lambda distribution]]
* [[Wishart distribution]]
{{Colend}}
 
==References==
{{Reflist}}
 
==External links==
*[http://jeff560.tripod.com/c.html Earliest Uses of Some of the Words of Mathematics: entry on Chi squared has a brief history]
*[http://www.stat.yale.edu/Courses/1997-98/101/chigf.htm Course notes on Chi-Squared Goodness of Fit Testing] from Yale University Stats 101 class.
*[http://demonstrations.wolfram.com/StatisticsAssociatedWithNormalSamples/ ''Mathematica'' demonstration showing the chi-squared sampling distribution of various statistics, e.g. Σ''x''², for a normal population]
*[http://www.jstor.org/stable/2348373 Simple algorithm for approximating cdf and inverse cdf for the chi-squared distribution with a pocket calculator]
 
{{ProbDistributions|continuous-semi-infinite}}
{{Common univariate probability distributions}}
 
{{DEFAULTSORT:Chi-Squared Distribution}}
[[Category:Continuous distributions]]
[[Category:Normal distribution]]
 
[[Category:Exponential family distributions]]
 
[[ar:توزيع خي تربيع]]
[[ca:Distribució khi quadrat]]
[[cs:Χ² rozdělení]]
[[de:Chi-Quadrat-Verteilung]]
[[es:Distribución χ²]]
[[eu:Khi-karratu banakuntza]]
[[fa:توزیع کی‌دو]]
[[fr:Loi du χ²]]
[[ko:카이제곱 분포]]
[[id:Distribusi khi-kuadrat]]
[[is:Kí-kvaðratsdreifing]]
[[it:Distribuzione chi quadrato]]
[[he:התפלגות כי בריבוע]]
[[hu:Khi-négyzet eloszlás]]
[[nl:Chi-kwadraatverdeling]]
[[ja:カイ二乗分布]]
[[no:Kjikvadratfordeling]]
[[pl:Rozkład chi kwadrat]]
[[pt:Chi-quadrado]]
[[ru:Распределение хи-квадрат]]
[[simple:Chi-square distribution]]
[[sk:Χ²-rozdelenie]]
[[sl:Porazdelitev hi-kvadrat]]
[[su:Sebaran chi-kuadrat]]
[[fi:Khii toiseen -jakauma]]
[[sv:Chitvåfördelning]]
[[tr:Ki-kare dağılımı]]
[[uk:Розподіл хі-квадрат]]
[[zh:卡方分佈]]
[[zh-yue:Chi-square]]
