Noncentral chi-squared distribution


In probability theory and statistics, the noncentral chi-squared distribution (or noncentral chi-square distribution, noncentral χ² distribution) is a noncentral generalization of the chi-squared distribution. It often arises in the power analysis of statistical tests in which the null distribution is (perhaps asymptotically) a chi-squared distribution; important examples of such tests are the likelihood-ratio tests.[1]

Definitions

Background

Let $(X_1, X_2, \ldots, X_k)$ be $k$ independent, normally distributed random variables with means $\mu_i$ and unit variances. Then the random variable

$$\sum_{i=1}^{k} X_i^2$$

is distributed according to the noncentral chi-squared distribution. It has two parameters: $k$, which specifies the number of degrees of freedom (i.e. the number of $X_i$), and $\lambda$, which is related to the means of the random variables $X_i$ by:

$$\lambda=\sum_{i=1}^{k}\mu_i^2.$$

λ is sometimes called the noncentrality parameter. Note that some references define λ in other ways, such as half of the above sum, or its square root.

This distribution arises in multivariate statistics as a derivative of the multivariate normal distribution. While the central chi-squared distribution is the squared norm of a random vector with $N(0_k, I_k)$ distribution (i.e., the squared distance from the origin to a point taken at random from that distribution), the noncentral $\chi^2$ is the squared norm of a random vector with $N(\mu, I_k)$ distribution. Here $0_k$ is a zero vector of length $k$, $\mu=(\mu_1,\ldots,\mu_k)$, and $I_k$ is the identity matrix of size $k$.
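As a quick numerical illustration of this construction, one can square and sum shifted Gaussian draws and compare the result with SciPy's noncentral chi-squared distribution `ncx2`. The means, sample size, and seed below are arbitrary illustrative choices:

```python
# Monte Carlo sketch: the squared norm of a shifted Gaussian vector should
# match scipy's noncentral chi-squared distribution (ncx2).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
k = 4
mu = np.array([1.0, 2.0, 0.5, 0.0])   # illustrative means
lam = float(np.sum(mu**2))            # noncentrality parameter

samples = rng.normal(loc=mu, scale=1.0, size=(200_000, k)) ** 2
X = samples.sum(axis=1)               # squared norm of an N(mu, I_k) vector

# Compare empirical mean/variance with the known moments k + lam, 2(k + 2 lam)
emp_mean, emp_var = X.mean(), X.var()
th_mean, th_var = k + lam, 2 * (k + 2 * lam)

# A Kolmogorov-Smirnov test against scipy's ncx2 should not reject
ks = stats.kstest(X, stats.ncx2(df=k, nc=lam).cdf)
```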

Density

The probability density function (pdf) is given by

$$f_X(x;k,\lambda)=\sum_{i=0}^{\infty}\frac{e^{-\lambda/2}(\lambda/2)^i}{i!}f_{Y_{k+2i}}(x),$$

where $Y_q$ is distributed as chi-squared with $q$ degrees of freedom.

From this representation, the noncentral chi-squared distribution is seen to be a Poisson-weighted mixture of central chi-squared distributions. Suppose that a random variable J has a Poisson distribution with mean λ/2, and the conditional distribution of Z given J = i is chi-squared with k + 2i degrees of freedom. Then the unconditional distribution of Z is non-central chi-squared with k degrees of freedom, and non-centrality parameter λ.
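The mixture representation suggests a simple two-stage sampler: draw J from a Poisson, then a central chi-squared with the random degrees of freedom. A sketch (with illustrative k, λ) checking it against `scipy.stats.ncx2`:

```python
# Two-stage sampler from the Poisson-mixture representation: J ~ Poisson(lam/2),
# then a central chi-squared with k + 2J degrees of freedom. The result should
# be distributed as ncx2(k, lam). Parameter values are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
k, lam, n = 3, 6.0, 200_000

J = rng.poisson(lam / 2, size=n)
Z = rng.chisquare(df=k + 2 * J)      # chi-squared with random df k + 2J

ks = stats.kstest(Z, stats.ncx2(df=k, nc=lam).cdf)
```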

Alternatively, the pdf can be written as

$$f_X(x;k,\lambda)=\frac{1}{2}e^{-(x+\lambda)/2}\left(\frac{x}{\lambda}\right)^{k/4-1/2}I_{k/2-1}\left(\sqrt{\lambda x}\right)$$

where $I_\nu(y)$ is a modified Bessel function of the first kind given by

$$I_\nu(y)=\left(\frac{y}{2}\right)^{\nu}\sum_{j=0}^{\infty}\frac{(y^2/4)^j}{j!\,\Gamma(\nu+j+1)}.$$

Using the relation between Bessel functions and hypergeometric functions, the pdf can also be written as:[2]

$$f_X(x;k,\lambda)=e^{-\lambda/2}\,{}_0F_1\!\left(;k/2;\lambda x/4\right)\frac{1}{2^{k/2}\Gamma(k/2)}e^{-x/2}x^{k/2-1}.$$

The case k = 0 (zero degrees of freedom), in which the distribution has a discrete component at zero, is discussed by Torgersen (1972) and further by Siegel (1979).[3][4]
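The Poisson-weighted series above can be truncated and evaluated directly. A sketch (the truncation length and parameters are arbitrary choices) comparing it with `scipy.stats.ncx2.pdf`:

```python
# Numerical check of the Poisson-weighted series for the pdf (truncated),
# against scipy's ncx2.pdf.
import numpy as np
from scipy import stats

def ncx2_pdf_series(x, k, lam, terms=200):
    """Truncated Poisson mixture of central chi-squared densities."""
    i = np.arange(terms)
    weights = stats.poisson.pmf(i, lam / 2)    # e^{-lam/2} (lam/2)^i / i!
    dens = stats.chi2.pdf(np.asarray(x)[..., None], df=k + 2 * i)
    return (weights * dens).sum(axis=-1)

xs = np.linspace(0.1, 30, 50)
series_vals = ncx2_pdf_series(xs, k=4, lam=5.0)
scipy_vals = stats.ncx2.pdf(xs, df=4, nc=5.0)
max_err = float(np.max(np.abs(series_vals - scipy_vals)))
```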

Derivation of the pdf

The derivation of the probability density function is most easily done by performing the following steps:

  1. Since $X_1,\ldots,X_k$ have unit variances, their joint distribution is spherically symmetric, up to a location shift.
  2. The spherical symmetry then implies that the distribution of $X=X_1^2+\cdots+X_k^2$ depends on the means only through the squared length, $\lambda=\mu_1^2+\cdots+\mu_k^2$. Without loss of generality, we can therefore take $\mu_1=\sqrt{\lambda}$ and $\mu_2=\cdots=\mu_k=0$.
  3. Now derive the density of $X=X_1^2$ (i.e. the k = 1 case). A simple transformation of random variables shows that
$$f_X(x;1,\lambda)=\frac{1}{2\sqrt{x}}\left(\phi(\sqrt{x}-\sqrt{\lambda})+\phi(\sqrt{x}+\sqrt{\lambda})\right)=\frac{1}{\sqrt{2\pi x}}e^{-(x+\lambda)/2}\cosh(\sqrt{\lambda x}),$$
where $\phi(\cdot)$ is the standard normal density.
  4. Expand the cosh term in a Taylor series. This gives the Poisson-weighted mixture representation of the density, still for k = 1. The indices on the chi-squared random variables in the series above are 1 + 2i in this case.
  5. Finally, the general case. Since we have assumed, without loss of generality, that $X_2,\ldots,X_k$ are standard normal, $X_2^2+\cdots+X_k^2$ has a central chi-squared distribution with (k − 1) degrees of freedom, independent of $X_1^2$. Using the Poisson-weighted mixture representation for $X_1^2$, together with the fact that a sum of independent chi-squared random variables is again chi-squared, completes the result. The indices in the series are (1 + 2i) + (k − 1) = k + 2i, as required.
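The k = 1 density from step 3 can be checked numerically against SciPy (λ and the evaluation grid below are illustrative):

```python
# Check of the k = 1 density: f(x;1,lam) equals
# (phi(sqrt x - sqrt lam) + phi(sqrt x + sqrt lam)) / (2 sqrt x),
# compared with scipy's ncx2.pdf.
import numpy as np
from scipy import stats

def ncx2_pdf_k1(x, lam):
    s = np.sqrt(x)
    r = np.sqrt(lam)
    return (stats.norm.pdf(s - r) + stats.norm.pdf(s + r)) / (2 * s)

xs = np.linspace(0.05, 20, 40)
err = float(np.max(np.abs(ncx2_pdf_k1(xs, 3.0) - stats.ncx2.pdf(xs, df=1, nc=3.0))))
```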

Properties

Moment generating function

The moment-generating function is given by

$$M(t;k,\lambda)=\frac{\exp\left(\frac{\lambda t}{1-2t}\right)}{(1-2t)^{k/2}}.$$

Moments

The first few raw moments are:

$$\mu'_1=k+\lambda$$
$$\mu'_2=(k+\lambda)^2+2(k+2\lambda)$$
$$\mu'_3=(k+\lambda)^3+6(k+\lambda)(k+2\lambda)+8(k+3\lambda)$$
$$\mu'_4=(k+\lambda)^4+12(k+\lambda)^2(k+2\lambda)+4(11k^2+44k\lambda+36\lambda^2)+48(k+4\lambda).$$

The first few central moments are:

$$\mu_2=2(k+2\lambda)$$
$$\mu_3=8(k+3\lambda)$$
$$\mu_4=12(k+2\lambda)^2+48(k+4\lambda)$$

The nth cumulant is

$$\kappa_n=2^{n-1}(n-1)!\,(k+n\lambda).$$

Hence

$$\mu'_n=2^{n-1}(n-1)!\,(k+n\lambda)+\sum_{j=1}^{n-1}\frac{(n-1)!\,2^{j-1}}{(n-j)!}(k+j\lambda)\,\mu'_{n-j}.$$
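The recurrence can be sketched in a few lines and compared with SciPy's generic moment computation (the values of k and λ below are illustrative):

```python
# Raw moments of ncx2(k, lam) via the cumulant-based recurrence
# mu'_n = kappa_n + sum_{j=1}^{n-1} [(n-1)! 2^{j-1}/(n-j)!] (k+j lam) mu'_{n-j},
# with mu'_0 = 1, checked against scipy's ncx2.moment.
import math
from scipy import stats

def raw_moments(k, lam, nmax):
    mu = [1.0]                                    # mu'_0
    for n in range(1, nmax + 1):
        total = 2 ** (n - 1) * math.factorial(n - 1) * (k + n * lam)  # kappa_n
        for j in range(1, n):
            coef = math.factorial(n - 1) * 2 ** (j - 1) / math.factorial(n - j)
            total += coef * (k + j * lam) * mu[n - j]
        mu.append(total)
    return mu

k, lam = 3, 2.0
mu = raw_moments(k, lam, 4)
scipy_mu = [stats.ncx2.moment(n, df=k, nc=lam) for n in range(1, 5)]
```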

Cumulative distribution function

Again using the relation between the central and noncentral chi-squared distributions, the cumulative distribution function (cdf) can be written as

$$P(x;k,\lambda)=e^{-\lambda/2}\sum_{j=0}^{\infty}\frac{(\lambda/2)^j}{j!}\,Q(x;k+2j)$$

where Q(x;k) is the cumulative distribution function of the central chi-squared distribution with k degrees of freedom which is given by

$$Q(x;k)=\frac{\gamma(k/2,\,x/2)}{\Gamma(k/2)},$$
where $\gamma(k,z)$ is the lower incomplete gamma function.
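As with the pdf, the truncated series for the cdf is easy to check against `scipy.stats.ncx2.cdf` (truncation length and parameters are illustrative):

```python
# Truncated Poisson-weighted mixture of central chi-squared cdfs, compared
# against scipy's ncx2.cdf.
import numpy as np
from scipy import stats

def ncx2_cdf_series(x, k, lam, terms=200):
    j = np.arange(terms)
    w = stats.poisson.pmf(j, lam / 2)
    return (w * stats.chi2.cdf(np.asarray(x)[..., None], df=k + 2 * j)).sum(axis=-1)

xs = np.linspace(0.5, 40, 40)
err = float(np.max(np.abs(ncx2_cdf_series(xs, 5, 8.0) - stats.ncx2.cdf(xs, df=5, nc=8.0))))
```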

The Marcum Q-function $Q_M(a,b)$ can also be used to represent the cdf.[5]

$$P(x;k,\lambda)=1-Q_{k/2}\left(\sqrt{\lambda},\sqrt{x}\right)$$

When the number of degrees of freedom $k$ is a positive odd integer, we have a closed-form expression for the complementary cumulative distribution function, given by[6]

$$P(x;2n+1,\lambda)=1-Q_{n+1/2}\left(\sqrt{\lambda},\sqrt{x}\right)=1-\left[Q\left(\sqrt{x}-\sqrt{\lambda}\right)+Q\left(\sqrt{x}+\sqrt{\lambda}\right)+e^{-(x+\lambda)/2}\sum_{m=1}^{n}\left(\frac{x}{\lambda}\right)^{m/2-1/4}I_{m-1/2}\left(\sqrt{\lambda x}\right)\right],$$

where $n$ is a non-negative integer, $Q$ is the Gaussian Q-function, and $I$ is the modified Bessel function of the first kind with half-integer order. The modified Bessel function of the first kind with half-integer order can itself be represented as a finite sum in terms of hyperbolic functions.

In particular, for k = 1, we have

$$P(x;1,\lambda)=1-\left[Q\left(\sqrt{x}-\sqrt{\lambda}\right)+Q\left(\sqrt{x}+\sqrt{\lambda}\right)\right].$$

Also, for k = 3, we have

$$P(x;3,\lambda)=1-\left[Q\left(\sqrt{x}-\sqrt{\lambda}\right)+Q\left(\sqrt{x}+\sqrt{\lambda}\right)+\sqrt{\frac{2}{\pi}}\,\frac{\sinh\left(\sqrt{\lambda x}\right)}{\sqrt{\lambda}}\,e^{-(x+\lambda)/2}\right].$$
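Both closed forms can be verified numerically by writing the Gaussian Q-function as the standard normal survival function (λ and the grid below are illustrative):

```python
# The k = 1 and k = 3 closed-form cdfs, checked against scipy's ncx2.cdf.
import numpy as np
from scipy import stats

Q = stats.norm.sf                     # Gaussian Q-function

def cdf_k1(x, lam):
    s, r = np.sqrt(x), np.sqrt(lam)
    return 1 - (Q(s - r) + Q(s + r))

def cdf_k3(x, lam):
    s, r = np.sqrt(x), np.sqrt(lam)
    extra = np.sqrt(2 / np.pi) * np.sinh(s * r) / r * np.exp(-(x + lam) / 2)
    return 1 - (Q(s - r) + Q(s + r) + extra)

xs = np.linspace(0.1, 25, 30)
err1 = float(np.max(np.abs(cdf_k1(xs, 4.0) - stats.ncx2.cdf(xs, df=1, nc=4.0))))
err3 = float(np.max(np.abs(cdf_k3(xs, 4.0) - stats.ncx2.cdf(xs, df=3, nc=4.0))))
```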

Approximation (including for quantiles)

Abdel-Aty derives (as "first approx.") a non-central Wilson–Hilferty transformation:[7]

$\left(\frac{\chi'^2}{k+\lambda}\right)^{1/3}$ is approximately normally distributed, $\mathcal{N}\left(1-\frac{2}{9f},\,\frac{2}{9f}\right)$, i.e.,

$$P(x;k,\lambda)\approx\Phi\left\{\frac{\left(\frac{x}{k+\lambda}\right)^{1/3}-\left(1-\frac{2}{9f}\right)}{\sqrt{2/(9f)}}\right\},\quad\text{where }f:=\frac{(k+\lambda)^2}{k+2\lambda}=k+\frac{\lambda^2}{k+2\lambda},$$

which is quite accurate and adapts well to the noncentrality. Also, $f=f(k,\lambda)$ reduces to $f=k$ for $\lambda=0$, the (central) chi-squared case.
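A sketch of this approximation (with illustrative k and λ), compared against the exact cdf from SciPy:

```python
# Abdel-Aty / Wilson-Hilferty style cube-root normal approximation to the
# noncentral chi-squared cdf, compared with the exact cdf.
import numpy as np
from scipy import stats

def cdf_wilson_hilferty(x, k, lam):
    f = (k + lam) ** 2 / (k + 2 * lam)
    mean = 1 - 2 / (9 * f)
    sd = np.sqrt(2 / (9 * f))
    return stats.norm.cdf(((x / (k + lam)) ** (1 / 3) - mean) / sd)

xs = np.linspace(1, 40, 50)
approx = cdf_wilson_hilferty(xs, k=4, lam=6.0)
exact = stats.ncx2.cdf(xs, df=4, nc=6.0)
max_abs_err = float(np.max(np.abs(approx - exact)))
```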

Sankaran discusses a number of closed-form approximations for the cumulative distribution function.[8] In an earlier paper, he derived and stated the following approximation:[9]

$$P(x;k,\lambda)\approx\Phi\left\{\frac{\left(\frac{x}{k+\lambda}\right)^h-\left(1+hp\left(h-1-0.5(2-h)mp\right)\right)}{h\sqrt{2p}\left(1+0.5mp\right)}\right\}$$

where

$\Phi\{\cdot\}$ denotes the cumulative distribution function of the standard normal distribution;
$h=1-\frac{2}{3}\,\frac{(k+\lambda)(k+3\lambda)}{(k+2\lambda)^2};$
$p=\frac{k+2\lambda}{(k+\lambda)^2};$
$m=(h-1)(1-3h).$

This and other approximations are discussed in a later text book.[10]
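Sankaran's approximation with the h, p, m above can be sketched and compared with the exact cdf (parameter values are illustrative):

```python
# Sankaran's closed-form normal approximation to the noncentral chi-squared
# cdf, using the h, p, m defined in the text, versus scipy's exact cdf.
import numpy as np
from scipy import stats

def cdf_sankaran(x, k, lam):
    h = 1 - (2 / 3) * (k + lam) * (k + 3 * lam) / (k + 2 * lam) ** 2
    p = (k + 2 * lam) / (k + lam) ** 2
    m = (h - 1) * (1 - 3 * h)
    num = (x / (k + lam)) ** h - (1 + h * p * (h - 1 - 0.5 * (2 - h) * m * p))
    den = h * np.sqrt(2 * p) * (1 + 0.5 * m * p)
    return stats.norm.cdf(num / den)

xs = np.linspace(1, 40, 50)
err = float(np.max(np.abs(cdf_sankaran(xs, 4, 6.0) - stats.ncx2.cdf(xs, df=4, nc=6.0))))
```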

More recently, since the CDF of the noncentral chi-squared distribution with an odd number of degrees of freedom can be computed exactly, the CDF for an even number of degrees of freedom can be approximated by exploiting the monotonicity and log-concavity of the Marcum Q-function as

$$P(x;2n,\lambda)\approx\frac{1}{2}\left[P(x;2n-1,\lambda)+P(x;2n+1,\lambda)\right].$$

Another approximation that also serves as an upper bound is given by

$$P(x;2n,\lambda)\leq 1-\left[\left(1-P(x;2n-1,\lambda)\right)\left(1-P(x;2n+1,\lambda)\right)\right]^{1/2}.$$

For a given probability, these formulas are easily inverted to provide the corresponding approximation for x, to compute approximate quantiles.
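A numerical look at the two even-degree-of-freedom formulas (n, λ, and the grid below are illustrative), checking both the accuracy of the average and the upper-bound property:

```python
# Even-df approximations from the adjacent odd-df cdfs: the simple average, and
# the geometric mean of the survival functions, which should bound P from above.
import numpy as np
from scipy import stats

n, lam = 3, 5.0                       # even df = 2n = 6
xs = np.linspace(0.5, 40, 60)
exact = stats.ncx2.cdf(xs, df=2 * n, nc=lam)
lo_df = stats.ncx2.cdf(xs, df=2 * n - 1, nc=lam)
hi_df = stats.ncx2.cdf(xs, df=2 * n + 1, nc=lam)

avg = 0.5 * (lo_df + hi_df)
upper = 1 - np.sqrt((1 - lo_df) * (1 - hi_df))

avg_err = float(np.max(np.abs(avg - exact)))
bound_ok = bool(np.all(upper >= exact - 1e-9))   # small numerical slack
```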

Related distributions

  • If V is chi-squared distributed, $V\sim\chi_k^2$, then V is also noncentral chi-squared distributed: $V\sim\chi_k^2(0)$.
  • A linear combination of independent noncentral chi-squared variables $\xi=\sum_i \lambda_i Y_i+c$, with $Y_i\sim\chi'^2(m_i,\delta_i^2)$, is generalized chi-square distributed.
  • If $V_1\sim\chi_{k_1}^2(\lambda)$ and $V_2\sim\chi_{k_2}^2(0)$, and $V_1$ is independent of $V_2$, then $\frac{V_1/k_1}{V_2/k_2}\sim F'_{k_1,k_2}(\lambda)$ is a noncentral F-distributed variable.
  • If $J\sim\mathrm{Poisson}\left(\tfrac{1}{2}\lambda\right)$, then $\chi_{k+2J}^2\sim\chi_k^2(\lambda)$.
  • If $V\sim\chi_2^2(\lambda)$, then $\sqrt{V}$ follows the Rice distribution with parameter $\sqrt{\lambda}$.
  • Normal approximation:[11] if $V\sim\chi_k^2(\lambda)$, then $\frac{V-(k+\lambda)}{\sqrt{2(k+2\lambda)}}\to N(0,1)$ in distribution as either $k\to\infty$ or $\lambda\to\infty$.
  • If $V_1\sim\chi_{k_1}^2(\lambda_1)$ and $V_2\sim\chi_{k_2}^2(\lambda_2)$, where $V_1$ and $V_2$ are independent, then $W=V_1+V_2\sim\chi_{k_1+k_2}^2(\lambda_1+\lambda_2)$.
  • In general, for a finite set of independent $V_i\sim\chi_{k_i}^2(\lambda_i)$, $i\in\{1,\ldots,N\}$, the sum $Y=\sum_{i=1}^N V_i$ is distributed as $Y\sim\chi_{k_y}^2(\lambda_y)$, where $k_y=\sum_{i=1}^N k_i$ and $\lambda_y=\sum_{i=1}^N\lambda_i$. This can be seen using moment generating functions: $M_Y(t)=\prod_{i=1}^N M_{V_i}(t)$ by the independence of the $V_i$; substituting the noncentral chi-squared MGF into the product yields the MGF of $\chi_{k_y}^2(\lambda_y)$. Alternatively, it follows from the interpretation in the background section above as a sum of squares of independent normally distributed random variables with unit variances and the specified means.
  • The complex noncentral chi-squared distribution has applications in radio communication and radar systems. Let $z_1,\ldots,z_k$ be independent scalar complex random variables with noncentral circular symmetry, means $\mu_i$ and unit variances: $\operatorname{E}|z_i-\mu_i|^2=1$. Then the real random variable $S=\sum_{i=1}^k|z_i|^2$ is distributed according to the complex noncentral chi-squared distribution, which is effectively a scaled (by 1/2) noncentral $\chi^2$ with twice the degrees of freedom and twice the noncentrality parameter:
$$f_S(S)=\left(\frac{S}{\lambda}\right)^{(k-1)/2}e^{-(S+\lambda)}I_{k-1}\left(2\sqrt{S\lambda}\right),$$
where $\lambda=\sum_{i=1}^k|\mu_i|^2$.
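The additivity property above can be illustrated by Monte Carlo; the degrees of freedom, noncentralities, sample size, and seed are arbitrary choices:

```python
# Summing independent noncentral chi-squared variables adds both the degrees
# of freedom and the noncentrality parameters.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
k1, l1, k2, l2, n = 3, 2.0, 5, 4.0, 200_000

V1 = stats.ncx2.rvs(df=k1, nc=l1, size=n, random_state=rng)
V2 = stats.ncx2.rvs(df=k2, nc=l2, size=n, random_state=rng)

# The sum should follow ncx2(k1 + k2, l1 + l2), with mean k1 + k2 + l1 + l2
ks = stats.kstest(V1 + V2, stats.ncx2(df=k1 + k2, nc=l1 + l2).cdf)
mean_err = abs((V1 + V2).mean() - (k1 + k2 + l1 + l2))
```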

Transformations

Sankaran (1963) discusses transformations of the form $z=\left[(X-b)/(k+\lambda)\right]^{1/2}$. He analyzes the expansions of the cumulants of $z$ up to terms of order $O\left((k+\lambda)^{-4}\right)$ and shows that the following choices of $b$ produce reasonable results:

  • $b=(k-1)/2$ makes the second cumulant of $z$ approximately independent of $\lambda$
  • $b=(k-1)/3$ makes the third cumulant of $z$ approximately independent of $\lambda$
  • $b=(k-1)/4$ makes the fourth cumulant of $z$ approximately independent of $\lambda$

Also, the simpler transformation $z_1=\left(X-(k-1)/2\right)^{1/2}$ can be used as a variance-stabilizing transformation, producing a random variable with mean $\left(\lambda+(k-1)/2\right)^{1/2}$ and variance $O\left((k+\lambda)^{-2}\right)$.

Usability of these transformations may be hampered by the need to take the square roots of negative numbers.

Various chi and chi-squared distributions
Name                                  Statistic
chi-squared distribution              $\sum_{i=1}^k\left(\frac{X_i-\mu_i}{\sigma_i}\right)^2$
noncentral chi-squared distribution   $\sum_{i=1}^k\left(\frac{X_i}{\sigma_i}\right)^2$
chi distribution                      $\sqrt{\sum_{i=1}^k\left(\frac{X_i-\mu_i}{\sigma_i}\right)^2}$
noncentral chi distribution           $\sqrt{\sum_{i=1}^k\left(\frac{X_i}{\sigma_i}\right)^2}$

Occurrence and applications

Use in tolerance intervals

Two-sided normal regression tolerance intervals can be obtained based on the noncentral chi-squared distribution.[12] This enables the calculation of a statistical interval within which, with some confidence level, a specified proportion of a sampled population falls.

Notes

  1. Template:Cite journal
  2. Muirhead (2005) Theorem 1.3.4
  3. Torgersen, E. N. (1972), "Supplementary notes on linear models", Preprint series: Statistical Memoirs, Dept. of Mathematics, University of Oslo, http://urn.nb.no/URN:NBN:no-58681
  4. Siegel, A. F. (1979), "The noncentral chi-squared distribution with zero degrees of freedom and testing for uniformity", Biometrika, 66, 381–386
  5. Nuttall, Albert H. (1975): Some Integrals Involving the $Q_M$ Function, IEEE Transactions on Information Theory, 21(1), 95–96, Template:ISSN
  6. A. Annamalai, C. Tellambura and John Matyjas (2009). "A New Twist on the Generalized Marcum Q-Function $Q_M(a,b)$ with Fractional-Order M and its Applications". 2009 6th IEEE Consumer Communications and Networking Conference, 1–5, Template:ISBN
  7. Template:Cite journal
  8. Template:Cite journal
  9. Template:Cite journal
  10. Johnson et al. (1995) Continuous Univariate Distributions Section 29.8
  11. Muirhead (2005) pages 22–24 and problem 1.18.
  12. Template:Cite journal, p. 32
