q-Gaussian distribution



Template:Probability distribution

The q-Gaussian is a probability distribution arising from the maximization of the Tsallis entropy under appropriate constraints. It is one example of a Tsallis distribution. The q-Gaussian is a generalization of the Gaussian in the same way that Tsallis entropy is a generalization of standard Boltzmann–Gibbs entropy or Shannon entropy.[1] The normal distribution is recovered as q → 1.

The q-Gaussian has been applied to problems in the fields of statistical mechanics, geology, anatomy, astronomy, economics, finance, and machine learning.Template:Cn The distribution is often favored for its heavy tails in comparison to the Gaussian for 1 < q < 3. For q < 1 the q-Gaussian distribution is the PDF of a bounded random variable, which makes it more suitable than the Gaussian distribution for modeling the effect of external stochasticity in biology and other domains.[2] A generalized q-analog of the classical central limit theorem[3] was proposed in 2008, in which the independence constraint for the i.i.d. variables is relaxed to an extent defined by the q parameter, with independence being recovered as q → 1. However, a proof of such a theorem is still lacking.[4]

In the heavy-tail regions, the distribution is equivalent to the Student's t-distribution, with a direct mapping between q and the degrees of freedom. A practitioner using one of these distributions can therefore parameterize the same distribution in two different ways. The choice of the q-Gaussian form may arise if the system is non-extensive, or if there is no connection to small sample sizes.

Characterization

Probability density function

The standard q-Gaussian has the probability density function [3]

<math>f(x) = \frac{\sqrt{\beta}}{C_q}\, e_q(-\beta x^2)</math>

where

<math>e_q(x) = \left[1 + (1-q)x\right]_+^{\frac{1}{1-q}}</math>

is the q-exponential and the normalization factor Cq is given by

<math>C_q = \frac{2\sqrt{\pi}\,\Gamma\!\left(\frac{1}{1-q}\right)}{(3-q)\sqrt{1-q}\,\Gamma\!\left(\frac{3-q}{2(1-q)}\right)} \quad \text{for } -\infty < q < 1,</math>

<math>C_q = \sqrt{\pi} \quad \text{for } q = 1,</math>

<math>C_q = \frac{\sqrt{\pi}\,\Gamma\!\left(\frac{3-q}{2(q-1)}\right)}{\sqrt{q-1}\,\Gamma\!\left(\frac{1}{q-1}\right)} \quad \text{for } 1 < q < 3.</math>

Note that for q<1 the q-Gaussian distribution is the PDF of a bounded random variable.
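The formulas above translate directly into code. The following is a minimal Python sketch (NumPy/SciPy assumed; the function names q_exponential, q_gaussian_norm and q_gaussian_pdf are illustrative, not from any standard library):

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import gamma

def q_exponential(x, q):
    """q-exponential e_q(x) = [1 + (1-q)x]_+^(1/(1-q)), with e_1(x) = exp(x)."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)   # [...]_+ truncation
    return base ** (1.0 / (1.0 - q))

def q_gaussian_norm(q):
    """Normalization factor C_q for -inf < q < 3."""
    if np.isclose(q, 1.0):
        return np.sqrt(np.pi)
    if q < 1:
        return (2.0 * np.sqrt(np.pi) * gamma(1.0 / (1.0 - q))
                / ((3.0 - q) * np.sqrt(1.0 - q) * gamma((3.0 - q) / (2.0 * (1.0 - q)))))
    # 1 < q < 3
    return (np.sqrt(np.pi) * gamma((3.0 - q) / (2.0 * (q - 1.0)))
            / (np.sqrt(q - 1.0) * gamma(1.0 / (q - 1.0))))

def q_gaussian_pdf(x, q, beta=1.0):
    """Standard q-Gaussian density sqrt(beta)/C_q * e_q(-beta * x^2)."""
    return np.sqrt(beta) / q_gaussian_norm(q) * q_exponential(-beta * np.asarray(x) ** 2, q)
</syntaxhighlight>

As a quick sanity check, numerically integrating q_gaussian_pdf over its support should return 1 for any q < 3.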

Cumulative distribution function

For <math>1 < q < 3</math>, the cumulative distribution function is[5]

<math>F(x) = \frac{1}{2} + \frac{\sqrt{q-1}\,\Gamma\!\left(\frac{1}{q-1}\right) x \sqrt{\beta}\; {}_2F_1\!\left(\frac{1}{2}, \frac{1}{q-1}; \frac{3}{2}; -(q-1)\beta x^2\right)}{\sqrt{\pi}\,\Gamma\!\left(\frac{3-q}{2(q-1)}\right)},</math>

where <math>{}_2F_1(a,b;c;z)</math> is the hypergeometric function. Since the hypergeometric series is defined only for <math>|z| < 1</math> while x is unbounded, the Pfaff transformation can be used to evaluate it.

For <math>q < 1</math>,

<math>F(x) = \begin{cases} 0 & x < -\dfrac{1}{\sqrt{\beta(1-q)}}, \\[1ex] \dfrac{1}{2} + \dfrac{\sqrt{1-q}\,\Gamma\!\left(\frac{5-3q}{2(1-q)}\right) x \sqrt{\beta}\; {}_2F_1\!\left(\frac{1}{2}, \frac{1}{q-1}; \frac{3}{2}; -(q-1)\beta x^2\right)}{\sqrt{\pi}\,\Gamma\!\left(\frac{2-q}{1-q}\right)} & -\dfrac{1}{\sqrt{\beta(1-q)}} \le x \le \dfrac{1}{\sqrt{\beta(1-q)}}, \\[1ex] 1 & x > \dfrac{1}{\sqrt{\beta(1-q)}}. \end{cases}</math>
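The closed-form CDF for 1 < q < 3 can be checked against direct numerical integration of the density. A sketch assuming SciPy's hyp2f1 and quad, and reusing the illustrative q_gaussian_pdf defined above (note that SciPy's hyp2f1 already performs the analytic continuation beyond |z| < 1):

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import gamma, hyp2f1
from scipy.integrate import quad

def q_gaussian_cdf(x, q, beta=1.0):
    """Closed-form CDF for 1 < q < 3."""
    num = np.sqrt(q - 1.0) * gamma(1.0 / (q - 1.0)) * x * np.sqrt(beta)
    num *= hyp2f1(0.5, 1.0 / (q - 1.0), 1.5, -(q - 1.0) * beta * x ** 2)
    den = np.sqrt(np.pi) * gamma((3.0 - q) / (2.0 * (q - 1.0)))
    return 0.5 + num / den

q, beta, x = 1.5, 2.0, 0.7
by_integration, _ = quad(lambda t: q_gaussian_pdf(t, q, beta), -np.inf, x)
print(q_gaussian_cdf(x, q, beta), by_integration)   # should agree to integration tolerance
</syntaxhighlight>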

Entropy

Just as the normal distribution is the maximum information entropy distribution for fixed values of the first moment E(X) and second moment E(X2) (with the fixed zeroth moment E(X0)=1 corresponding to the normalization condition), the q-Gaussian distribution is the maximum Tsallis entropy distribution for fixed values of these three moments.
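Written out schematically, using the plain-moment constraints described above (Tsallis-statistics treatments often phrase the second-moment constraint through escort expectations instead), the variational problem is

<math>\max_{p}\; S_q[p] = \frac{1}{q-1}\left(1 - \int p(x)^q \, dx\right) \quad\text{subject to}\quad \int p(x)\,dx = 1,\;\; \int x\, p(x)\,dx = \mu,\;\; \int x^2 p(x)\,dx = \sigma^2 .</math>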

Student's t-distribution

While it can be justified by an interesting alternative form of entropy, statistically it is a scaled reparametrization of the Student's t-distribution introduced by W. Gosset in 1908 to describe small-sample statistics. In Gosset's original presentation the degrees of freedom parameter ν was constrained to be a positive integer related to the sample size, but it is readily observed that Gosset's density function is valid for all real values of ν.Template:Citation needed The scaled reparametrization introduces the alternative parameters q and β which are related to ν.

Given a Student's t-distribution with ν degrees of freedom, the equivalent q-Gaussian has

<math>q = \frac{\nu+3}{\nu+1}</math> with <math>\beta = \frac{1}{3-q},</math>

with inverse

<math>\nu = \frac{3-q}{q-1},</math> but only if <math>\beta = \frac{1}{3-q}.</math>

Whenever <math>\beta \ne \frac{1}{3-q}</math>, the function is simply a scaled version of Student's t-distribution.
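As a numerical illustration of this mapping, here is a sketch assuming SciPy's Student's t implementation and reusing the illustrative q_gaussian_pdf from the sketch above:

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import t as student_t

nu = 5.0                          # degrees of freedom
q = (nu + 3.0) / (nu + 1.0)       # q = (nu + 3)/(nu + 1)
beta = 1.0 / (3.0 - q)            # beta = 1/(3 - q)

x = np.linspace(-5, 5, 11)
# With this (q, beta) pair the q-Gaussian density coincides with the t density
print(np.allclose(q_gaussian_pdf(x, q, beta), student_t.pdf(x, df=nu)))   # True
</syntaxhighlight>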

It is sometimes argued that the distribution is a generalization of Student's t-distribution to negative and/or non-integer degrees of freedom. However, the theory of Student's t-distribution extends trivially to all real degrees of freedom, where the support of the distribution is compact rather than infinite when ν < 0.Template:Citation needed

Three-parameter version

As with many distributions centered on zero, the q-Gaussian can be trivially extended to include a location parameter μ. The density is then defined by

<math>\frac{\sqrt{\beta}}{C_q}\, e_q\!\left(-\beta (x-\mu)^2\right).</math>

Generating random deviates

The Box–Muller transform has been generalized to allow random sampling from q-Gaussians.[6] The standard Box–Muller technique generates pairs of independent normally distributed variables from equations of the following form.

<math>Z_1 = \sqrt{-2 \ln(U_1)}\, \cos(2\pi U_2)</math>

<math>Z_2 = \sqrt{-2 \ln(U_1)}\, \sin(2\pi U_2)</math>

The generalized Box–Muller technique can generate pairs of q-Gaussian deviates that are not independent. In practice, only a single deviate will be generated from a pair of uniformly distributed variables. The following formula generates deviates from a q-Gaussian with specified parameter q and <math>\beta = \frac{1}{3-q}</math>:

<math>Z = \sqrt{-2 \ln_{q'}(U_1)}\, \cos(2\pi U_2)</math>

where <math>\ln_q</math> is the q-logarithm and <math>q' = \frac{1+q}{3-q}.</math>

These deviates can be transformed to generate deviates from an arbitrary q-Gaussian by

<math>Z' = \mu + \frac{Z}{\sqrt{\beta(3-q)}}</math>
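Putting the two steps together gives a minimal NumPy sketch of the sampler (the helper names q_log and q_gaussian_sample are mine, and the q-logarithm is taken in its standard form <math>\ln_q(x) = \frac{x^{1-q}-1}{1-q}</math>):

<syntaxhighlight lang="python">
import numpy as np

def q_log(x, q):
    """q-logarithm ln_q(x) = (x^(1-q) - 1)/(1 - q), with ln_1(x) = ln(x)."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_gaussian_sample(q, beta, mu=0.0, size=1, rng=None):
    """Generalized Box-Muller: draw q-Gaussian deviates with parameters (q, beta, mu)."""
    rng = np.random.default_rng() if rng is None else rng
    u1 = rng.uniform(size=size)
    u2 = rng.uniform(size=size)
    q_prime = (1.0 + q) / (3.0 - q)
    z = np.sqrt(-2.0 * q_log(u1, q_prime)) * np.cos(2.0 * np.pi * u2)  # standard deviate, beta = 1/(3-q)
    return mu + z / np.sqrt(beta * (3.0 - q))                          # rescale to the requested beta
</syntaxhighlight>

For example, histogramming q_gaussian_sample(1.5, 2.0, size=100_000) should reproduce q_gaussian_pdf with the same parameters.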

Applications

Physics

It has been shown that the momentum distribution of cold atoms in dissipative optical lattices is a q-Gaussian.[7]

The q-Gaussian distribution is also obtained as the asymptotic probability density function of the position of a mass in one-dimensional motion subject to two forces: a deterministic force of the type <math>F_1(x) = -2x/(1-x^2)</math> (determining an infinite potential well) and a stochastic white-noise force <math>F_2(t) = \sqrt{2(1-q)}\,\xi(t)</math>, where <math>\xi(t)</math> is a white noise. Note that in the overdamped/small-mass approximation the above-mentioned convergence fails for <math>q < 0</math>, as recently shown.[8]
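As an illustration, here is a minimal Euler–Maruyama sketch of the overdamped dynamics for 0 < q < 1 (the step size, boundary handling, and function name are my own choices, not taken from the cited works; under the usual Fokker–Planck stationarity argument the histogram of the samples should approach a bounded q-Gaussian with <math>\beta = 1/(1-q)</math>):

<syntaxhighlight lang="python">
import numpy as np

def simulate_well(q=0.5, dt=1e-3, n_steps=500_000, rng=None):
    """Overdamped dynamics dx = -2x/(1 - x^2) dt + sqrt(2(1-q)) dW on (-1, 1)."""
    rng = np.random.default_rng() if rng is None else rng
    x = 0.0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        drift = -2.0 * x / (1.0 - x ** 2)
        x_new = x + drift * dt + np.sqrt(2.0 * (1.0 - q) * dt) * rng.standard_normal()
        if abs(x_new) < 1.0:          # reject steps that overshoot the infinite well
            x = x_new
        samples[i] = x
    return samples   # histogram should approach a bounded q-Gaussian (beta = 1/(1-q))
</syntaxhighlight>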

Finance

Financial return distributions in the New York Stock Exchange, NASDAQ and elsewhere have been interpreted as q-Gaussians.[9][10]

See also

Notes

Template:Reflist

Further reading

  • Juniper, J. (2007) Template:Cite web, Centre of Full Employment and Equity, The University of Newcastle, Australia


  1. Tsallis, C. Nonadditive entropy and nonextensive statistical mechanics-an overview after 20 years. Braz. J. Phys. 2009, 39, 337–356
  2. d'Onofrio A. (ed.) Bounded Noises in Physics, Biology, and Engineering. Birkhauser (2013)
  3. Template:Cite journal
  4. Template:Citation
  5. Template:Cite web
  6. W. Thistleton, J.A. Marsh, K. Nelson and C. Tsallis, Generalized Box–Muller method for generating q-Gaussian random deviates, IEEE Transactions on Information Theory 53, 4805 (2007)
  7. Template:Cite journal
  8. Template:Cite journal
  9. Template:Cite journal
  10. L. Borland, The pricing of stock options, in Nonextensive Entropy – Interdisciplinary Applications, eds. M. Gell-Mann and C. Tsallis (Oxford University Press, New York, 2004)