Information matrix test


In econometrics, the information matrix test is used to determine whether a regression model is misspecified. The test was developed by Halbert White,[1] who observed that in a correctly specified model and under standard regularity assumptions, the Fisher information matrix can be expressed in either of two ways: as the outer product of the gradient, or as a function of the Hessian matrix of the log-likelihood function.

Consider a linear model $\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{u}$, where the errors $\mathbf{u}$ are assumed to be distributed $N(\mathbf{0}, \sigma^2 \mathbf{I})$. If the parameters $\boldsymbol{\beta}$ and $\sigma^2$ are stacked in the vector $\boldsymbol{\theta}^{\mathsf{T}} = \begin{bmatrix} \boldsymbol{\beta}^{\mathsf{T}} & \sigma^2 \end{bmatrix}$, the resulting log-likelihood function is

โ„“(๐œฝ)=โˆ’n2logฯƒ2โˆ’12ฯƒ2(๐ฒโˆ’๐—๐œท)๐–ณ(๐ฒโˆ’๐—๐œท)

The information matrix can then be expressed as

๐ˆ(๐œฝ)=E[(โˆ‚โ„“(๐œฝ)โˆ‚๐œฝ)(โˆ‚โ„“(๐œฝ)โˆ‚๐œฝ)๐–ณ]

that is, the expected value of the outer product of the gradient, or score. Second, it can be written as the negative of the expected value of the Hessian matrix of the log-likelihood function:

๐ˆ(๐œฝ)=โˆ’E[โˆ‚2โ„“(๐œฝ)โˆ‚๐œฝโˆ‚๐œฝ๐–ณ]

If the model is correctly specified, the two expressions are equal. Combining them, observation by observation, yields

๐œŸ(๐œฝ)=โˆ‘i=1n[โˆ‚2โ„“(๐œฝ)โˆ‚๐œฝโˆ‚๐œฝ๐–ณ+โˆ‚โ„“(๐œฝ)โˆ‚๐œฝโˆ‚โ„“(๐œฝ)โˆ‚๐œฝ]

where ๐œŸ(๐œฝ) is an (rร—r) random matrix, where r is the number of parameters. White showed that the elements of nโˆ’1/2๐œŸ(๐œฝ^), where ๐œฝ^ is the MLE, are asymptotically normally distributed with zero means when the model is correctly specified.[2] In small samples, however, the test generally performs poorly.[3]

References

