Slutsky's theorem
In probability theory, Slutsky's theorem extends some properties of algebraic operations on convergent sequences of real numbers to sequences of random variables.[1]
The theorem was named after Eugen Slutsky.[2] Slutsky's theorem is also attributed to Harald Cramér.[3]
Statement
Let $\{X_n\}$, $\{Y_n\}$ be sequences of scalar/vector/matrix random elements. If $X_n$ converges in distribution to a random element $X$ and $Y_n$ converges in probability to a constant $c$, then
- $X_n + Y_n \ \xrightarrow{d}\ X + c$;
- $X_n Y_n \ \xrightarrow{d}\ Xc$;
- $X_n / Y_n \ \xrightarrow{d}\ X/c$, provided that $c$ is invertible,
where $\xrightarrow{d}$ denotes convergence in distribution.
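The ratio rule is what makes the one-sample t-statistic work: $\sqrt{n}(\bar{X}-\mu)/\sigma \xrightarrow{d} N(0,1)$ by the central limit theorem, $S_n/\sigma \xrightarrow{p} 1$, and dividing the two gives $T_n = \sqrt{n}(\bar{X}-\mu)/S_n \xrightarrow{d} N(0,1)$. A minimal simulation sketch of this; the Exponential(1) data and the sample/trial sizes are illustrative assumptions, not part of the theorem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sketch: t-statistic for Exponential(1) data, where mu = sigma = 1.
# sqrt(n)*(Xbar - 1) -> N(0,1) in distribution (CLT), and S_n -> 1
# in probability, so by Slutsky's ratio rule T_n -> N(0,1).
n, trials = 2000, 2000              # arbitrary simulation sizes
x = rng.exponential(1.0, size=(trials, n))
xbar = x.mean(axis=1)
s = x.std(axis=1, ddof=1)           # S_n -> sigma = 1 in probability
t = np.sqrt(n) * (xbar - 1.0) / s   # -> N(0,1) by Slutsky's theorem

print(round(t.mean(), 2), round(t.std(), 2))  # near 0 and 1
```

The empirical mean and standard deviation of the simulated statistics should be close to those of the standard normal limit.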
Notes:
- The requirement that $Y_n$ converges to a constant is important: if it were to converge to a non-degenerate random variable, the theorem would no longer be valid. For example, let $X_n \sim \mathrm{Uniform}(0,1)$ and $Y_n = -X_n$. The sum $X_n + Y_n = 0$ for all values of $n$. Moreover, $Y_n \xrightarrow{d} Y$, but $X_n + Y_n$ does not converge in distribution to $X + Y$, where $X \sim \mathrm{Uniform}(0,1)$, $Y \sim \mathrm{Uniform}(-1,0)$, and $X$ and $Y$ are independent.[4]
- The theorem remains valid if we replace all convergences in distribution with convergences in probability.
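The counterexample in the first note can be checked numerically: the sum $X_n + Y_n$ is identically zero, while the sum of independent limits is spread over $(-1, 1)$. A small sketch, with the sample size chosen arbitrarily for the simulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Counterexample: X_n ~ Uniform(0,1) and Y_n = -X_n, so Y_n converges
# in distribution to a non-degenerate limit, violating the hypothesis
# that the second sequence converge to a constant.
n = 100_000                          # arbitrary simulation size
x_n = rng.uniform(0.0, 1.0, n)
y_n = -x_n
s_n = x_n + y_n                      # identically zero for every n

# Independent limits X ~ Uniform(0,1) and Y ~ Uniform(-1,0):
# X + Y is non-degenerate, with variance 1/12 + 1/12 = 1/6.
x = rng.uniform(0.0, 1.0, n)
y = rng.uniform(-1.0, 0.0, n)
print(s_n.var(), (x + y).var())      # 0 versus roughly 1/6
```

The degenerate sum has zero variance, while the sum of the independent limits does not, so the two cannot share a limiting distribution.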
Proof
This theorem follows from the fact that if Xn converges in distribution to X and Yn converges in probability to a constant c, then the joint vector (Xn, Yn) converges in distribution to (X, c) (a standard result on convergence of random vectors).
Next we apply the continuous mapping theorem, recognizing the functions g(x,y) = x + y, g(x,y) = xy, and g(x,y) = x y−1 are continuous (for the last function to be continuous, y has to be invertible).
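The two proof steps can be illustrated numerically with the product map g(x, y) = xy: build a sequence $X_n \xrightarrow{d} N(0,1)$ and a sequence $Y_n \xrightarrow{p} 2$, and check that $X_n Y_n$ behaves like $2X$. The particular construction below (standardized uniform means, and the constant c = 2) is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(2)

# X_n = standardized mean of n Uniform(0,1) draws -> N(0,1) by the CLT;
# Y_n = (same mean) + 1.5 -> c = 2 in probability (LLN).  The joint
# vector (X_n, Y_n) converges to (X, 2), and the continuous map
# g(x, y) = x*y then gives X_n * Y_n -> 2*X in distribution.
n, trials = 1000, 5000               # arbitrary simulation sizes
u = rng.uniform(0.0, 1.0, size=(trials, n))
x_n = np.sqrt(n) * (u.mean(axis=1) - 0.5) / np.sqrt(1.0 / 12.0)
y_n = u.mean(axis=1) + 1.5           # -> 2 in probability
prod = x_n * y_n

print(round(prod.std(), 2))          # close to 2, the std of 2*N(0,1)
```

Note that Xn and Yn here are built from the same draws, which is allowed: Slutsky's theorem does not require the two sequences to be independent.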
References
- ↑ Template:Cite book
- ↑ Template:Cite journal
- ↑ Slutsky's theorem is also called Cramér's theorem according to Remark 11.1 (page 249) of Template:Cite book
- ↑ See Template:Cite web