Complex random vector


In probability theory and statistics, a complex random vector is typically a tuple of complex-valued random variables, and generally is a random variable taking values in a vector space over the field of complex numbers. If $Z_1, \ldots, Z_n$ are complex-valued random variables, then the $n$-tuple $(Z_1, \ldots, Z_n)$ is a complex random vector. Complex random vectors can always be considered as pairs of real random vectors: their real and imaginary parts.

Some concepts of real random vectors have a straightforward generalization to complex random vectors, for example the definition of the mean of a complex random vector. Other concepts are unique to complex random vectors.

Applications of complex random vectors are found in digital signal processing.


Definition

A complex random vector $\mathbf{Z} = (Z_1, \ldots, Z_n)^T$ on the probability space $(\Omega, \mathcal{F}, P)$ is a function $\mathbf{Z} \colon \Omega \to \mathbb{C}^n$ such that the vector $(\Re(Z_1), \Im(Z_1), \ldots, \Re(Z_n), \Im(Z_n))^T$ is a real random vector on $(\Omega, \mathcal{F}, P)$, where $\Re(z)$ denotes the real part of $z$ and $\Im(z)$ denotes the imaginary part of $z$.[1]

Cumulative distribution function

The generalization of the cumulative distribution function from real to complex random variables is not obvious because expressions of the form $P(Z \leq 1+3i)$ make no sense. However, expressions of the form $P(\Re(Z) \leq 1, \Im(Z) \leq 3)$ do make sense. Therefore, the cumulative distribution function $F_{\mathbf{Z}} \colon \mathbb{C}^n \to [0,1]$ of a random vector $\mathbf{Z} = (Z_1, \ldots, Z_n)^T$ is defined as

$$F_{\mathbf{Z}}(\mathbf{z}) = P(\Re(Z_1) \leq \Re(z_1), \Im(Z_1) \leq \Im(z_1), \ldots, \Re(Z_n) \leq \Re(z_n), \Im(Z_n) \leq \Im(z_n))$$

where $\mathbf{z} = (z_1, \ldots, z_n)^T$.
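As a concrete illustration, the component-wise comparison above can be evaluated empirically. The following sketch (assuming NumPy and Monte Carlo sampling; the variable names are illustrative, not from the article) estimates $F_{\mathbf{Z}}(\mathbf{z})$ as the fraction of samples whose real and imaginary parts all lie below those of $\mathbf{z}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw samples of a 2-component complex random vector Z (rows = draws).
n_samples = 10_000
Z = rng.standard_normal((n_samples, 2)) + 1j * rng.standard_normal((n_samples, 2))

def empirical_cdf(samples, z):
    """Empirical F_Z(z): fraction of samples with Re(Z_k) <= Re(z_k)
    and Im(Z_k) <= Im(z_k) for every component k."""
    ok = (samples.real <= z.real) & (samples.imag <= z.imag)
    return ok.all(axis=1).mean()

z = np.array([0.0 + 0.0j, 0.0 + 0.0j])
# Here the four real coordinates are independent with median 0, so
# F_Z(0) should be close to (1/2)**4 = 0.0625.
cdf_at_origin = empirical_cdf(Z, z)
```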

Expectation

As in the real case, the expectation (also called expected value) of a complex random vector is taken component-wise:[1]

$$\mathrm{E}[\mathbf{Z}] = (\mathrm{E}[Z_1], \ldots, \mathrm{E}[Z_n])^T$$

Covariance matrix and pseudo-covariance matrix


The covariance matrix (also called second central moment) $K_{\mathbf{Z}\mathbf{Z}}$ contains the covariances between all pairs of components. The covariance matrix of an $n \times 1$ random vector is an $n \times n$ matrix whose $(i,j)$th element is the covariance between the $i$th and the $j$th random variables.[2] Unlike in the case of real random variables, the covariance between two random variables involves the complex conjugate of one of the two. Thus the covariance matrix is a Hermitian matrix.[1]

$$K_{\mathbf{Z}\mathbf{Z}} = \mathrm{E}[(\mathbf{Z}-\mathrm{E}[\mathbf{Z}])(\mathbf{Z}-\mathrm{E}[\mathbf{Z}])^H] = \begin{bmatrix} \mathrm{E}[(Z_1-\mathrm{E}[Z_1])\overline{(Z_1-\mathrm{E}[Z_1])}] & \cdots & \mathrm{E}[(Z_1-\mathrm{E}[Z_1])\overline{(Z_n-\mathrm{E}[Z_n])}] \\ \vdots & \ddots & \vdots \\ \mathrm{E}[(Z_n-\mathrm{E}[Z_n])\overline{(Z_1-\mathrm{E}[Z_1])}] & \cdots & \mathrm{E}[(Z_n-\mathrm{E}[Z_n])\overline{(Z_n-\mathrm{E}[Z_n])}] \end{bmatrix}$$

The pseudo-covariance matrix (also called relation matrix) is defined by replacing Hermitian transposition with transposition in the definition above.

$$J_{\mathbf{Z}\mathbf{Z}} = \mathrm{E}[(\mathbf{Z}-\mathrm{E}[\mathbf{Z}])(\mathbf{Z}-\mathrm{E}[\mathbf{Z}])^T] = \begin{bmatrix} \mathrm{E}[(Z_1-\mathrm{E}[Z_1])(Z_1-\mathrm{E}[Z_1])] & \cdots & \mathrm{E}[(Z_1-\mathrm{E}[Z_1])(Z_n-\mathrm{E}[Z_n])] \\ \vdots & \ddots & \vdots \\ \mathrm{E}[(Z_n-\mathrm{E}[Z_n])(Z_1-\mathrm{E}[Z_1])] & \cdots & \mathrm{E}[(Z_n-\mathrm{E}[Z_n])(Z_n-\mathrm{E}[Z_n])] \end{bmatrix}$$
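Both matrices can be estimated from samples. A minimal NumPy sketch (the sample layout and variable names are assumptions for illustration, not from the article): the only difference between the two estimators is whether one factor is conjugated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Rows are i.i.d. draws of a 3-component complex random vector.
N = 5000
Z = rng.standard_normal((N, 3)) + 1j * rng.standard_normal((N, 3))

Zc = Z - Z.mean(axis=0)      # center each component
K = Zc.T @ Zc.conj() / N     # covariance:        one factor conjugated (Hermitian transpose)
J = Zc.T @ Zc / N            # pseudo-covariance: plain transpose, no conjugate
```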
Properties

The covariance matrix is a Hermitian matrix, i.e.[1]

$$K_{\mathbf{Z}\mathbf{Z}}^H = K_{\mathbf{Z}\mathbf{Z}}.$$

The pseudo-covariance matrix is a symmetric matrix, i.e.

$$J_{\mathbf{Z}\mathbf{Z}}^T = J_{\mathbf{Z}\mathbf{Z}}.$$

The covariance matrix is a positive semidefinite matrix, i.e.

$$\mathbf{a}^H K_{\mathbf{Z}\mathbf{Z}} \mathbf{a} \geq 0 \quad \text{for all } \mathbf{a} \in \mathbb{C}^n.$$

Covariance matrices of real and imaginary parts


By decomposing the random vector $\mathbf{Z}$ into its real part $\mathbf{X} = \Re(\mathbf{Z})$ and imaginary part $\mathbf{Y} = \Im(\mathbf{Z})$ (i.e. $\mathbf{Z} = \mathbf{X} + i\mathbf{Y}$), the pair $(\mathbf{X}, \mathbf{Y})$ has a covariance matrix of the form:

$$\begin{bmatrix} K_{\mathbf{X}\mathbf{X}} & K_{\mathbf{X}\mathbf{Y}} \\ K_{\mathbf{Y}\mathbf{X}} & K_{\mathbf{Y}\mathbf{Y}} \end{bmatrix}$$

The matrices K𝐙𝐙 and J𝐙𝐙 can be related to the covariance matrices of 𝐗 and 𝐘 via the following expressions:

$$\begin{aligned}
K_{\mathbf{X}\mathbf{X}} &= \mathrm{E}[(\mathbf{X}-\mathrm{E}[\mathbf{X}])(\mathbf{X}-\mathrm{E}[\mathbf{X}])^T] = \tfrac{1}{2}\operatorname{Re}(K_{\mathbf{Z}\mathbf{Z}} + J_{\mathbf{Z}\mathbf{Z}}) \\
K_{\mathbf{Y}\mathbf{Y}} &= \mathrm{E}[(\mathbf{Y}-\mathrm{E}[\mathbf{Y}])(\mathbf{Y}-\mathrm{E}[\mathbf{Y}])^T] = \tfrac{1}{2}\operatorname{Re}(K_{\mathbf{Z}\mathbf{Z}} - J_{\mathbf{Z}\mathbf{Z}}) \\
K_{\mathbf{Y}\mathbf{X}} &= \mathrm{E}[(\mathbf{Y}-\mathrm{E}[\mathbf{Y}])(\mathbf{X}-\mathrm{E}[\mathbf{X}])^T] = \tfrac{1}{2}\operatorname{Im}(J_{\mathbf{Z}\mathbf{Z}} + K_{\mathbf{Z}\mathbf{Z}}) \\
K_{\mathbf{X}\mathbf{Y}} &= \mathrm{E}[(\mathbf{X}-\mathrm{E}[\mathbf{X}])(\mathbf{Y}-\mathrm{E}[\mathbf{Y}])^T] = \tfrac{1}{2}\operatorname{Im}(J_{\mathbf{Z}\mathbf{Z}} - K_{\mathbf{Z}\mathbf{Z}})
\end{aligned}$$

Conversely:

$$\begin{aligned}
K_{\mathbf{Z}\mathbf{Z}} &= K_{\mathbf{X}\mathbf{X}} + K_{\mathbf{Y}\mathbf{Y}} + i(K_{\mathbf{Y}\mathbf{X}} - K_{\mathbf{X}\mathbf{Y}}) \\
J_{\mathbf{Z}\mathbf{Z}} &= K_{\mathbf{X}\mathbf{X}} - K_{\mathbf{Y}\mathbf{Y}} + i(K_{\mathbf{Y}\mathbf{X}} + K_{\mathbf{X}\mathbf{Y}})
\end{aligned}$$
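Because expectation is linear, the same identities hold exactly for sample covariance matrices, which gives a quick numerical sanity check. A NumPy sketch (the particular correlated sample construction is illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

N = 20_000
X = rng.standard_normal((N, 2)) @ np.array([[1.0, 0.5], [0.0, 1.0]])
Y = rng.standard_normal((N, 2)) + 0.3 * X   # correlate Y with X
Z = X + 1j * Y

def cov(A, B):
    """Sample cross-covariance E[(A - E[A])(B - E[B])^T] (rows = draws)."""
    Ac, Bc = A - A.mean(axis=0), B - B.mean(axis=0)
    return Ac.T @ Bc / len(A)

Kxx, Kyy = cov(X, X), cov(Y, Y)
Kxy, Kyx = cov(X, Y), cov(Y, X)

Zc = Z - Z.mean(axis=0)
Kzz = Zc.T @ Zc.conj() / N   # covariance
Jzz = Zc.T @ Zc / N          # pseudo-covariance

# The two "conversely" identities hold for the sample estimates as well.
assert np.allclose(Kzz, Kxx + Kyy + 1j * (Kyx - Kxy))
assert np.allclose(Jzz, Kxx - Kyy + 1j * (Kyx + Kxy))
```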

Cross-covariance matrix and pseudo-cross-covariance matrix

The cross-covariance matrix between two complex random vectors $\mathbf{Z}, \mathbf{W}$ is defined as:

$$K_{\mathbf{Z}\mathbf{W}} = \mathrm{E}[(\mathbf{Z}-\mathrm{E}[\mathbf{Z}])(\mathbf{W}-\mathrm{E}[\mathbf{W}])^H] = \begin{bmatrix} \mathrm{E}[(Z_1-\mathrm{E}[Z_1])\overline{(W_1-\mathrm{E}[W_1])}] & \cdots & \mathrm{E}[(Z_1-\mathrm{E}[Z_1])\overline{(W_n-\mathrm{E}[W_n])}] \\ \vdots & \ddots & \vdots \\ \mathrm{E}[(Z_n-\mathrm{E}[Z_n])\overline{(W_1-\mathrm{E}[W_1])}] & \cdots & \mathrm{E}[(Z_n-\mathrm{E}[Z_n])\overline{(W_n-\mathrm{E}[W_n])}] \end{bmatrix}$$

And the pseudo-cross-covariance matrix is defined as:

$$J_{\mathbf{Z}\mathbf{W}} = \mathrm{E}[(\mathbf{Z}-\mathrm{E}[\mathbf{Z}])(\mathbf{W}-\mathrm{E}[\mathbf{W}])^T] = \begin{bmatrix} \mathrm{E}[(Z_1-\mathrm{E}[Z_1])(W_1-\mathrm{E}[W_1])] & \cdots & \mathrm{E}[(Z_1-\mathrm{E}[Z_1])(W_n-\mathrm{E}[W_n])] \\ \vdots & \ddots & \vdots \\ \mathrm{E}[(Z_n-\mathrm{E}[Z_n])(W_1-\mathrm{E}[W_1])] & \cdots & \mathrm{E}[(Z_n-\mathrm{E}[Z_n])(W_n-\mathrm{E}[W_n])] \end{bmatrix}$$

Two complex random vectors 𝐙 and 𝐖 are called uncorrelated if

$$K_{\mathbf{Z}\mathbf{W}} = J_{\mathbf{Z}\mathbf{W}} = 0.$$

Independence

Two complex random vectors $\mathbf{Z} = (Z_1, \ldots, Z_m)^T$ and $\mathbf{W} = (W_1, \ldots, W_n)^T$ are called independent if

$$F_{\mathbf{Z},\mathbf{W}}(\mathbf{z},\mathbf{w}) = F_{\mathbf{Z}}(\mathbf{z}) \cdot F_{\mathbf{W}}(\mathbf{w}) \quad \text{for all } \mathbf{z}, \mathbf{w}$$

where $F_{\mathbf{Z}}(\mathbf{z})$ and $F_{\mathbf{W}}(\mathbf{w})$ denote the cumulative distribution functions of $\mathbf{Z}$ and $\mathbf{W}$ as defined above, and $F_{\mathbf{Z},\mathbf{W}}(\mathbf{z},\mathbf{w})$ denotes their joint cumulative distribution function. Independence of $\mathbf{Z}$ and $\mathbf{W}$ is often denoted by $\mathbf{Z} \perp\!\!\!\perp \mathbf{W}$. Written component-wise, $\mathbf{Z}$ and $\mathbf{W}$ are called independent if

$$F_{Z_1,\ldots,Z_m,W_1,\ldots,W_n}(z_1,\ldots,z_m,w_1,\ldots,w_n) = F_{Z_1,\ldots,Z_m}(z_1,\ldots,z_m) \cdot F_{W_1,\ldots,W_n}(w_1,\ldots,w_n) \quad \text{for all } z_1,\ldots,z_m,w_1,\ldots,w_n.$$

Circular symmetry

A complex random vector $\mathbf{Z}$ is called circularly symmetric if for every deterministic $\varphi \in [-\pi, \pi)$ the distribution of $e^{i\varphi}\mathbf{Z}$ equals the distribution of $\mathbf{Z}$.[3]

Properties
  • The expectation of a circularly symmetric complex random vector is either zero or it is not defined.[3]
  • The pseudo-covariance matrix of a circularly symmetric complex random vector is zero.[3]
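The second property can be observed numerically for a circularly symmetric complex Gaussian, a standard example of such a vector (the sampling sketch below assumes NumPy and is illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(4)

N = 100_000
# Standard circularly symmetric complex Gaussian: real and imaginary
# parts i.i.d. N(0, 1/2), so e^{i phi} Z has the same distribution as Z.
Z = (rng.standard_normal((N, 2)) + 1j * rng.standard_normal((N, 2))) / np.sqrt(2)

J = Z.T @ Z / N          # pseudo-covariance (the mean is zero by construction)
assert np.abs(J).max() < 0.02   # approximately zero, as the property predicts

# Rotating by a fixed phase leaves the covariance matrix unchanged.
phi = 0.7
K = Z.T @ Z.conj() / N
Zr = np.exp(1j * phi) * Z
Kr = Zr.T @ Zr.conj() / N
assert np.allclose(K, Kr)
```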

Proper complex random vectors

A complex random vector $\mathbf{Z}$ is called proper if the following three conditions are all satisfied:[1]

  • $\mathrm{E}[\mathbf{Z}] = 0$ (zero mean)
  • $\operatorname{var}[Z_1] < \infty, \ldots, \operatorname{var}[Z_n] < \infty$ (all components have finite variance)
  • $\mathrm{E}[\mathbf{Z}\mathbf{Z}^T] = 0$ (vanishing pseudo-covariance)

Two complex random vectors $\mathbf{Z}, \mathbf{W}$ are called jointly proper if the composite random vector $(Z_1, Z_2, \ldots, Z_m, W_1, W_2, \ldots, W_n)^T$ is proper.
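A rough empirical check of the three defining conditions might look as follows (a NumPy sketch; the helper name `is_proper` and the tolerance are assumptions for illustration, not a statistical test):

```python
import numpy as np

def is_proper(samples, tol=0.05):
    """Check the three conditions on i.i.d. draws (rows) of a complex
    random vector, up to sampling noise controlled by `tol`."""
    mean_ok = np.abs(samples.mean(axis=0)).max() < tol       # E[Z] = 0
    var_ok = np.isfinite(samples.var(axis=0)).all()          # finite variance
    J = samples.T @ samples / len(samples)                   # E[Z Z^T]
    pseudo_ok = np.abs(J).max() < tol                        # E[Z Z^T] = 0
    return bool(mean_ok and var_ok and pseudo_ok)

rng = np.random.default_rng(5)
N = 50_000
# Circularly symmetric Gaussian: proper.
Zp = (rng.standard_normal((N, 2)) + 1j * rng.standard_normal((N, 2))) / np.sqrt(2)
# A purely real, nonconstant random vector: not proper (E[Z Z^T] != 0).
Zr = rng.standard_normal((N, 2)).astype(complex)
```

The second example illustrates the property below that a real random vector is proper only if it is constant.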

Properties
  • A complex random vector $\mathbf{Z}$ is proper if, and only if, for all (deterministic) vectors $\mathbf{c} \in \mathbb{C}^n$ the complex random variable $\mathbf{c}^T \mathbf{Z}$ is proper.[1]
  • Linear transformations of proper complex random vectors are proper, i.e. if $\mathbf{Z}$ is a proper random vector with $n$ components and $A$ is a deterministic $m \times n$ matrix, then the complex random vector $A\mathbf{Z}$ is also proper.[1]
  • Every circularly symmetric complex random vector with finite variance of all its components is proper.[1]
  • There are proper complex random vectors that are not circularly symmetric.[1]
  • A real random vector is proper if and only if it is constant.
  • Two jointly proper complex random vectors are uncorrelated if and only if their covariance matrix is zero, i.e. if $K_{\mathbf{Z}\mathbf{W}} = 0$.

Cauchy-Schwarz inequality

The Cauchy–Schwarz inequality for complex random vectors is

$$\left| \mathrm{E}[\mathbf{Z}^H \mathbf{W}] \right|^2 \leq \mathrm{E}[\mathbf{Z}^H \mathbf{Z}] \, \mathrm{E}[\mathbf{W}^H \mathbf{W}].$$
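The inequality also holds for empirical expectations, since these are expectations under the empirical measure. A NumPy sketch verifying it on correlated samples (the sample construction is illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)

N = 10_000
Z = rng.standard_normal((N, 3)) + 1j * rng.standard_normal((N, 3))
W = 0.5 * Z + rng.standard_normal((N, 3)) + 1j * rng.standard_normal((N, 3))

# Empirical expectations of the three scalar quantities in the inequality.
lhs = np.abs(np.mean(np.sum(Z.conj() * W, axis=1))) ** 2        # |E[Z^H W]|^2
rhs = (np.mean(np.sum(Z.conj() * Z, axis=1)).real
       * np.mean(np.sum(W.conj() * W, axis=1)).real)            # E[Z^H Z] E[W^H W]
assert lhs <= rhs
```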

Characteristic function

The characteristic function of a complex random vector $\mathbf{Z}$ with $n$ components is a function $\mathbb{C}^n \to \mathbb{C}$ defined by:[1]

$$\varphi_{\mathbf{Z}}(\boldsymbol{\omega}) = \mathrm{E}\left[ e^{i \Re(\boldsymbol{\omega}^H \mathbf{Z})} \right] = \mathrm{E}\left[ e^{i\left( \Re(\omega_1)\Re(Z_1) + \Im(\omega_1)\Im(Z_1) + \cdots + \Re(\omega_n)\Re(Z_n) + \Im(\omega_n)\Im(Z_n) \right)} \right]$$
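As an illustration, the empirical characteristic function of a scalar ($n = 1$) circularly symmetric Gaussian can be compared with its known closed form $e^{-|\omega|^2/4}$ for $\mathrm{E}[|Z|^2] = 1$ (a NumPy sketch; the sampling setup is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

N = 100_000
# Scalar circularly symmetric Gaussian with E[|Z|^2] = 1.
Z = (rng.standard_normal((N, 1)) + 1j * rng.standard_normal((N, 1))) / np.sqrt(2)

def char_fn(samples, omega):
    """Empirical characteristic function E[exp(i Re(omega^H Z))]."""
    arg = (omega.conj()[None, :] * samples).sum(axis=1).real
    return np.exp(1j * arg).mean()

omega = np.array([1.0 + 0.5j])
val = char_fn(Z, omega)
# Re(omega^H Z) is N(0, |omega|^2 / 2), so the exact characteristic
# function value is exp(-|omega|^2 / 4); val should be close to it.
expected = np.exp(-np.abs(omega[0]) ** 2 / 4)
```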
