Quantum Fisher information


The quantum Fisher information is a central quantity in quantum metrology and is the quantum analogue of the classical Fisher information.[1][2][3][4][5] It is one of the central quantities used to quantify the utility of an input state, especially in Mach–Zehnder (or, equivalently, Ramsey) interferometer-based phase or parameter estimation.[1][3][6] The quantum Fisher information can also serve as a sensitive probe of a quantum phase transition (e.g., recognizing the superradiant quantum phase transition in the Dicke model[6]). The quantum Fisher information <math>F_Q[\varrho,A]</math> of a state <math>\varrho</math> with respect to the observable <math>A</math> is defined as

<math display="block">F_Q[\varrho,A]=2\sum_{k,l}\frac{(\lambda_k-\lambda_l)^2}{\lambda_k+\lambda_l}\left|\langle k|A|l\rangle\right|^2,</math>

where <math>\lambda_k</math> and <math>|k\rangle</math> are the eigenvalues and eigenvectors of the density matrix <math>\varrho</math>, respectively, and the summation runs over all <math>k</math> and <math>l</math> such that <math>\lambda_k+\lambda_l>0</math>.
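
As a numerical illustration, the defining sum can be evaluated directly from the eigendecomposition of the density matrix. The following is a minimal Python/NumPy sketch (the helper name is not from the article); it skips pairs in the kernel of <math>\varrho</math>, as the definition requires.

<syntaxhighlight lang="python">
import numpy as np

def quantum_fisher_information(rho: np.ndarray, A: np.ndarray, tol: float = 1e-12) -> float:
    """Evaluate F_Q[rho, A] = 2 sum_{k,l} (l_k - l_l)^2 / (l_k + l_l) |<k|A|l>|^2."""
    lam, vecs = np.linalg.eigh(rho)          # eigenvalues lambda_k and eigenvectors |k>
    A_kl = vecs.conj().T @ A @ vecs          # matrix elements <k|A|l>
    fq = 0.0
    for k, lk in enumerate(lam):
        for l, ll in enumerate(lam):
            if lk + ll > tol:                # sum only over lambda_k + lambda_l > 0
                fq += (lk - ll) ** 2 / (lk + ll) * abs(A_kl[k, l]) ** 2
    return 2.0 * fq
</syntaxhighlight>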

When the observable generates a unitary transformation of the system with a parameter <math>\theta</math> from the initial state <math>\varrho_0</math>,

<math display="block">\varrho(\theta)=\exp(-iA\theta)\,\varrho_0\exp(+iA\theta),</math>

the quantum Fisher information constrains the achievable precision in statistical estimation of the parameter <math>\theta</math> via the quantum Cramér–Rao bound as

<math display="block">(\Delta\theta)^2\ge\frac{1}{m\,F_Q[\varrho,A]},</math>

where <math>m</math> is the number of independent repetitions.

It is often desirable to estimate the magnitude of an unknown parameter <math>\alpha</math> that controls the strength of a system's Hamiltonian <math>H=\alpha A</math> with respect to a known observable <math>A</math> during a known dynamical time <math>t</math>. In this case, defining <math>\theta=\alpha t</math>, so that <math>\theta A=tH</math>, means that estimates of <math>\theta</math> can be directly translated into estimates of <math>\alpha</math>.
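
As a short worked consequence (not stated explicitly above), substituting <math>\theta=\alpha t</math> into the quantum Cramér–Rao bound gives the corresponding bound on the precision of <math>\alpha</math>,

<math display="block">(\Delta\alpha)^2\ge\frac{1}{m\,t^2\,F_Q[\varrho,A]},</math>

so, for a fixed number of repetitions, a longer interrogation time <math>t</math> improves the achievable precision quadratically.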

Connection with Fisher information

The classical Fisher information of measuring the observable <math>B</math> on the density matrix <math>\varrho(\theta)</math> is defined as

<math display="block">F[B,\theta]=\sum_b \frac{1}{p(b|\theta)}\left(\frac{\partial p(b|\theta)}{\partial\theta}\right)^2,</math>

where <math>p(b|\theta)=\langle b|\varrho(\theta)|b\rangle</math> is the probability of obtaining the outcome <math>b</math> when measuring the observable <math>B</math> on the transformed density matrix <math>\varrho(\theta)</math>, and <math>b</math> is the eigenvalue corresponding to the eigenvector <math>|b\rangle</math> of <math>B</math>.

The quantum Fisher information is the supremum of the classical Fisher information over all such observables,[7]

<math display="block">F_Q[\varrho,A]=\sup_B F[B,\theta].</math>
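
This relation can be checked numerically: for any fixed measurement, the classical Fisher information stays below the quantum Fisher information. A hedged sketch (function names are illustrative, and the comparison reuses the quantum_fisher_information helper sketched above), using a projective measurement in an eigenbasis of <math>B</math> and a finite-difference derivative of the outcome probabilities:

<syntaxhighlight lang="python">
import numpy as np

def expm_unitary(A: np.ndarray, theta: float) -> np.ndarray:
    """exp(-i A theta) via the eigendecomposition of the Hermitian generator A."""
    w, v = np.linalg.eigh(A)
    return v @ np.diag(np.exp(-1j * w * theta)) @ v.conj().T

def classical_fisher_information(rho0, A, B, theta=0.3, eps=1e-6):
    """F[B, theta] for a projective measurement in an eigenbasis of B on rho(theta)."""
    def probs(th):
        U = expm_unitary(A, th)
        rho_th = U @ rho0 @ U.conj().T
        _, b_vecs = np.linalg.eigh(B)
        return np.real(np.einsum('ik,ij,jk->k', b_vecs.conj(), rho_th, b_vecs))
    p = probs(theta)
    dp = (probs(theta + eps) - probs(theta - eps)) / (2 * eps)   # d p(b|theta) / d theta
    mask = p > 1e-12
    return float(np.sum(dp[mask] ** 2 / p[mask]))

# For any B: classical_fisher_information(rho0, A, B) <= quantum_fisher_information(rho0, A).
</syntaxhighlight>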

Relation to the symmetric logarithmic derivative

The quantum Fisher information equals the expectation value of <math>L_\varrho^2</math>, i.e., <math>F_Q[\varrho,A]=\mathrm{Tr}(\varrho L_\varrho^2)</math>, where <math>L_\varrho</math> is the symmetric logarithmic derivative, defined implicitly by <math>\partial_\theta\varrho(\theta)=\tfrac{1}{2}\left(L_\varrho\varrho+\varrho L_\varrho\right)</math>.
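
A small sketch of how the symmetric logarithmic derivative can be obtained in practice for the unitary encoding above, by dividing in the eigenbasis of <math>\varrho</math> (helper names are illustrative; quantum_fisher_information refers to the earlier sketch):

<syntaxhighlight lang="python">
import numpy as np

def sld_unitary(rho: np.ndarray, A: np.ndarray, tol: float = 1e-12) -> np.ndarray:
    """Symmetric logarithmic derivative L for d rho/d theta = -i [A, rho] at theta = 0,
    from L_{kl} = 2 (d rho)_{kl} / (lambda_k + lambda_l) in the eigenbasis of rho."""
    lam, v = np.linalg.eigh(rho)
    drho = -1j * (A @ rho - rho @ A)                 # d rho / d theta
    drho_eig = v.conj().T @ drho @ v
    L_eig = np.zeros_like(drho_eig)
    for k in range(len(lam)):
        for l in range(len(lam)):
            if lam[k] + lam[l] > tol:
                L_eig[k, l] = 2 * drho_eig[k, l] / (lam[k] + lam[l])
    return v @ L_eig @ v.conj().T

# Check F_Q = Tr(rho L^2) for a mixed qubit state and A = j_z of a single spin-1/2.
rho = np.array([[0.7, 0.2], [0.2, 0.3]])
A = np.diag([0.5, -0.5])
L = sld_unitary(rho, A)
print(np.real(np.trace(rho @ L @ L)))                # equals quantum_fisher_information(rho, A)
</syntaxhighlight>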

Equivalent expressions

For the unitary encoding operation <math>\varrho(\theta)=\exp(-iA\theta)\,\varrho_0\exp(+iA\theta)</math>, the quantum Fisher information can be computed as an integral,[8]

<math display="block">F_Q[\varrho,A]=2\int_0^{\infty}\mathrm{tr}\!\left(\exp(-\varrho_0 t)\,i[\varrho_0,A]\,\exp(-\varrho_0 t)\,i[\varrho_0,A]\right)dt,</math>

where <math>[\,\cdot\,,\,\cdot\,]</math> on the right-hand side denotes the commutator. It can also be expressed in terms of the Kronecker product and vectorization,[9]

<math display="block">F_Q[\varrho,A]=2\,\mathrm{vec}\!\left([\varrho_0,A]\right)^{\dagger}\left(\varrho_0^{*}\otimes I+I\otimes\varrho_0\right)^{-1}\mathrm{vec}\!\left([\varrho_0,A]\right),</math>

where <math>{}^{*}</math> denotes the complex conjugate and <math>{}^{\dagger}</math> denotes the conjugate transpose. This formula holds for invertible density matrices. For non-invertible density matrices, the inverse above is replaced by the Moore–Penrose pseudoinverse. Alternatively, one can compute the quantum Fisher information for the invertible state <math>\varrho_\nu=(1-\nu)\varrho_0+\nu\pi</math>, where <math>\pi</math> is an arbitrary full-rank density matrix, and then take the limit <math>\nu\to 0^{+}</math> to obtain the quantum Fisher information for <math>\varrho_0</math>. The density matrix <math>\pi</math> can be, for example, the completely mixed state <math>\mathbb{1}/d</math> in a <math>d</math>-dimensional system, or a thermal state in infinite-dimensional systems.
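
A hedged NumPy sketch of this expression, using the pseudoinverse so that the non-invertible case is handled as well; the column-stacking convention of vectorization is assumed, and the function name is illustrative:

<syntaxhighlight lang="python">
import numpy as np

def qfi_vectorized(rho0: np.ndarray, A: np.ndarray) -> float:
    """F_Q = 2 vec([rho0, A])^dagger (rho0^* (x) I + I (x) rho0)^+ vec([rho0, A])."""
    d = rho0.shape[0]
    comm = rho0 @ A - A @ rho0                        # the commutator [rho0, A]
    vec = comm.reshape(-1, 1, order='F')              # column-stacking vectorization
    M = np.kron(rho0.conj(), np.eye(d)) + np.kron(np.eye(d), rho0)
    val = (vec.conj().T @ np.linalg.pinv(M) @ vec).item()
    return 2.0 * float(np.real(val))
</syntaxhighlight>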

Generalization and relations to Bures metric and quantum fidelity

For any differentiable parametrization of the density matrix <math>\varrho(\theta)</math> by a vector of parameters <math>\theta=(\theta_1,\dots,\theta_n)</math>, the quantum Fisher information matrix is defined as

<math display="block">F_Q^{ij}[\varrho(\theta)]=2\sum_{k,l}\frac{\operatorname{Re}\!\left(\langle k|\partial_i\varrho|l\rangle\langle l|\partial_j\varrho|k\rangle\right)}{\lambda_k+\lambda_l},</math>

where <math>\partial_i</math> denotes the partial derivative with respect to the parameter <math>\theta_i</math>. The formula also holds without taking the real part <math>\operatorname{Re}</math>, because the imaginary part gives an antisymmetric contribution that vanishes under the sum. Note that all eigenvalues <math>\lambda_k</math> and eigenvectors <math>|k\rangle</math> of the density matrix potentially depend on the vector of parameters <math>\theta</math>.

This definition is identical to four times the Bures metric, except at singular points where the rank of the density matrix changes (these are the points at which <math>\lambda_k+\lambda_l</math> suddenly becomes zero). Through this relation, it also connects with the quantum fidelity <math>F(\varrho,\sigma)=\left(\mathrm{tr}\sqrt{\sqrt{\varrho}\,\sigma\sqrt{\varrho}}\right)^{2}</math> of two infinitesimally close states,[10]

<math display="block">F(\varrho_\theta,\varrho_{\theta+d\theta})=1-\frac{1}{4}\sum_{i,j}\left(F_Q^{ij}[\varrho(\theta)]+2\sum_{k:\,\lambda_k(\theta)=0}\partial_i\partial_j\lambda_k\right)d\theta_i\,d\theta_j+\mathcal{O}(d\theta^3),</math>

where the inner sum runs over all <math>k</math> for which the eigenvalues vanish, <math>\lambda_k(\theta)=0</math>. The extra term (which is, however, zero in most applications) can be avoided by taking a symmetric expansion of the fidelity,[11]

<math display="block">F(\varrho_{\theta-d\theta/2},\varrho_{\theta+d\theta/2})=1-\frac{1}{4}\sum_{i,j}F_Q^{ij}[\varrho(\theta)]\,d\theta_i\,d\theta_j+\mathcal{O}(d\theta^3).</math>
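
The symmetric expansion suggests a simple (though numerically delicate) way to estimate the single-parameter quantum Fisher information from the Uhlmann fidelity of two nearby states. A sketch, assuming SciPy is available and a user-supplied function rho_of_theta returning the density matrix at a given parameter value:

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import sqrtm

def fidelity(rho: np.ndarray, sigma: np.ndarray) -> float:
    """Uhlmann fidelity F(rho, sigma) = (Tr sqrt(sqrt(rho) sigma sqrt(rho)))^2."""
    s = sqrtm(rho)
    return float(np.real(np.trace(sqrtm(s @ sigma @ s))) ** 2)

def qfi_from_fidelity(rho_of_theta, theta: float, d: float = 1e-3) -> float:
    """Finite-difference estimate of F_Q from the symmetric fidelity expansion,
    F(rho(theta - d/2), rho(theta + d/2)) ~ 1 - (1/4) F_Q d^2."""
    f = fidelity(rho_of_theta(theta - d / 2), rho_of_theta(theta + d / 2))
    return 4.0 * (1.0 - f) / d ** 2
</syntaxhighlight>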

For a single parameter, <math>n=1</math>, and unitary encoding, the quantum Fisher information matrix reduces to the original definition.

The quantum Fisher information matrix is part of a wider family of quantum statistical distances.[12]

Relation to fidelity susceptibility

Assuming that <math>|\psi_0(\theta)\rangle</math> is a ground state of a parameter-dependent non-degenerate Hamiltonian <math>H(\theta)</math>, four times the quantum Fisher information of this state is called the fidelity susceptibility, denoted[13]

<math display="block">\chi_F=4F_Q\big(|\psi_0(\theta)\rangle\big).</math>

The fidelity susceptibility measures the sensitivity of the ground state to the parameter, and its divergence indicates a quantum phase transition. This follows from the aforementioned connection with fidelity: a diverging quantum Fisher information means that <math>|\psi_0(\theta)\rangle</math> and <math>|\psi_0(\theta+d\theta)\rangle</math> are orthogonal to each other for any infinitesimal change <math>d\theta</math> of the parameter, and the system is therefore said to undergo a phase transition at the point <math>\theta</math>.

Convexity properties

The quantum Fisher information equals four times the variance for pure states,

<math display="block">F_Q\big[|\Psi\rangle,H\big]=4(\Delta H)^2_{\Psi}.</math>
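
For pure states this is straightforward to evaluate numerically; a one-function sketch (the name is illustrative):

<syntaxhighlight lang="python">
import numpy as np

def qfi_pure(psi: np.ndarray, H: np.ndarray) -> float:
    """F_Q[|psi>, H] = 4 (<psi|H^2|psi> - <psi|H|psi>^2) for a normalized pure state."""
    psi = psi / np.linalg.norm(psi)
    mean = np.real(np.vdot(psi, H @ psi))
    mean_sq = np.real(np.vdot(psi, H @ (H @ psi)))
    return 4.0 * (mean_sq - mean ** 2)
</syntaxhighlight>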

For mixed states, when the probabilities are parameter independent, i.e., when <math>p(\theta)=p</math>, the quantum Fisher information is convex:

<math display="block">F_Q\big[p\varrho_1(\theta)+(1-p)\varrho_2(\theta),H\big]\le p\,F_Q[\varrho_1(\theta),H]+(1-p)\,F_Q[\varrho_2(\theta),H].</math>

The quantum Fisher information is the largest function that is convex and that equals four times the variance for pure states. That is, it equals four times the convex roof of the variance,[14][15]

<math display="block">F_Q[\varrho,H]=4\inf_{\{p_k,|\Psi_k\rangle\}}\sum_k p_k(\Delta H)^2_{\Psi_k},</math>

where the infimum is over all decompositions of the density matrix

<math display="block">\varrho=\sum_k p_k|\Psi_k\rangle\langle\Psi_k|.</math>

Note that the <math>|\Psi_k\rangle</math> are not necessarily orthogonal to each other. The above optimization can be rewritten as an optimization over the two-copy space as[16]

<math display="block">F_Q[\varrho,H]=\min_{\varrho_{12}}2\,\mathrm{Tr}\!\left[\left(H\otimes\mathbb{1}-\mathbb{1}\otimes H\right)^2\varrho_{12}\right],</math>

such that <math>\varrho_{12}</math> is a symmetric separable state and

<math display="block">\mathrm{Tr}_1(\varrho_{12})=\mathrm{Tr}_2(\varrho_{12})=\varrho.</math>

Later, the above statement was proved also for a minimization over general (not necessarily symmetric) separable states.[17]

When the probabilities are <math>\theta</math>-dependent, an extended-convexity relation has been proved:[18]

<math display="block">F_Q\!\left[\sum_i p_i(\theta)\varrho_i(\theta)\right]\le\sum_i p_i(\theta)F_Q[\varrho_i(\theta)]+F_C[\{p_i(\theta)\}],</math>

where <math>F_C[\{p_i(\theta)\}]=\sum_i\frac{\left(\partial_\theta p_i(\theta)\right)^2}{p_i(\theta)}</math> is the classical Fisher information associated with the probabilities contributing to the convex decomposition. The first term on the right-hand side of the above inequality can be considered as the average quantum Fisher information of the density matrices in the convex decomposition.

Inequalities for composite systems

The behavior of the quantum Fisher information in composite systems must be understood in order to study the quantum metrology of many-particle systems.[19] For product states, the additivity relation

<math display="block">F_Q\big[\varrho_1\otimes\varrho_2,H_1\otimes\mathbb{1}+\mathbb{1}\otimes H_2\big]=F_Q[\varrho_1,H_1]+F_Q[\varrho_2,H_2]</math>

holds.
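
The additivity can be verified numerically with the quantum_fisher_information helper sketched earlier (assuming that definition-based implementation):

<syntaxhighlight lang="python">
import numpy as np

def check_additivity(rho1, H1, rho2, H2):
    """Compare F_Q of the product state with the sum of the single-system values."""
    d1, d2 = rho1.shape[0], rho2.shape[0]
    rho12 = np.kron(rho1, rho2)
    H12 = np.kron(H1, np.eye(d2)) + np.kron(np.eye(d1), H2)
    lhs = quantum_fisher_information(rho12, H12)
    rhs = quantum_fisher_information(rho1, H1) + quantum_fisher_information(rho2, H2)
    return lhs, rhs                     # equal up to numerical error
</syntaxhighlight>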

For the reduced state, we have

<math display="block">F_Q\big[\varrho_{12},H_1\otimes\mathbb{1}_2\big]\ge F_Q[\varrho_1,H_1],</math>

where <math>\varrho_1=\mathrm{Tr}_2(\varrho_{12})</math>.

Relation to entanglement

There are strong links between quantum metrology and quantum information science. For a multiparticle system of <math>N</math> spin-1/2 particles,[20]

<math display="block">F_Q[\varrho,J_z]\le N</math>

holds for separable states, where

<math display="block">J_z=\sum_{n=1}^{N}j_z^{(n)},</math>

and <math>j_z^{(n)}</math> is the single-particle angular momentum component acting on the <math>n</math>th particle. The maximum for general quantum states is given by

<math display="block">F_Q[\varrho,J_z]\le N^2.</math>
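
As an illustration of the <math>N^2</math> limit (a standard construction, not spelled out in the text above), the <math>N</math>-qubit Greenberger–Horne–Zeilinger state saturates the bound, since for a pure state <math>F_Q=4(\Delta J_z)^2</math>:

<syntaxhighlight lang="python">
import numpy as np

def collective_jz(N: int) -> np.ndarray:
    """J_z = sum_n j_z^(n) for N spin-1/2 particles, built from tensor products."""
    sz = np.diag([0.5, -0.5])
    Jz = np.zeros((2 ** N, 2 ** N))
    for n in range(N):
        ops = [np.eye(2)] * N
        ops[n] = sz
        term = ops[0]
        for op in ops[1:]:
            term = np.kron(term, op)
        Jz += term
    return Jz

N = 4
ghz = np.zeros(2 ** N)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)               # (|00...0> + |11...1>) / sqrt(2)
Jz = collective_jz(N)
variance = ghz @ (Jz @ Jz) @ ghz - (ghz @ Jz @ ghz) ** 2
print(4 * variance)                             # prints N^2 = 16.0
</syntaxhighlight>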

Hence, quantum entanglement is needed to reach the maximum precision in quantum metrology. Moreover, for quantum states with an entanglement depth <math>k</math>,

<math display="block">F_Q[\varrho,J_z]\le sk^2+r^2</math>

holds, where <math>s=\lfloor N/k\rfloor</math> is the largest integer smaller than or equal to <math>N/k</math>, and <math>r=N-sk</math> is the remainder from dividing <math>N</math> by <math>k</math>. Hence, higher and higher levels of multipartite entanglement are needed to achieve better and better accuracy in parameter estimation.[21][22] It is possible to obtain a weaker but simpler bound,[23]

<math display="block">F_Q[\varrho,J_z]\le Nk.</math>

Hence, a lower bound on the entanglement depth is obtained as

<math display="block">\frac{F_Q[\varrho,J_z]}{N}\le k.</math>

A related concept is the quantum metrological gain, which for a given Hamiltonian is defined as the ratio of the quantum Fisher information of a state to the maximum of the quantum Fisher information for the same Hamiltonian over separable states,

<math display="block">g_{\mathcal{H}}(\varrho)=\frac{F_Q[\varrho,\mathcal{H}]}{F_Q^{(\rm sep)}(\mathcal{H})},</math>

where the Hamiltonian is

<math display="block">\mathcal{H}=h_1+h_2+\dots+h_N,</math>

and <math>h_n</math> acts on the <math>n</math>th spin. The metrological gain is defined by an optimization over all local Hamiltonians as

<math display="block">g(\varrho)=\max_{\mathcal{H}}g_{\mathcal{H}}(\varrho).</math>

Measuring the Fisher information

The error propagation formula gives a lower bound on the quantum Fisher information,

<math display="block">F_Q[\varrho,H]\ge\frac{\langle i[H,M]\rangle^2_{\varrho}}{(\Delta M)^2},</math>

where <math>M</math> is an arbitrary operator. This formula can be used to put a lower bound on the quantum Fisher information from experimental results.[24] If <math>M</math> equals the symmetric logarithmic derivative, then the inequality is saturated.[25]
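
A minimal sketch of evaluating this bound from a state and a chosen operator <math>M</math> (the function name is illustrative):

<syntaxhighlight lang="python">
import numpy as np

def error_propagation_bound(rho: np.ndarray, H: np.ndarray, M: np.ndarray) -> float:
    """Lower bound <i[H, M]>^2 / (Delta M)^2 on F_Q[rho, H]."""
    comm = 1j * (H @ M - M @ H)                               # i[H, M], a Hermitian operator
    numerator = np.real(np.trace(rho @ comm)) ** 2
    var_M = np.real(np.trace(rho @ M @ M)) - np.real(np.trace(rho @ M)) ** 2
    return numerator / var_M
</syntaxhighlight>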

For the case of unitary dynamics, the quantum Fisher information is four times the convex roof of the variance. Based on that, one can obtain lower bounds on it from given operator expectation values using semidefinite programming. The approach considers an optimization over the two-copy space.[26]

There are numerical methods that provide an optimal lower bound on the quantum Fisher information based on the expectation values of some operators, using the theory of Legendre transforms rather than semidefinite programming.[27] In some cases, the bounds can even be obtained analytically. For instance, for an <math>N</math>-qubit Greenberger–Horne–Zeilinger (GHZ) state,

<math display="block">F_Q[\varrho,J_z]\ge N^2(1-2F_{\rm GHZ})^2,</math>

whenever the fidelity with respect to the GHZ state satisfies

<math display="block">F_{\rm GHZ}=\mathrm{Tr}\big(\varrho|{\rm GHZ}\rangle\langle{\rm GHZ}|\big)\ge 1/2;</math>

otherwise the optimal lower bound is zero.

So far, we have discussed bounding the quantum Fisher information for unitary dynamics. It is also possible to bound the quantum Fisher information for more general, non-unitary dynamics.[28] The approach is based on the relation between the fidelity and the quantum Fisher information, and on the fact that the fidelity can be computed via semidefinite programming.

For systems in thermal equilibrium, the quantum Fisher information can be obtained from the dynamic susceptibility.[29]

Relation to the Wigner–Yanase skew information

The Wigner–Yanase skew information is defined as[30]

<math display="block">I(\varrho,H)=\mathrm{Tr}(H^2\varrho)-\mathrm{Tr}\!\left(\sqrt{\varrho}\,H\sqrt{\varrho}\,H\right).</math>

It follows that <math>I(\varrho,H)</math> is convex in <math>\varrho</math>.

For the quantum Fisher information and the Wigner–Yanase skew information, the inequality

<math display="block">F_Q[\varrho,H]\ge 4I(\varrho,H)</math>

holds, with equality for pure states.
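
A short sketch of the skew information and of the inequality (SciPy's matrix square root is assumed, and quantum_fisher_information refers to the earlier sketch):

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import sqrtm

def skew_information(rho: np.ndarray, H: np.ndarray) -> float:
    """I(rho, H) = Tr(H^2 rho) - Tr(sqrt(rho) H sqrt(rho) H)."""
    s = sqrtm(rho)
    return float(np.real(np.trace(H @ H @ rho) - np.trace(s @ H @ s @ H)))

# For any state: 4 * skew_information(rho, H) <= quantum_fisher_information(rho, H),
# with equality for pure states.
</syntaxhighlight>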

Relation to the variance

For any decomposition of the density matrix given by <math>p_k</math> and <math>|\Psi_k\rangle</math>, the relation[14]

<math display="block">(\Delta H)^2\ge\sum_k p_k(\Delta H)^2_{\Psi_k}\ge\frac{1}{4}F_Q[\varrho,H]</math>

holds, where both inequalities are tight. That is, there is a decomposition for which the second inequality is saturated, which is the same as stating that the quantum Fisher information is four times the convex roof of the variance, as discussed above. There is also a decomposition for which the first inequality is saturated, which means that the variance is its own concave roof,[14]

<math display="block">(\Delta H)^2=\sup_{\{p_k,|\Psi_k\rangle\}}\sum_k p_k(\Delta H)^2_{\Psi_k}.</math>

Uncertainty relations with the quantum Fisher information and the variance

Since the quantum Fisher information is four times the convex roof of the variance, we obtain the relation[31]

<math display="block">(\Delta A)^2 F_Q[\varrho,B]\ge\left|\langle i[A,B]\rangle\right|^2,</math>

which is stronger than the Heisenberg uncertainty relation. For a particle of spin <math>j</math>, the following uncertainty relation holds,

<math display="block">(\Delta J_x)^2+(\Delta J_y)^2+(\Delta J_z)^2\ge j,</math>

where <math>J_l</math> are the angular momentum components. The relation can be strengthened as[32][33]

<math display="block">(\Delta J_x)^2+(\Delta J_y)^2+F_Q[\varrho,J_z]/4\ge j.</math>
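
A quick numerical check of the first relation for a single qubit (reusing the quantum_fisher_information sketch from above; the random-state construction is purely illustrative):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
rho = X @ X.conj().T
rho = rho / np.real(np.trace(rho))                      # random mixed qubit state
jx = np.array([[0, 1], [1, 0]]) / 2                     # spin-1/2 operators j_x, j_y
jy = np.array([[0, -1j], [1j, 0]]) / 2
var_jx = np.real(np.trace(rho @ jx @ jx)) - np.real(np.trace(rho @ jx)) ** 2
rhs = np.real(np.trace(rho @ (1j * (jx @ jy - jy @ jx)))) ** 2
print(var_jx * quantum_fisher_information(rho, jy) >= rhs - 1e-12)   # True
</syntaxhighlight>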

References

Template:Reflist