Von Neumann entropy

John von Neumann, after whom the topic is named

In physics, the von Neumann entropy, named after John von Neumann, is a measure of the statistical uncertainty within a description of a quantum system. It extends the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics, and it is the quantum counterpart of the Shannon entropy from classical information theory. For a quantum-mechanical system described by a density matrix ρ, the von Neumann entropy is S = -\operatorname{tr}(\rho \ln \rho), where tr denotes the trace and ln denotes the matrix version of the natural logarithm.Template:Sfnm If the density matrix ρ is written in a basis of its eigenvectors |1\rangle, |2\rangle, |3\rangle, \dots as \rho = \sum_j \eta_j |j\rangle\langle j|, then the von Neumann entropy is merely S = -\sum_j \eta_j \ln \eta_j. In this form, S can be seen as the Shannon entropy of the eigenvalues \eta_j, reinterpreted as probabilities.Template:Sfnm
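As an illustration of the formula above, the following sketch computes the von Neumann entropy by diagonalizing a density matrix and applying the Shannon formula to its eigenvalues. It assumes NumPy is available; the helper name vn_entropy is ours, not a standard library function.

```python
import numpy as np

def vn_entropy(rho: np.ndarray) -> float:
    """Von Neumann entropy S = -tr(rho ln rho) of a density matrix rho.

    Computed by diagonalizing rho and summing -eta ln eta over its
    nonzero eigenvalues eta (the convention 0 ln 0 = 0 is used).
    """
    eta = np.linalg.eigvalsh(rho)      # real eigenvalues, since rho is Hermitian
    eta = eta[eta > 1e-12]             # drop numerically zero eigenvalues
    return float(-np.sum(eta * np.log(eta)))

# A pure state has zero entropy; the maximally mixed qubit state has entropy ln 2.
pure = np.array([[1, 0], [0, 0]], dtype=complex)
maximally_mixed = np.eye(2) / 2
print(vn_entropy(pure))             # ~0.0
print(vn_entropy(maximally_mixed))  # ~0.693 = ln 2
```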

The von Neumann entropy and quantities based upon it are widely used in the study of quantum entanglement.Template:Sfn

Fundamentals

In quantum mechanics, probabilities for the outcomes of experiments made upon a system are calculated from the quantum state describing that system. Each physical system is associated with a vector space, or more specifically a Hilbert space. The dimension of the Hilbert space may be infinite, as it is for the space of square-integrable functions on a line, which is used to define the quantum physics of a continuous degree of freedom. Alternatively, the Hilbert space may be finite-dimensional, as occurs for spin degrees of freedom. A density operator, the mathematical representation of a quantum state, is a positive semi-definite, self-adjoint operator of trace one acting on the Hilbert space of the system.[1]Template:Sfn[2] A density operator that is a rank-1 projection is known as a pure quantum state, and all quantum states that are not pure are designated mixed. Pure states are also known as wavefunctions. Assigning a pure state to a quantum system implies certainty about the outcome of some measurement on that system (i.e., P(x)=1 for some outcome x). The state space of a quantum system is the set of all states, pure and mixed, that can be assigned to it. For any system, the state space is a convex set: Any mixed state can be written as a convex combination of pure states, though not in a unique way.[3] The von Neumann entropy quantifies the extent to which a state is mixed.Template:Sfn
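As a small illustration of the last point, assuming NumPy, two different convex combinations of pure states can yield the same density operator; the example below reproduces the maximally mixed qubit state from two different pairs of pure states.

```python
import numpy as np

# Two orthonormal bases for a qubit
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)
plus = (zero + one) / np.sqrt(2)
minus = (zero - one) / np.sqrt(2)

# Equal-weight mixtures of |0>,|1> and of |+>,|-> give the same density operator I/2
mix_z = 0.5 * np.outer(zero, zero.conj()) + 0.5 * np.outer(one, one.conj())
mix_x = 0.5 * np.outer(plus, plus.conj()) + 0.5 * np.outer(minus, minus.conj())

print(np.allclose(mix_z, mix_x))  # True: the decomposition into pure states is not unique
```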

The prototypical example of a finite-dimensional Hilbert space is a qubit, a quantum system whose Hilbert space is 2-dimensional. An arbitrary state for a qubit can be written as a linear combination of the Pauli matrices, which provide a basis for 2 \times 2 self-adjoint matrices:Template:Sfnm \rho = \tfrac{1}{2}\left(I + r_x \sigma_x + r_y \sigma_y + r_z \sigma_z\right), where the real numbers (r_x, r_y, r_z) are the coordinates of a point within the unit ball and \sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad \sigma_y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \quad \sigma_z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}. The von Neumann entropy vanishes when \rho is a pure state, i.e., when the point (r_x, r_y, r_z) lies on the surface of the unit ball, and it attains its maximum value when \rho is the maximally mixed state, which is given by r_x = r_y = r_z = 0.Template:Sfnm
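The following sketch, assuming NumPy and reusing an entropy helper like the one above, evaluates the entropy at a point on the surface of the Bloch ball, at its center, and at an interior point.

```python
import numpy as np

sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

def qubit_state(rx, ry, rz):
    """Density matrix for the Bloch vector (rx, ry, rz), with rx^2 + ry^2 + rz^2 <= 1."""
    return 0.5 * (np.eye(2) + rx * sigma_x + ry * sigma_y + rz * sigma_z)

def vn_entropy(rho):
    eta = np.linalg.eigvalsh(rho)
    eta = eta[eta > 1e-12]
    return float(-np.sum(eta * np.log(eta)))

print(vn_entropy(qubit_state(0, 0, 1)))        # surface point: pure state, entropy ~0
print(vn_entropy(qubit_state(0, 0, 0)))        # center: maximally mixed, entropy ~ln 2
print(vn_entropy(qubit_state(0.3, 0.2, 0.1)))  # interior point: between 0 and ln 2
```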

Properties

Some properties of the von Neumann entropy:

The entropy is concave: for a probability distribution \{\lambda_i\} and density operators \rho_i, S\left(\sum_{i=1}^k \lambda_i \rho_i\right) \geq \sum_{i=1}^k \lambda_i S(\rho_i).

The entropy is additive for product states: S(\rho_A \otimes \rho_B) = S(\rho_A) + S(\rho_B).

The entropy is strongly subadditive: for a tripartite system ABC, S(\rho_{ABC}) + S(\rho_B) \leq S(\rho_{AB}) + S(\rho_{BC}).

This automatically means that the von Neumann entropy is subadditive:

S(\rho_{AC}) \leq S(\rho_A) + S(\rho_C).

Below, the concept of subadditivity is discussed, followed by its generalization to strong subadditivity.

Subadditivity

If \rho_A and \rho_B are the reduced density matrices of the general state \rho_{AB}, then \left|S(\rho_A) - S(\rho_B)\right| \leq S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B).

The right-hand inequality is known as subadditivity, and the left is sometimes known as the triangle inequality.Template:Sfn While in Shannon's theory the entropy of a composite system can never be lower than the entropy of any of its parts, in quantum theory this is not the case; i.e., it is possible that S(\rho_{AB}) = 0, while S(\rho_A) = S(\rho_B) > 0. This is expressed by saying that the Shannon entropy is monotonic but the von Neumann entropy is not.Template:Sfn For example, take the Bell state of two spin-1/2 particles: |\psi\rangle = \tfrac{1}{\sqrt{2}}\left(|{\uparrow\downarrow}\rangle + |{\downarrow\uparrow}\rangle\right). This is a pure state with zero entropy, but each spin has maximum entropy when considered individually, because its reduced density matrix is the maximally mixed state. This indicates that it is an entangled state;Template:Sfnm the use of entropy as an entanglement measure is discussed further below.
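A sketch of this example, assuming NumPy (the partial-trace helper is written here for two qubits only): the joint Bell state has zero entropy, while each reduced state is maximally mixed.

```python
import numpy as np

def vn_entropy(rho):
    eta = np.linalg.eigvalsh(rho)
    eta = eta[eta > 1e-12]
    return float(-np.sum(eta * np.log(eta)))

def partial_trace(rho_ab, keep, dims=(2, 2)):
    """Reduced density matrix of a two-qubit state; keep = 0 for subsystem A, 1 for B."""
    d_a, d_b = dims
    r = rho_ab.reshape(d_a, d_b, d_a, d_b)
    return np.einsum('ijkj->ik', r) if keep == 0 else np.einsum('ijik->jk', r)

# Bell state (|up,down> + |down,up>)/sqrt(2) in the product basis {|uu>, |ud>, |du>, |dd>}
psi = np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2)
rho_ab = np.outer(psi, psi.conj())

print(vn_entropy(rho_ab))                    # ~0: the joint state is pure
print(vn_entropy(partial_trace(rho_ab, 0)))  # ~ln 2: spin A alone is maximally mixed
print(vn_entropy(partial_trace(rho_ab, 1)))  # ~ln 2: spin B alone is maximally mixed
```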

Strong subadditivity

The von Neumann entropy is also strongly subadditive.Template:Sfn Given three Hilbert spaces, A, B, C, S(\rho_{ABC}) + S(\rho_B) \leq S(\rho_{AB}) + S(\rho_{BC}). By using the proof technique that establishes the left side of the triangle inequality above, one can show that the strong subadditivity inequality is equivalent to the following inequality: S(\rho_A) + S(\rho_C) \leq S(\rho_{AB}) + S(\rho_{BC}), where \rho_{AB}, etc. are the reduced density matrices of a density matrix \rho_{ABC}.[4] If we apply ordinary subadditivity to the left side of this inequality, we then find S(\rho_{AC}) \leq S(\rho_{AB}) + S(\rho_{BC}). By symmetry, for any tripartite state \rho_{ABC}, each of the three quantities S(\rho_{AB}), S(\rho_{BC}), S(\rho_{AC}) is less than or equal to the sum of the other two.[5]
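Strong subadditivity can be spot-checked numerically. The sketch below, assuming NumPy, draws random three-qubit density matrices (via a Ginibre construction, one convenient choice among many) and verifies the inequality on each sample.

```python
import numpy as np

rng = np.random.default_rng(0)

def vn_entropy(rho):
    eta = np.linalg.eigvalsh(rho)
    eta = eta[eta > 1e-12]
    return float(-np.sum(eta * np.log(eta)))

def random_density_matrix(dim):
    """Random density matrix rho = G G^dagger / tr(G G^dagger), G a complex Gaussian matrix."""
    g = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    rho = g @ g.conj().T
    return rho / np.trace(rho)

def reduced_state(rho, keep, dims=(2, 2, 2)):
    """Reduced density matrix on the subsystems listed in keep, e.g. (0, 1) for AB."""
    n = len(dims)
    t = rho.reshape(dims + dims)
    # Trace out unwanted subsystems, highest index first so axis numbers stay valid.
    for k in sorted(set(range(n)) - set(keep), reverse=True):
        half = t.ndim // 2
        t = np.trace(t, axis1=k, axis2=k + half)
    d = int(np.prod([dims[k] for k in keep]))
    return t.reshape(d, d)

# Check S(rho_ABC) + S(rho_B) <= S(rho_AB) + S(rho_BC) on random states
for _ in range(5):
    rho_abc = random_density_matrix(8)
    lhs = vn_entropy(rho_abc) + vn_entropy(reduced_state(rho_abc, (1,)))
    rhs = (vn_entropy(reduced_state(rho_abc, (0, 1)))
           + vn_entropy(reduced_state(rho_abc, (1, 2))))
    print(lhs <= rhs + 1e-10)  # True for every sample
```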

Minimum Shannon entropy

Given a quantum state and a specification of a quantum measurement, we can calculate the probabilities for the different possible results of that measurement, and thus we can find the Shannon entropy of that probability distribution. A quantum measurement can be specified mathematically as a positive operator-valued measure, or POVM.[6] In the simplest case, a system with a finite-dimensional Hilbert space and a measurement with a finite number of outcomes, a POVM is a set of positive semi-definite matrices \{F_i\} on the Hilbert space that sum to the identity matrix,Template:Sfn \sum_{i=1}^n F_i = I. The POVM element F_i is associated with the measurement outcome i, such that the probability of obtaining it when making a measurement on the quantum state \rho is given by \operatorname{Prob}(i) = \operatorname{tr}(\rho F_i). A POVM is rank-1 if all of its elements are proportional to rank-1 projection operators. The von Neumann entropy is the minimum achievable Shannon entropy, where the minimization is taken over all rank-1 POVMs.Template:Sfn
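For instance, assuming NumPy, the sketch below compares the Shannon entropy of the outcome distribution for two rank-1 projective measurements on the same qubit state: measuring in the eigenbasis of ρ reproduces the von Neumann entropy, while measuring in a rotated basis gives a larger Shannon entropy.

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

def vn_entropy(rho):
    return shannon_entropy(np.linalg.eigvalsh(rho))

rho = np.diag([0.9, 0.1]).astype(complex)  # a mixed state, diagonal in the sigma_z basis

# Rank-1 POVM #1: projective measurement in the eigenbasis of rho
basis_z = [np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)]
# Rank-1 POVM #2: projective measurement in the rotated basis (|0> +/- |1>)/sqrt(2)
basis_x = [np.array([1, 1], dtype=complex) / np.sqrt(2),
           np.array([1, -1], dtype=complex) / np.sqrt(2)]

def outcome_probs(rho, basis):
    """Prob(i) = tr(rho F_i) with the rank-1 POVM elements F_i = |v_i><v_i|."""
    return [float(np.real(v.conj() @ rho @ v)) for v in basis]

print(vn_entropy(rho))                               # ~0.325
print(shannon_entropy(outcome_probs(rho, basis_z)))  # ~0.325: equals the von Neumann entropy
print(shannon_entropy(outcome_probs(rho, basis_x)))  # ~0.693: strictly larger
```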

Holevo χ quantity

If \rho_1, \dots, \rho_k are density operators and \lambda_1, \dots, \lambda_k is a collection of positive numbers which sum to unity (\sum_i \lambda_i = 1), then \rho = \sum_{i=1}^k \lambda_i \rho_i is a valid density operator, and the difference between its von Neumann entropy and the weighted average of the entropies of the \rho_i is bounded by the Shannon entropy of the \lambda_i: S\left(\sum_{i=1}^k \lambda_i \rho_i\right) - \sum_{i=1}^k \lambda_i S(\rho_i) \leq -\sum_{i=1}^k \lambda_i \log \lambda_i. Equality is attained when the supports of the \rho_i – the spaces spanned by their eigenvectors corresponding to nonzero eigenvalues – are orthogonal. The difference on the left-hand side of this inequality is known as the Holevo χ quantity and also appears in Holevo's theorem, an important result in quantum information theory.Template:Sfn
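A sketch, assuming NumPy, that evaluates the Holevo χ quantity for a two-state ensemble and checks the bound by the Shannon entropy of the weights (the function names are ours):

```python
import numpy as np

def vn_entropy(rho):
    eta = np.linalg.eigvalsh(rho)
    eta = eta[eta > 1e-12]
    return float(-np.sum(eta * np.log(eta)))

def holevo_chi(weights, states):
    """chi = S(sum_i lambda_i rho_i) - sum_i lambda_i S(rho_i)."""
    rho_avg = sum(w * rho for w, rho in zip(weights, states))
    return vn_entropy(rho_avg) - sum(w * vn_entropy(rho) for w, rho in zip(weights, states))

# Ensemble: the non-orthogonal pure states |0> and (|0>+|1>)/sqrt(2), with equal weights
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho1 = np.outer(plus, plus.conj())
weights = [0.5, 0.5]

chi = holevo_chi(weights, [rho0, rho1])
shannon = -sum(w * np.log(w) for w in weights)
print(chi)             # ~0.416
print(chi <= shannon)  # True: chi never exceeds the Shannon entropy of the weights (here ln 2)
```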

Change under time evolution

Unitary

The time evolution of an isolated system is described by a unitary operator: \rho \mapsto U \rho U^\dagger. Unitary evolution takes pure states into pure states,Template:Sfnm and it leaves the von Neumann entropy unchanged. This follows from the fact that the entropy of \rho is a function only of the eigenvalues of \rho.Template:Sfnm
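A quick numerical illustration, assuming NumPy and SciPy (the Hamiltonian here is an arbitrary Hermitian matrix chosen for the example):

```python
import numpy as np
from scipy.linalg import expm

def vn_entropy(rho):
    eta = np.linalg.eigvalsh(rho)
    eta = eta[eta > 1e-12]
    return float(-np.sum(eta * np.log(eta)))

rho = np.diag([0.7, 0.3]).astype(complex)

# Unitary generated by a Hermitian "Hamiltonian" H: U = exp(-i H t)
H = np.array([[1.0, 0.5], [0.5, -1.0]], dtype=complex)
U = expm(-1j * H * 0.7)

rho_evolved = U @ rho @ U.conj().T
print(vn_entropy(rho))          # ~0.611
print(vn_entropy(rho_evolved))  # same value: the spectrum, and hence the entropy, is unchanged
```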

Measurement

A measurement upon a quantum system will generally bring about a change of the quantum state of that system. A POVM by itself does not provide the complete information necessary to describe this state-change process.Template:Sfn To remedy this, further information is specified by decomposing each POVM element into a product: E_i = A_i^\dagger A_i. The Kraus operators A_i, named for Karl Kraus, provide a specification of the state-change process. They are not necessarily self-adjoint, but the products A_i^\dagger A_i are. If upon performing the measurement the outcome E_i is obtained, then the initial state \rho is updated to \rho \to \rho' = \frac{A_i \rho A_i^\dagger}{\operatorname{Prob}(i)} = \frac{A_i \rho A_i^\dagger}{\operatorname{tr}(\rho E_i)}. An important special case is the Lüders rule, named for Gerhart Lüders.[7][8] If the POVM elements are projection operators \Pi_i, then the Kraus operators can be taken to be the projectors themselves: \rho \to \rho' = \frac{\Pi_i \rho \Pi_i}{\operatorname{tr}(\rho \Pi_i)}. If the initial state \rho is pure and the projectors \Pi_i have rank 1, they can be written as projectors onto the vectors |\psi\rangle and |i\rangle, respectively. The formula thus simplifies to \rho = |\psi\rangle\langle\psi| \to \rho' = \frac{|i\rangle\langle i|\psi\rangle\langle\psi|i\rangle\langle i|}{|\langle i|\psi\rangle|^2} = |i\rangle\langle i|. We can define a linear, trace-preserving, completely positive map by summing over all the possible post-measurement states of a POVM without the normalisation: \rho \to \sum_i A_i \rho A_i^\dagger. It is an example of a quantum channel,Template:Sfn and can be interpreted as expressing how a quantum state changes if a measurement is performed but the result of that measurement is lost.Template:Sfn Channels defined by projective measurements can never decrease the von Neumann entropy; they leave the entropy unchanged only if they do not change the density matrix.Template:Sfn More generally, a quantum channel will increase or leave constant the von Neumann entropy of every input state if and only if the channel is unital, i.e., if it leaves the maximally mixed state fixed. An example of a channel that decreases the von Neumann entropy is the amplitude damping channel for a qubit, which sends all mixed states towards a pure state.Template:Sfn
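The sketch below, assuming NumPy, implements the Lüders update for a projective measurement of a qubit and the corresponding "measure and forget the outcome" channel, illustrating that the latter does not decrease the entropy.

```python
import numpy as np

def vn_entropy(rho):
    eta = np.linalg.eigvalsh(rho)
    eta = eta[eta > 1e-12]
    return float(-np.sum(eta * np.log(eta)))

# Projectors for a sigma_x measurement: Pi_+ = |+><+| and Pi_- = |-><-|
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
minus = np.array([1, -1], dtype=complex) / np.sqrt(2)
projectors = [np.outer(plus, plus.conj()), np.outer(minus, minus.conj())]

def luders_update(rho, proj):
    """Post-measurement state Pi rho Pi / tr(rho Pi), and the outcome probability."""
    prob = float(np.real(np.trace(rho @ proj)))
    return proj @ rho @ proj / prob, prob

def measure_and_forget(rho, projectors):
    """Channel rho -> sum_i Pi_i rho Pi_i: measure, then discard the result."""
    return sum(p @ rho @ p for p in projectors)

rho = np.diag([0.8, 0.2]).astype(complex)  # mixed state, diagonal in the sigma_z basis

post, prob = luders_update(rho, projectors[0])
print(prob)                                             # 0.5
print(vn_entropy(rho))                                  # ~0.500
print(vn_entropy(measure_and_forget(rho, projectors)))  # ~0.693: the entropy did not decrease
```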

Thermodynamic meaning

The quantum version of the canonical distribution, the Gibbs state, is found by maximizing the von Neumann entropy under the constraint that the expected value of the Hamiltonian is fixed. A Gibbs state is a density operator with the same eigenvectors as the Hamiltonian, and its eigenvalues are \lambda_i = \frac{1}{Z} \exp\left(-\frac{E_i}{k_B T}\right), where E_i is the corresponding energy eigenvalue, T is the temperature, k_B is the Boltzmann constant, and Z is the partition function.Template:Sfnm[9] The von Neumann entropy of a Gibbs state is, up to a factor k_B, the thermodynamic entropy.Template:Sfnm
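A sketch, assuming NumPy and working in units where k_B = 1, that builds the Gibbs state of a two-level Hamiltonian from the formula above and shows its entropy growing with temperature:

```python
import numpy as np

def vn_entropy(rho):
    eta = np.linalg.eigvalsh(rho)
    eta = eta[eta > 1e-12]
    return float(-np.sum(eta * np.log(eta)))

def gibbs_state(H, T, k_B=1.0):
    """Density operator with eigenvalues exp(-E_i/(k_B T))/Z in the eigenbasis of H."""
    energies, vectors = np.linalg.eigh(H)
    weights = np.exp(-energies / (k_B * T))
    lam = weights / weights.sum()                 # lambda_i = exp(-E_i / k_B T) / Z
    return vectors @ np.diag(lam) @ vectors.conj().T

H = np.diag([0.0, 1.0])          # two-level Hamiltonian with energies 0 and 1
for T in (0.1, 1.0, 10.0):
    print(T, vn_entropy(gibbs_state(H, T)))  # entropy rises from ~0 toward ln 2 as T increases
```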

Generalizations and derived quantities

Conditional entropy

Let \rho_{AB} be a joint state for the bipartite quantum system AB. Then the conditional von Neumann entropy S(A|B) is the difference between the entropy of \rho_{AB} and the entropy of the marginal state for subsystem B alone: S(A|B) = S(\rho_{AB}) - S(\rho_B). This is bounded above by S(\rho_A). In other words, conditioning the description of subsystem A upon subsystem B cannot increase the entropy associated with A.Template:Sfn

Quantum mutual information can be defined as the difference between the entropy of the joint state and the total entropy of the marginals: S(A:B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}), which can also be expressed in terms of conditional entropy:Template:Sfn S(A:B) = S(A) - S(A|B) = S(B) - S(B|A).
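Both quantities can be computed directly from the joint density matrix, as in this sketch (assuming NumPy; the two-qubit partial-trace helper is the same kind used earlier). Note that the conditional entropy of a Bell state is negative, which has no classical counterpart.

```python
import numpy as np

def vn_entropy(rho):
    eta = np.linalg.eigvalsh(rho)
    eta = eta[eta > 1e-12]
    return float(-np.sum(eta * np.log(eta)))

def partial_trace(rho_ab, keep, dims=(2, 2)):
    d_a, d_b = dims
    r = rho_ab.reshape(d_a, d_b, d_a, d_b)
    return np.einsum('ijkj->ik', r) if keep == 0 else np.einsum('ijik->jk', r)

def conditional_entropy(rho_ab):
    """S(A|B) = S(rho_AB) - S(rho_B)."""
    return vn_entropy(rho_ab) - vn_entropy(partial_trace(rho_ab, 1))

def mutual_information(rho_ab):
    """S(A:B) = S(rho_A) + S(rho_B) - S(rho_AB)."""
    return (vn_entropy(partial_trace(rho_ab, 0))
            + vn_entropy(partial_trace(rho_ab, 1))
            - vn_entropy(rho_ab))

# Bell state: S(A|B) = -ln 2 (negative!), S(A:B) = 2 ln 2
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
bell = np.outer(psi, psi.conj())
print(conditional_entropy(bell), mutual_information(bell))

# Product of two maximally mixed qubits: S(A|B) = ln 2, S(A:B) = 0
product = np.kron(np.eye(2) / 2, np.eye(2) / 2)
print(conditional_entropy(product), mutual_information(product))
```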

Relative entropy

Let \rho and \sigma be two density operators in the same state space. The relative entropy is defined to be S(\sigma | \rho) = \operatorname{tr}[\rho (\log \rho - \log \sigma)]. The relative entropy is always greater than or equal to zero; it equals zero if and only if \rho = \sigma.Template:Sfnm Unlike the von Neumann entropy itself, the relative entropy is monotonic, in that it decreases (or remains constant) when part of a system is traced over:Template:Sfn S(\sigma_A | \rho_A) \leq S(\sigma_{AB} | \rho_{AB}).
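A sketch of the defining formula, assuming NumPy and SciPy and full-rank states (so that the matrix logarithms exist); the argument order of the helper follows the trace expression above.

```python
import numpy as np
from scipy.linalg import logm

def relative_entropy(rho, sigma):
    """tr[rho (log rho - log sigma)] for full-rank density matrices rho and sigma."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

rho = np.diag([0.7, 0.3]).astype(complex)
sigma = np.diag([0.5, 0.5]).astype(complex)

print(relative_entropy(rho, sigma))  # ~0.082: strictly positive, since rho != sigma
print(relative_entropy(rho, rho))    # ~0.0: vanishes only when the two states coincide
```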

Entanglement measures

Just as energy is a resource that facilitates mechanical operations, entanglement is a resource that facilitates performing tasks that involve communication and computation.Template:Sfnm The mathematical definition of entanglement can be paraphrased as saying that maximal knowledge about the whole of a system does not imply maximal knowledge about the individual parts of that system.Template:Sfn If the quantum state that describes a pair of particles is entangled, then the results of measurements upon one half of the pair can be strongly correlated with the results of measurements upon the other. However, entanglement is not the same as "correlation" as understood in classical probability theory and in daily life. Instead, entanglement can be thought of as potential correlation that can be used to generate actual correlation in an appropriate experiment.[10] The state of a composite system is always expressible as a sum, or superposition, of products of states of local constituents; it is entangled if this sum cannot be written as a single product term.Template:Sfn Entropy provides one tool that can be used to quantify entanglement.[11][12] If the overall system is described by a pure state, the entropy of one subsystem can be used to measure its degree of entanglement with the other subsystems. For bipartite pure states, the von Neumann entropy of reduced states is the unique measure of entanglement in the sense that it is the only function on the family of states that satisfies certain axioms required of an entanglement measure.Template:Sfn[13] It is thus known as the entanglement entropy.Template:Sfn

It is a classical result that the Shannon entropy achieves its maximum at, and only at, the uniform probability distribution \{1/n, \dots, 1/n\}.Template:Sfn Therefore, a bipartite pure state \rho is said to be a maximally entangled state if the reduced state of each subsystem of \rho is the diagonal matrix[14] \begin{pmatrix} 1/n & & \\ & \ddots & \\ & & 1/n \end{pmatrix}.

For mixed states, the reduced von Neumann entropy is not the only reasonable entanglement measure.Template:Sfnm Some of the other measures are also entropic in character. For example, the relative entropy of entanglement is given by minimizing the relative entropy between a given state \rho and the set of nonentangled, or separable, states.Template:Sfn The entanglement of formation is defined by minimizing, over all possible ways of writing \rho as a convex combination of pure states, the average entanglement entropy of those pure states.Template:Sfn The squashed entanglement is based on the idea of extending a bipartite state \rho_{AB} to a state describing a larger system, \rho_{ABE}, such that the partial trace of \rho_{ABE} over E yields \rho_{AB}. One then finds the infimum of the quantity \tfrac{1}{2}\left[S(\rho_{AE}) + S(\rho_{BE}) - S(\rho_E) - S(\rho_{ABE})\right] over all possible choices of \rho_{ABE}.Template:Sfn

Quantum Rényi entropies

Just as the Shannon entropy function is one member of the broader family of classical Rényi entropies, so too can the von Neumann entropy be generalized to the quantum Rényi entropies: S_\alpha(\rho) = \frac{1}{1-\alpha} \ln\left[\operatorname{tr} \rho^\alpha\right] = \frac{1}{1-\alpha} \ln \sum_{i=1}^N \lambda_i^\alpha. In the limit \alpha \to 1, this recovers the von Neumann entropy. The quantum Rényi entropies are all additive for product states, and for any \alpha, the Rényi entropy S_\alpha vanishes for pure states and is maximized by the maximally mixed state. For any given state \rho, S_\alpha(\rho) is a continuous, nonincreasing function of the parameter \alpha. A weak version of subadditivity can be proven: S_\alpha(\rho_A) - S_0(\rho_B) \leq S_\alpha(\rho_{AB}) \leq S_\alpha(\rho_A) + S_0(\rho_B). Here, S_0 is the quantum version of the Hartley entropy, i.e., the logarithm of the rank of the density matrix.Template:Sfn
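A sketch, assuming NumPy, that evaluates the quantum Rényi entropies from the eigenvalues of a density matrix and illustrates that they are nonincreasing in α and reduce to the von Neumann entropy as α → 1:

```python
import numpy as np

def renyi_entropy(rho, alpha):
    """Quantum Renyi entropy S_alpha(rho); the alpha -> 1 limit is the von Neumann entropy."""
    eta = np.linalg.eigvalsh(rho)
    eta = eta[eta > 1e-12]
    if np.isclose(alpha, 1.0):            # von Neumann entropy as the alpha -> 1 limit
        return float(-np.sum(eta * np.log(eta)))
    if alpha == 0:                        # Hartley entropy: logarithm of the rank
        return float(np.log(len(eta)))
    return float(np.log(np.sum(eta ** alpha)) / (1 - alpha))

rho = np.diag([0.5, 0.3, 0.2]).astype(complex)
for alpha in (0, 0.5, 1.0, 2, 10):
    print(alpha, renyi_entropy(rho, alpha))  # values decrease as alpha grows; alpha = 1 gives S(rho)
```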

History

The density matrix was introduced, with different motivations, by von Neumann and by Lev Landau. The motivation that inspired Landau was the impossibility of describing a subsystem of a composite quantum system by a state vector.[15] On the other hand, von Neumann introduced the density matrix in order to develop both quantum statistical mechanics and a theory of quantum measurements.[16] He introduced the expression now known as von Neumann entropy by arguing that a probabilistic combination of pure states is analogous to a mixture of ideal gases.Template:Sfn[17] Von Neumann first published on the topic in 1927.[18] His argument was built upon earlier work by Albert Einstein and Leo Szilard.[19][20][21]

Max Delbrück and Gert Molière proved the concavity and subadditivity properties of the von Neumann entropy in 1936. Quantum relative entropy was introduced by Hisaharu Umegaki in 1962.Template:Sfn[22] The subadditivity and triangle inequalities were proved in 1970 by Huzihiro Araki and Elliott H. Lieb.[23] Strong subadditivity is a more difficult theorem. It was conjectured by Oscar Lanford and Derek Robinson in 1968.[24] Lieb and Mary Beth Ruskai proved the theorem in 1973,[25][26] using a matrix inequality proved earlier by Lieb.[27]Template:Sfn

References

Template:Reflist
