Von Neumann entropy

In physics, the von Neumann entropy, named after John von Neumann, is a measure of the statistical uncertainty within a description of a quantum system. It extends the concept of Gibbs entropy from classical statistical mechanics to quantum statistical mechanics, and it is the quantum counterpart of the Shannon entropy from classical information theory. For a quantum-mechanical system described by a density matrix $\rho$, the von Neumann entropy is
$$S = -\operatorname{Tr}(\rho \ln \rho),$$
where $\operatorname{Tr}$ denotes the trace and $\ln$ denotes the matrix version of the natural logarithm. If the density matrix $\rho$ is written in a basis of its eigenvectors as
$$\rho = \sum_j \eta_j |j\rangle\langle j|,$$
then the von Neumann entropy is merely
$$S = -\sum_j \eta_j \ln \eta_j.$$
In this form, $S$ can be seen as the Shannon entropy of the eigenvalues $\eta_j$, reinterpreted as probabilities.
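The eigenvalue form translates directly into code. The sketch below (using NumPy; the function name is our own choice) computes $S$ from the spectrum of a density matrix:

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    eigenvalues = np.linalg.eigvalsh(rho)
    # By the convention 0 ln 0 = 0, zero eigenvalues contribute nothing.
    positive = eigenvalues[eigenvalues > 1e-12]
    return float(-np.sum(positive * np.log(positive)))

# A pure state has zero entropy; the maximally mixed qubit state has entropy ln 2.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])
maximally_mixed = np.eye(2) / 2
```

Because only the eigenvalues enter, the same function works for any finite-dimensional density matrix regardless of basis.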
The von Neumann entropy and quantities based upon it are widely used in the study of quantum entanglement.
Fundamentals
In quantum mechanics, probabilities for the outcomes of experiments made upon a system are calculated from the quantum state describing that system. Each physical system is associated with a vector space, or more specifically a Hilbert space. The dimension of the Hilbert space may be infinite, as it is for the space of square-integrable functions on a line, which is used to define the quantum physics of a continuous degree of freedom. Alternatively, the Hilbert space may be finite-dimensional, as occurs for spin degrees of freedom. A density operator, the mathematical representation of a quantum state, is a positive semi-definite, self-adjoint operator of trace one acting on the Hilbert space of the system.[1][2] A density operator that is a rank-1 projection is known as a pure quantum state, and all quantum states that are not pure are designated mixed. Pure states are also known as wavefunctions. Assigning a pure state to a quantum system implies certainty about the outcome of some measurement on that system (i.e., $P(x) = 1$ for some outcome $x$). The state space of a quantum system is the set of all states, pure and mixed, that can be assigned to it. For any system, the state space is a convex set: any mixed state can be written as a convex combination of pure states, though not in a unique way.[3] The von Neumann entropy quantifies the extent to which a state is mixed.
The prototypical example of a finite-dimensional Hilbert space is a qubit, a quantum system whose Hilbert space is 2-dimensional. An arbitrary state for a qubit can be written as a linear combination of the Pauli matrices, which together with the identity matrix provide a basis for $2 \times 2$ self-adjoint matrices:
$$\rho = \frac{1}{2}\left(I + r_x \sigma_x + r_y \sigma_y + r_z \sigma_z\right),$$
where the real numbers $(r_x, r_y, r_z)$ are the coordinates of a point within the unit ball and
$$\sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad \sigma_y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \quad \sigma_z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}.$$
The von Neumann entropy vanishes when $\rho$ is a pure state, i.e., when the point $(r_x, r_y, r_z)$ lies on the surface of the unit ball, and it attains its maximum value $\ln 2$ when $\rho$ is the maximally mixed state, which is given by $r_x = r_y = r_z = 0$.
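The Bloch-ball picture can be sketched numerically (helper names are our own): the eigenvalues of $\rho$ are $(1 \pm |\vec r\,|)/2$, so the entropy depends only on the distance of the point from the centre of the ball.

```python
import numpy as np

sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)
sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

def qubit_state(rx, ry, rz):
    """Density matrix for the Bloch vector (rx, ry, rz), with |r| <= 1."""
    return 0.5 * (np.eye(2) + rx * sigma_x + ry * sigma_y + rz * sigma_z)

def entropy(rho):
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return float(-np.sum(vals * np.log(vals)))

surface = entropy(qubit_state(0.0, 0.0, 1.0))  # pure state on the sphere's surface
center = entropy(qubit_state(0.0, 0.0, 0.0))   # maximally mixed state at the centre
```

The entropy interpolates between these extremes as the Bloch vector shrinks from the surface toward the origin.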
Properties
Some properties of the von Neumann entropy:
- $S(\rho)$ is zero if and only if $\rho$ represents a pure state.
- $S(\rho)$ is maximal and equal to $\ln N$ for a maximally mixed state, $N$ being the dimension of the Hilbert space.
- $S(\rho)$ is invariant under changes in the basis of $\rho$, that is, $S(\rho) = S(U\rho U^\dagger)$, with $U$ a unitary transformation.
- $S(\rho)$ is concave, that is, given a collection of positive numbers $\lambda_i$ which sum to unity ($\sum_i \lambda_i = 1$) and density operators $\rho_i$, we have $$S\!\left(\sum_i \lambda_i \rho_i\right) \geq \sum_i \lambda_i S(\rho_i).$$
- $S(\rho)$ is additive for independent systems. Given two density matrices $\rho_A, \rho_B$ describing independent systems A and B, we have $$S(\rho_A \otimes \rho_B) = S(\rho_A) + S(\rho_B).$$
- $S(\rho)$ is strongly subadditive. That is, for any three systems A, B, and C: $$S(\rho_{ABC}) + S(\rho_B) \leq S(\rho_{AB}) + S(\rho_{BC}).$$
- This automatically means that $S(\rho)$ is subadditive: $$S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B).$$
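Several of these properties can be spot-checked numerically. The sketch below (helper names are ours) verifies unitary invariance, concavity, and additivity on randomly generated qubit states:

```python
import numpy as np

def entropy(rho):
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return float(-np.sum(vals * np.log(vals)))

def random_density_matrix(n, rng):
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    rho = a @ a.conj().T               # positive semi-definite by construction
    return rho / np.trace(rho).real    # normalise to trace one

rng = np.random.default_rng(7)
rho1 = random_density_matrix(2, rng)
rho2 = random_density_matrix(2, rng)

# Unitary invariance: S(U rho U†) = S(rho), with U drawn from a QR decomposition.
u, _ = np.linalg.qr(rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2)))
invariance_gap = abs(entropy(u @ rho1 @ u.conj().T) - entropy(rho1))

# Concavity: S(lam*rho1 + (1-lam)*rho2) >= lam*S(rho1) + (1-lam)*S(rho2).
lam = 0.3
concavity_gap = entropy(lam * rho1 + (1 - lam) * rho2) - (
    lam * entropy(rho1) + (1 - lam) * entropy(rho2))

# Additivity for independent systems: S(rho1 ⊗ rho2) = S(rho1) + S(rho2).
additivity_gap = abs(entropy(np.kron(rho1, rho2)) - entropy(rho1) - entropy(rho2))
```

A numerical check on random states is of course not a proof, but it is a quick guard against misremembering the direction of an inequality.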
Below, the concept of subadditivity is discussed, followed by its generalization to strong subadditivity.
Subadditivity
If $\rho_A, \rho_B$ are the reduced density matrices of the general state $\rho_{AB}$, then
$$\left|S(\rho_A) - S(\rho_B)\right| \leq S(\rho_{AB}) \leq S(\rho_A) + S(\rho_B).$$
The right-hand inequality is known as subadditivity, and the left is sometimes known as the triangle inequality. While in Shannon's theory the entropy of a composite system can never be lower than the entropy of any of its parts, in quantum theory this is not the case; i.e., it is possible that $S(\rho_{AB}) = 0$ while $S(\rho_A) = S(\rho_B) > 0$. This is expressed by saying that the Shannon entropy is monotonic but the von Neumann entropy is not. For example, take the Bell state of two spin-1/2 particles:
$$|\psi\rangle = \frac{1}{\sqrt{2}}\left(|{\uparrow\downarrow}\rangle - |{\downarrow\uparrow}\rangle\right).$$
This is a pure state with zero entropy, but each spin has maximum entropy when considered individually, because its reduced density matrix is the maximally mixed state. This indicates that it is an entangled state; the use of entropy as an entanglement measure is discussed further below.
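This Bell-state example admits a short numerical check (the partial-trace helper is our own construction):

```python
import numpy as np

def entropy(rho):
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return float(-np.sum(vals * np.log(vals)))

def trace_out_b(rho, dim_a, dim_b):
    """Reduced density matrix for subsystem A."""
    return np.trace(rho.reshape(dim_a, dim_b, dim_a, dim_b), axis1=1, axis2=3)

# Singlet Bell state (|01> - |10>)/sqrt(2) of two spin-1/2 particles.
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho_ab = np.outer(psi, psi.conj())
rho_a = trace_out_b(rho_ab, 2, 2)

joint_entropy = entropy(rho_ab)    # zero: the joint state is pure
marginal_entropy = entropy(rho_a)  # ln 2: each spin alone is maximally mixed
```

The joint entropy vanishes while each marginal entropy is maximal, exactly the failure of monotonicity described above.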
Strong subadditivity
The von Neumann entropy is also strongly subadditive. Given three Hilbert spaces, A, B, C,
$$S(\rho_{ABC}) + S(\rho_B) \leq S(\rho_{AB}) + S(\rho_{BC}).$$
By using the proof technique that establishes the left side of the triangle inequality above, one can show that the strong subadditivity inequality is equivalent to the following inequality:
$$S(\rho_A) + S(\rho_C) \leq S(\rho_{AB}) + S(\rho_{BC}),$$
where $\rho_{AB}$, etc. are the reduced density matrices of a density matrix $\rho_{ABC}$.[4] If we apply ordinary subadditivity to the left side of this inequality, we then find
$$S(\rho_{AC}) \leq S(\rho_{AB}) + S(\rho_{BC}).$$
By symmetry, for any tripartite state $\rho_{ABC}$, each of the three numbers $S(\rho_{AB}), S(\rho_{BC}), S(\rho_{AC})$ is less than or equal to the sum of the other two.[5]
Minimum Shannon entropy
Given a quantum state and a specification of a quantum measurement, we can calculate the probabilities for the different possible results of that measurement, and thus we can find the Shannon entropy of that probability distribution. A quantum measurement can be specified mathematically as a positive operator-valued measure, or POVM.[6] In the simplest case, a system with a finite-dimensional Hilbert space and a measurement with a finite number of outcomes, a POVM is a set of positive semi-definite matrices $\{F_i\}$ on the Hilbert space that sum to the identity matrix,
$$\sum_i F_i = I.$$
The POVM element $F_i$ is associated with the measurement outcome $i$, such that the probability of obtaining it when making a measurement on the quantum state $\rho$ is given by
$$\operatorname{Prob}(i) = \operatorname{Tr}(\rho F_i).$$
A POVM is rank-1 if all of the elements are proportional to rank-1 projection operators. The von Neumann entropy is the minimum achievable Shannon entropy, where the minimization is taken over all rank-1 POVMs.
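The minimization can be illustrated on a qubit (helper names are ours): measuring in the eigenbasis of $\rho$ reproduces the von Neumann entropy exactly, while a rotated rank-1 projective measurement yields a strictly larger Shannon entropy.

```python
import numpy as np

rho = np.diag([0.8, 0.2]).astype(complex)  # already diagonal: eigenbasis is {|0>, |1>}

def shannon(p):
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

def measurement_probs(rho, basis):
    # Rank-1 projective POVM with elements F_i = |v_i><v_i|: Prob(i) = <v_i|rho|v_i>.
    return np.array([np.real(v.conj() @ rho @ v) for v in basis])

eigenbasis = [np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)]
rotated = [np.array([1, 1], dtype=complex) / np.sqrt(2),
           np.array([1, -1], dtype=complex) / np.sqrt(2)]

S_vn = shannon(np.array([0.8, 0.2]))                   # von Neumann entropy of rho
H_eigen = shannon(measurement_probs(rho, eigenbasis))  # equals S_vn
H_rotated = shannon(measurement_probs(rho, rotated))   # larger (ln 2 for this choice)
```

For this state the rotated measurement gives outcome probabilities of one half each, so its Shannon entropy is the maximal $\ln 2$.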
Holevo χ quantity
If $\rho_i$ are density operators and $\lambda_i$ is a collection of positive numbers which sum to unity ($\sum_i \lambda_i = 1$), then
$$\rho = \sum_i \lambda_i \rho_i$$
is a valid density operator, and the difference between its von Neumann entropy and the weighted average of the entropies of the $\rho_i$ is bounded by the Shannon entropy of the $\lambda_i$:
$$S(\rho) - \sum_i \lambda_i S(\rho_i) \leq -\sum_i \lambda_i \ln \lambda_i.$$
Equality is attained when the supports of the $\rho_i$ – the spaces spanned by their eigenvectors corresponding to nonzero eigenvalues – are orthogonal. The difference on the left-hand side of this inequality is known as the Holevo χ quantity and also appears in Holevo's theorem, an important result in quantum information theory.
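A small numeric sketch of the bound (variable names ours): mixing two non-orthogonal pure states gives a χ strictly below the Shannon entropy of the weights, while orthogonal supports saturate it.

```python
import numpy as np

def entropy(rho):
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return float(-np.sum(vals * np.log(vals)))

def holevo_chi(weights, states):
    mixture = sum(w * s for w, s in zip(weights, states))
    return entropy(mixture) - sum(w * entropy(s) for w, s in zip(weights, states))

weights = [0.5, 0.5]
shannon_bound = np.log(2)  # -sum(w ln w) for equal weights

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
ket_plus = (ket0 + ket1) / np.sqrt(2)
proj = lambda v: np.outer(v, v.conj())

chi_overlapping = holevo_chi(weights, [proj(ket0), proj(ket_plus)])  # < ln 2
chi_orthogonal = holevo_chi(weights, [proj(ket0), proj(ket1)])       # = ln 2
```

Since the mixed states here are pure, the weighted average of their entropies vanishes and χ reduces to the entropy of the mixture itself.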
Change under time evolution
Unitary
The time evolution of an isolated system is described by a unitary operator: $\rho \to U\rho U^\dagger$. Unitary evolution takes pure states into pure states, and it leaves the von Neumann entropy unchanged. This follows from the fact that the entropy of $\rho$ is a function only of the eigenvalues of $\rho$, which a unitary conjugation does not alter.
Measurement
A measurement upon a quantum system will generally bring about a change of the quantum state of that system. Writing a POVM does not provide the complete information necessary to describe this state-change process. To remedy this, further information is specified by decomposing each POVM element into a product:
$$F_i = A_i^\dagger A_i.$$
The Kraus operators $A_i$, named for Karl Kraus, provide a specification of the state-change process. They are not necessarily self-adjoint, but the products $A_i^\dagger A_i$ are. If upon performing the measurement the outcome $F_i$ is obtained, then the initial state $\rho$ is updated to
$$\rho \to \rho' = \frac{A_i \rho A_i^\dagger}{\operatorname{Tr}(\rho F_i)}.$$
An important special case is the Lüders rule, named for Gerhart Lüders.[7][8] If the POVM elements are projection operators $\Pi_i$, then the Kraus operators can be taken to be the projectors themselves: $A_i = \Pi_i$. If the initial state $\rho$ is pure, and the projectors $\Pi_i$ have rank 1, they can be written as projectors onto the vectors $|\psi\rangle$ and $|i\rangle$, respectively. The formula simplifies thus to
$$\rho = |\psi\rangle\langle\psi| \to \rho' = |i\rangle\langle i|.$$
We can define a linear, trace-preserving, completely positive map, by summing over all the possible post-measurement states of a POVM without the normalisation:
$$\rho \to \sum_i A_i \rho A_i^\dagger.$$
It is an example of a quantum channel, and can be interpreted as expressing how a quantum state changes if a measurement is performed but the result of that measurement is lost. Channels defined by projective measurements can never decrease the von Neumann entropy; they leave the entropy unchanged only if they do not change the density matrix. A quantum channel will increase or leave constant the von Neumann entropy of every input state if and only if the channel is unital, i.e., if it leaves fixed the maximally mixed state. An example of a channel that decreases the von Neumann entropy is the amplitude damping channel for a qubit, which sends all mixed states towards a pure state.
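These channel facts can be sketched as follows (the Kraus-operator lists are standard textbook forms, though the variable names are ours): a projective-measurement channel raises the entropy of a pure superposition, while the amplitude damping channel, which is not unital, lowers the entropy of the maximally mixed state.

```python
import numpy as np

def entropy(rho):
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return float(-np.sum(vals * np.log(vals)))

def apply_channel(rho, kraus_ops):
    """rho -> sum_i A_i rho A_i† (measurement performed, result lost)."""
    return sum(a @ rho @ a.conj().T for a in kraus_ops)

# Projective measurement in the {|0>, |1>} basis applied to the pure state |+>.
projectors = [np.diag([1, 0]).astype(complex), np.diag([0, 1]).astype(complex)]
plus = np.full((2, 2), 0.5, dtype=complex)   # |+><+|, a pure state with S = 0
dephased = apply_channel(plus, projectors)   # maximally mixed: entropy rises to ln 2

# Amplitude damping with decay probability gamma; drives every state towards |0>.
gamma = 0.5
damping = [np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex),
           np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)]
damped = apply_channel(np.eye(2, dtype=complex) / 2, damping)  # entropy < ln 2
```

In both cases one can check that the Kraus operators satisfy $\sum_i A_i^\dagger A_i = I$, so the maps are trace-preserving.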
Thermodynamic meaning
The Gibbs states, the quantum version of the canonical distribution, are found by maximizing the von Neumann entropy under the constraint that the expected value of the Hamiltonian is fixed. A Gibbs state is a density operator with the same eigenvectors as the Hamiltonian, and its eigenvalues are
$$\eta_j = \frac{e^{-E_j / k_B T}}{Z},$$
where $T$ is the temperature, $k_B$ is the Boltzmann constant, $E_j$ are the energy eigenvalues, and $Z$ is the partition function.[9] The von Neumann entropy of a Gibbs state is, up to a factor $k_B$, the thermodynamic entropy.
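A two-level sketch (the energies and temperature are arbitrary choices, and we set $k_B = 1$ for convenience): the Gibbs eigenvalues $e^{-E_j/T}/Z$ give an entropy satisfying the standard thermodynamic identity $S = \langle E \rangle / T + \ln Z$, which follows by substituting $\ln \eta_j = -E_j/T - \ln Z$ into $S = -\sum_j \eta_j \ln \eta_j$.

```python
import numpy as np

# Two-level system; units chosen so that k_B = 1 (an assumption of this sketch).
energies = np.array([0.0, 1.0])
T = 0.5

boltzmann_weights = np.exp(-energies / T)
Z = boltzmann_weights.sum()                  # partition function
p = boltzmann_weights / Z                    # Gibbs-state eigenvalues

S_gibbs = float(-np.sum(p * np.log(p)))      # von Neumann entropy of the Gibbs state
mean_energy = float(np.sum(p * energies))
identity_rhs = mean_energy / T + np.log(Z)   # thermodynamic identity S = <E>/T + ln Z
```

Since the Gibbs state is diagonal in the energy eigenbasis, working with the eigenvalue array is equivalent to working with the full density matrix.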
Generalizations and derived quantities
Conditional entropy
Let $\rho_{AB}$ be a joint state for the bipartite quantum system AB. Then the conditional von Neumann entropy $S(A|B)$ is the difference between the entropy of $\rho_{AB}$ and the entropy of the marginal state $\rho_B$ for subsystem B alone:
$$S(A|B) = S(\rho_{AB}) - S(\rho_B).$$
This is bounded above by $S(\rho_A)$. In other words, conditioning the description of subsystem A upon subsystem B cannot increase the entropy associated with A.
Quantum mutual information can be defined as the difference between the entropy of the joint state and the total entropy of the marginals:
$$I(A:B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}),$$
which can also be expressed in terms of conditional entropy:
$$I(A:B) = S(\rho_A) - S(A|B) = S(\rho_B) - S(B|A).$$
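The Bell state provides the standard illustration (helper names ours): its conditional entropy is negative, something impossible for the Shannon conditional entropy, and its mutual information is $2\ln 2$.

```python
import numpy as np

def entropy(rho):
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return float(-np.sum(vals * np.log(vals)))

def marginals(rho_ab, da, db):
    r = rho_ab.reshape(da, db, da, db)
    return np.trace(r, axis1=1, axis2=3), np.trace(r, axis1=0, axis2=2)

psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
rho_ab = np.outer(psi, psi.conj())
rho_a, rho_b = marginals(rho_ab, 2, 2)

conditional = entropy(rho_ab) - entropy(rho_b)                   # S(A|B) = -ln 2
mutual_info = entropy(rho_a) + entropy(rho_b) - entropy(rho_ab)  # I(A:B) = 2 ln 2
```

The negative conditional entropy reflects the fact that the joint state is pure while each marginal is maximally mixed.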
Relative entropy
Let $\rho$ and $\sigma$ be two density operators in the same state space. The relative entropy is defined to be
$$S(\rho \,\|\, \sigma) = \operatorname{Tr}[\rho(\ln \rho - \ln \sigma)].$$
The relative entropy is always greater than or equal to zero; it equals zero if and only if $\rho = \sigma$. Unlike the von Neumann entropy itself, the relative entropy is monotonic, in that it decreases (or remains constant) when part of a system is traced over:
$$S(\rho_A \,\|\, \sigma_A) \leq S(\rho_{AB} \,\|\, \sigma_{AB}).$$
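A sketch for full-rank states (the Hermitian-logarithm helper is our own; it assumes strictly positive matrices, so the support condition $\operatorname{supp}\rho \subseteq \operatorname{supp}\sigma$ holds automatically):

```python
import numpy as np

def herm_log(m):
    """Matrix logarithm of a positive-definite Hermitian matrix via its spectrum."""
    vals, vecs = np.linalg.eigh(m)
    return vecs @ np.diag(np.log(vals)) @ vecs.conj().T

def relative_entropy(rho, sigma):
    return float(np.real(np.trace(rho @ (herm_log(rho) - herm_log(sigma)))))

rho = np.diag([0.7, 0.3]).astype(complex)
sigma = np.diag([0.5, 0.5]).astype(complex)

positive_gap = relative_entropy(rho, sigma)  # > 0, since rho != sigma
self_distance = relative_entropy(rho, rho)   # exactly 0

# Monotonicity under partial trace: a classically correlated joint state.
rho_ab = np.diag([0.4, 0.1, 0.1, 0.4]).astype(complex)
sigma_ab = np.eye(4, dtype=complex) / 4

def trace_out_b(m):
    return np.trace(m.reshape(2, 2, 2, 2), axis1=1, axis2=3)

local = relative_entropy(trace_out_b(rho_ab), trace_out_b(sigma_ab))
joint = relative_entropy(rho_ab, sigma_ab)   # local <= joint
```

Here the marginals of both joint states coincide with the maximally mixed qubit, so the local relative entropy vanishes while the joint one does not.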
Entanglement measures
Just as energy is a resource that facilitates mechanical operations, entanglement is a resource that facilitates performing tasks that involve communication and computation. The mathematical definition of entanglement can be paraphrased as saying that maximal knowledge about the whole of a system does not imply maximal knowledge about the individual parts of that system. If the quantum state that describes a pair of particles is entangled, then the results of measurements upon one half of the pair can be strongly correlated with the results of measurements upon the other. However, entanglement is not the same as "correlation" as understood in classical probability theory and in daily life. Instead, entanglement can be thought of as potential correlation that can be used to generate actual correlation in an appropriate experiment.[10] The state of a composite system is always expressible as a sum, or superposition, of products of states of local constituents; it is entangled if this sum cannot be written as a single product term. Entropy provides one tool that can be used to quantify entanglement.[11][12] If the overall system is described by a pure state, the entropy of one subsystem can be used to measure its degree of entanglement with the other subsystems. For bipartite pure states, the von Neumann entropy of reduced states is the unique measure of entanglement in the sense that it is the only function on the family of states that satisfies certain axioms required of an entanglement measure.[13] It is thus known as the entanglement entropy.
It is a classical result that the Shannon entropy achieves its maximum at, and only at, the uniform probability distribution $\{1/n, \dots, 1/n\}$. Therefore, a bipartite pure state $\rho_{AB}$ is said to be a maximally entangled state if the reduced state of each subsystem of $\rho$ is the diagonal matrix[14]
$$\begin{pmatrix} 1/n & & \\ & \ddots & \\ & & 1/n \end{pmatrix}.$$
For mixed states, the reduced von Neumann entropy is not the only reasonable entanglement measure. Some of the other measures are also entropic in character. For example, the relative entropy of entanglement is given by minimizing the relative entropy between a given state $\rho$ and the set of nonentangled, or separable, states. The entanglement of formation is defined by minimizing, over all possible ways of writing $\rho$ as a convex combination of pure states, the average entanglement entropy of those pure states. The squashed entanglement is based on the idea of extending a bipartite state $\rho_{AB}$ to a state $\rho_{ABE}$ describing a larger system, such that the partial trace of $\rho_{ABE}$ over E yields $\rho_{AB}$. One then finds the infimum of the quantity $\tfrac{1}{2}I(A:B|E)$ over all possible choices of $\rho_{ABE}$.
Quantum Rényi entropies
Just as the Shannon entropy function is one member of the broader family of classical Rényi entropies, so too can the von Neumann entropy be generalized to the quantum Rényi entropies:
$$S_\alpha(\rho) = \frac{1}{1-\alpha} \ln \operatorname{Tr}(\rho^\alpha), \quad \alpha \neq 1.$$
In the limit that $\alpha \to 1$, this recovers the von Neumann entropy. The quantum Rényi entropies are all additive for product states, and for any $\alpha > 0$, the Rényi entropy vanishes for pure states and is maximized by the maximally mixed state. For any given state $\rho$, $S_\alpha(\rho)$ is a continuous, nonincreasing function of the parameter $\alpha$. A weak version of subadditivity can be proven:
$$S_\alpha(\rho_A) - S_0(\rho_B) \leq S_\alpha(\rho_{AB}) \leq S_\alpha(\rho_A) + S_0(\rho_B).$$
Here, $S_0$ is the quantum version of the Hartley entropy, i.e., the logarithm of the rank of the density matrix.
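A sketch of the Rényi family (function names ours), checking the $\alpha \to 1$ limit numerically and the nonincreasing behaviour in $\alpha$:

```python
import numpy as np

def renyi_entropy(rho, alpha):
    """Quantum Rényi entropy S_alpha = ln(Tr rho^alpha) / (1 - alpha), alpha != 1."""
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return float(np.log(np.sum(vals ** alpha)) / (1 - alpha))

def von_neumann(rho):
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return float(-np.sum(vals * np.log(vals)))

rho = np.diag([0.6, 0.3, 0.1]).astype(complex)

near_limit = renyi_entropy(rho, 1.000001)  # approaches the von Neumann entropy
s_half = renyi_entropy(rho, 0.5)
s_two = renyi_entropy(rho, 2.0)            # nonincreasing in alpha: s_half >= s_two
```

The $\alpha = 2$ case, sometimes called the collision entropy, is particularly convenient numerically because $\operatorname{Tr}\rho^2$ is the purity of the state.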
History
The density matrix was introduced, with different motivations, by von Neumann and by Lev Landau. The motivation that inspired Landau was the impossibility of describing a subsystem of a composite quantum system by a state vector.[15] On the other hand, von Neumann introduced the density matrix in order to develop both quantum statistical mechanics and a theory of quantum measurements.[16] He introduced the expression now known as von Neumann entropy by arguing that a probabilistic combination of pure states is analogous to a mixture of ideal gases.[17] Von Neumann first published on the topic in 1927.[18] His argument was built upon earlier work by Albert Einstein and Leo Szilard.[19][20][21]
Max Delbrück and Gert Molière proved the concavity and subadditivity properties of the von Neumann entropy in 1936. Quantum relative entropy was introduced by Hisaharu Umegaki in 1962.[22] The subadditivity and triangle inequalities were proved in 1970 by Huzihiro Araki and Elliott H. Lieb.[23] Strong subadditivity is a more difficult theorem. It was conjectured by Oscar Lanford and Derek Robinson in 1968.[24] Lieb and Mary Beth Ruskai proved the theorem in 1973,[25][26] using a matrix inequality proved earlier by Lieb.[27]
References
- Template:Cite book
- Template:Cite book
- Template:Cite book
- Template:Cite book
- Template:Cite book
- Template:Cite book
- Template:Cite book
- Template:Cite book
- Template:Cite book
- ↑ Template:Cite journal
- ↑ Template:Cite book
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Template:Cite journal Translated by K. A. Kirkpatrick as Template:Cite journal
- ↑ Template:Citation
- ↑ Template:Cite journal
- ↑ Template:Cite book
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Template:Cite book; Template:Cite book
- ↑ Template:Cite journal
- ↑ Template:Cite book
- ↑ Template:Cite journal Translated in Template:Cite book
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Template:Cite web
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Template:Cite journal
- ↑ Template:Cite web Invited talk at the Conference in Honour of the 90th Birthday of Freeman Dyson, Institute of Advanced Studies, Nanyang Technological University, Singapore, 26–29 August 2013.
- ↑ Template:Cite journal