In quantum information theory, strong subadditivity of quantum entropy (SSA) is the relation among the von Neumann entropies of various quantum subsystems of a larger quantum system consisting of three subsystems (or of one quantum system with three degrees of freedom). It is a basic theorem in modern quantum information theory. It was conjectured by D. W. Robinson and D. Ruelle[1] in 1966 and O. E. Lanford III and D. W. Robinson[2] in 1968 and proved in 1973 by E.H. Lieb and M.B. Ruskai,[3] building on results obtained by Lieb in his proof of the Wigner-Yanase-Dyson conjecture.[4]
The classical version of SSA was long known and appreciated in classical probability theory and information theory. The proof of this relation in the classical case is quite easy, but the quantum case is difficult because of the non-commutativity of the reduced density matrices describing the quantum subsystems.
We use the following notation throughout: A Hilbert space is denoted by $\mathcal{H}$, and $\mathcal{B}(\mathcal{H})$ denotes the bounded linear operators on $\mathcal{H}$. Tensor products are denoted by superscripts, e.g., $\mathcal{H}^{12} = \mathcal{H}^{1} \otimes \mathcal{H}^{2}$. The trace is denoted by $\operatorname{Tr}$.
A density matrix $\rho$ is a Hermitian, positive semi-definite matrix of trace one. It allows for the description of a quantum system in a mixed state. Density matrices on a tensor product are denoted by superscripts, e.g., $\rho^{12}$ is a density matrix on $\mathcal{H}^{12}$.
The von Neumann quantum entropy of a density matrix $\rho$ is
$$S(\rho) = -\operatorname{Tr}(\rho \log \rho).$$
Umegaki's[8] quantum relative entropy of two density matrices $\rho$ and $\sigma$ is
$$S(\rho \,\|\, \sigma) = \operatorname{Tr}\left[\rho\, (\log \rho - \log \sigma)\right].$$
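Both quantities are easy to evaluate numerically via eigendecomposition. The following is a minimal sketch, not part of the article; the helper names `entropy`, `logm_psd`, and `relative_entropy` are illustrative, and natural logarithms are used:

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log rho), in nats."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]            # convention: 0 log 0 = 0
    return float(-np.sum(evals * np.log(evals)))

def logm_psd(m):
    """Matrix logarithm of a positive definite Hermitian matrix."""
    evals, vecs = np.linalg.eigh(m)
    return (vecs * np.log(evals)) @ vecs.conj().T

def relative_entropy(rho, sigma):
    """Umegaki relative entropy S(rho || sigma) = Tr[rho (log rho - log sigma)]."""
    return float(np.trace(rho @ (logm_psd(rho) - logm_psd(sigma))).real)

rho = np.eye(2) / 2                         # maximally mixed qubit
print(entropy(rho))                         # ln 2, approximately 0.6931
print(relative_entropy(rho, rho))           # 0.0 (up to rounding)
```

For full-rank states this eigenvalue-based approach is numerically stable; rank-deficient $\sigma$ would make the relative entropy infinite, which the sketch does not handle.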
A function $f$ of two matrix variables is said to be jointly concave if for any $0 \le \lambda \le 1$ the following holds:
$$f\big(\lambda A_1 + (1-\lambda) A_2,\; \lambda B_1 + (1-\lambda) B_2\big) \;\ge\; \lambda f(A_1, B_1) + (1-\lambda) f(A_2, B_2).$$
Ordinary subadditivity [9] concerns only two spaces $\mathcal{H}^{12}$ and a density matrix $\rho^{12}$. It states that
$$S(\rho^{12}) \;\le\; S(\rho^{1}) + S(\rho^{2}).$$
This inequality is true, of course, in classical probability theory, but the latter also contains the theorem that the conditional entropies $S(\rho^{12}) - S(\rho^{1})$ and $S(\rho^{12}) - S(\rho^{2})$ are both non-negative. In the quantum case, however, both can be negative, e.g. $S(\rho^{12})$ can be zero while $S(\rho^{1}) = S(\rho^{2}) > 0$. Nevertheless, the subadditivity upper bound on $S(\rho^{12})$ continues to hold. The closest thing one has to $S(\rho^{12}) \ge \max\{S(\rho^{1}), S(\rho^{2})\}$ is the Araki–Lieb triangle inequality [9]
$$S(\rho^{12}) \;\ge\; \left|S(\rho^{1}) - S(\rho^{2})\right|,$$
which is derived in [9] from subadditivity by a mathematical technique known as purification.
Suppose that the Hilbert space of the system is a tensor product of three spaces: $\mathcal{H}^{123} = \mathcal{H}^{1} \otimes \mathcal{H}^{2} \otimes \mathcal{H}^{3}$. Physically, these three spaces can be interpreted as the space of three different systems, or else as three parts or three degrees of freedom of one physical system.
Given a density matrix $\rho^{123}$ on $\mathcal{H}^{123}$, we define a density matrix $\rho^{12}$ on $\mathcal{H}^{12}$ as a partial trace: $\rho^{12} = \operatorname{Tr}_{\mathcal{H}^{3}} \rho^{123}$. Similarly, we can define the density matrices $\rho^{23}$, $\rho^{13}$, $\rho^{1}$, $\rho^{2}$, $\rho^{3}$.
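For finite-dimensional systems the partial trace can be computed by reshaping the density matrix into a four-index tensor and summing over the traced indices. A minimal bipartite sketch (the helper name `partial_trace` is illustrative, not from the article):

```python
import numpy as np

def partial_trace(rho, d1, d2, keep):
    """Partial trace of a state on H1 (x) H2 with dim(H1) = d1, dim(H2) = d2.
    keep=1 returns rho^1 = Tr_{H2} rho; keep=2 returns rho^2 = Tr_{H1} rho."""
    t = rho.reshape(d1, d2, d1, d2)          # indices (i1, i2, j1, j2)
    if keep == 1:
        return np.einsum('ikjk->ij', t)      # sum over the H2 index pair
    return np.einsum('kikj->ij', t)          # sum over the H1 index pair

# Sanity check on a product state: Tr_{H2}(a (x) b) = a, since Tr b = 1.
a = np.diag([0.7, 0.3])
b = np.diag([0.6, 0.4])
rho12 = np.kron(a, b)
print(np.allclose(partial_trace(rho12, 2, 2, keep=1), a))   # True
print(np.allclose(partial_trace(rho12, 2, 2, keep=2), b))   # True
```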
For any tri-partite state $\rho^{123}$ the following holds:
$$S(\rho^{123}) + S(\rho^{2}) \;\le\; S(\rho^{12}) + S(\rho^{23}),$$
where $S(\rho^{12}) = -\operatorname{Tr}\left(\rho^{12} \log \rho^{12}\right)$, for example.
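The inequality can be checked numerically on random states. A self-contained sketch for three qubits (helper names are illustrative, not from the article):

```python
import numpy as np

def random_density_matrix(d, rng):
    """Full-rank random density matrix rho = G G* / Tr(G G*), G complex Ginibre."""
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

def ptrace(rho, dims, keep):
    """Reduced state on the subsystems listed in `keep` (0-indexed)."""
    n = len(dims)
    t = rho.reshape(tuple(dims) * 2)
    removed = 0
    for ax in sorted(set(range(n)) - set(keep), reverse=True):
        t = np.trace(t, axis1=ax, axis2=ax + n - removed)
        removed += 1
    d = int(np.prod([dims[k] for k in keep]))
    return t.reshape(d, d)

rng = np.random.default_rng(0)
dims = [2, 2, 2]                       # three qubits
rho123 = random_density_matrix(8, rng)
S = lambda keep: entropy(ptrace(rho123, dims, keep))
lhs = S([0, 1, 2]) + S([1])            # S(rho^123) + S(rho^2)
rhs = S([0, 1]) + S([1, 2])            # S(rho^12) + S(rho^23)
print(lhs <= rhs + 1e-10)              # True: SSA holds
```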
Equivalently, the statement can be recast in terms of conditional entropies to show that for a tripartite state $\rho^{ABC}$,
$$S(A|BC) \;\le\; S(A|B).$$
This can also be restated in terms of quantum mutual information,
$$I(A;BC) \;\ge\; I(A;B).$$
These statements run parallel to classical intuition, except that quantum conditional entropies can be negative, and quantum mutual information can exceed the classical bound given by the marginal entropy.
The strong subadditivity inequality was improved in the following way by Carlen and Lieb [10]
$$S(\rho^{12}) + S(\rho^{23}) - S(\rho^{123}) - S(\rho^{2}) \;\ge\; 2\max\left\{0,\; S(\rho^{1}) - S(\rho^{13}),\; S(\rho^{3}) - S(\rho^{13})\right\},$$
with the optimal constant 2.
J. Kiefer[11][12] proved a peripherally related convexity result in 1959, which is a corollary of an operator Schwarz inequality proved by E. H. Lieb and M. B. Ruskai.[3] However, these results are comparatively simple, and the proofs do not use the results of Lieb's 1973 paper on convex and concave trace functionals.[4] It was this paper that provided the mathematical basis of the proof of SSA by Lieb and Ruskai. The extension from a Hilbert space setting to a von Neumann algebra setting, where states are not given by density matrices, was done by Narnhofer and Thirring.[13]
The theorem can also be obtained by proving numerous equivalent statements, some of which are summarized below.
E. P. Wigner and M. M. Yanase [14] proposed a different definition of entropy, which was generalized by Freeman Dyson.
The Wigner–Yanase–Dyson $p$-skew information of a density matrix $\rho$ with respect to an operator $K$ is
$$I_p(\rho, K) \;=\; \frac{1}{2}\operatorname{Tr}\!\left([\rho^{p}, K]\,[\rho^{1-p}, K^{*}]\right),$$
where $[A, B] = AB - BA$ is a commutator, $K^{*}$ is the adjoint of $K$ and $0 \le p \le 1$ is fixed.
It was conjectured by E. P. Wigner and M. M. Yanase in [15] that the $p$-skew information is concave as a function of the density matrix $\rho$ for a fixed $K$.
Since the term $-\frac{1}{2}\operatorname{Tr}(\rho K K^{*} + \rho K^{*} K)$ is concave (it is linear), the conjecture reduces to the problem of concavity of $\rho \mapsto \operatorname{Tr}\left(\rho^{p} K^{*} \rho^{1-p} K\right)$. As noted in [4], this conjecture (for all $0 \le p \le 1$) implies SSA; it was proved for $p = \tfrac{1}{2}$ in [15], and for all $0 \le p \le 1$ in [4], in the following more general form: The function of two matrix variables
$$F(A, B) \;=\; \operatorname{Tr}\!\left(A^{p}\, K^{*}\, B^{q}\, K\right) \qquad (1)$$
is jointly concave in $A$ and $B$ when $p, q \ge 0$ and $p + q \le 1$.
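Joint concavity can be spot-checked numerically at a midpoint, for instance with $p = q = \tfrac{1}{2}$. A sketch assuming the form of $F$ given in equation (1) (helper names are illustrative):

```python
import numpy as np
from scipy.linalg import fractional_matrix_power

def lieb_f(a, b, k, p, q):
    """F(A, B) = Tr(A^p K* B^q K) for positive definite Hermitian A, B."""
    ap = fractional_matrix_power(a, p)
    bq = fractional_matrix_power(b, q)
    return np.trace(ap @ k.conj().T @ bq @ k).real

def random_pd(d, rng):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    return g @ g.conj().T + 0.1 * np.eye(d)      # Hermitian, positive definite

rng = np.random.default_rng(1)
d, p, q = 4, 0.5, 0.5
k = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
a1, a2, b1, b2 = (random_pd(d, rng) for _ in range(4))

mid = lieb_f((a1 + a2) / 2, (b1 + b2) / 2, k, p, q)
avg = (lieb_f(a1, b1, k, p, q) + lieb_f(a2, b2, k, p, q)) / 2
print(mid >= avg - 1e-10)                        # True: midpoint joint concavity
```

A midpoint check of course does not prove the theorem; it only illustrates it on random instances.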
This theorem is an essential part of the proof of SSA in [3].
In their paper,[15] E. P. Wigner and M. M. Yanase also conjectured the subadditivity of $p$-skew information for $p = \tfrac{1}{2}$, which was disproved by Hansen[16] by giving a counterexample.
It was pointed out in [9] that the first statement below is equivalent to SSA, and A. Uhlmann in [17] showed the equivalence between the second statement below and SSA.
- The conditional entropy $\rho^{12} \mapsto S(\rho^{12}) - S(\rho^{1})$ is concave.
- The relative entropy is monotone under partial trace: $S(\rho^{1} \,\|\, \sigma^{1}) \le S(\rho^{12} \,\|\, \sigma^{12})$.
Both of these statements were proved directly in [3].
As noted by Lindblad[18] and Uhlmann,[19] if, in equation (1), one takes $K = \mathbf{1}$ and $q = 1 - p$ and differentiates in $p$ at $p = 0$, one obtains the joint convexity of relative entropy: i.e., if $\rho = \sum_k \lambda_k \rho_k$ and $\sigma = \sum_k \lambda_k \sigma_k$, then
$$S(\rho \,\|\, \sigma) \;\le\; \sum_k \lambda_k\, S(\rho_k \,\|\, \sigma_k), \qquad (2)$$
where $\lambda_k \ge 0$ with $\sum_k \lambda_k = 1$.
The relative entropy decreases monotonically under completely positive trace preserving (CPTP) operations $\mathcal{N}$ on density matrices,
$$S(\mathcal{N}(\rho) \,\|\, \mathcal{N}(\sigma)) \;\le\; S(\rho \,\|\, \sigma).$$
This inequality is called Monotonicity of quantum relative entropy. Owing to the Stinespring factorization theorem, this inequality is a consequence of a particular choice of the CPTP map, namely the partial trace map described below.
The most important and basic class of CPTP maps is the partial trace operation $T: \mathcal{B}(\mathcal{H}^{12}) \to \mathcal{B}(\mathcal{H}^{1})$, given by $T(\rho^{12}) = \operatorname{Tr}_{\mathcal{H}^{2}}\, \rho^{12}$. Then
$$S(\rho^{1} \,\|\, \sigma^{1}) \;\le\; S(\rho^{12} \,\|\, \sigma^{12}), \qquad (3)$$
which is called Monotonicity of quantum relative entropy under partial trace.
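Monotonicity under the partial trace can be checked numerically on random full-rank states. A self-contained sketch (helper names are illustrative, not from the article):

```python
import numpy as np

def logm_h(m):
    """Matrix logarithm of a positive definite Hermitian matrix."""
    evals, vecs = np.linalg.eigh(m)
    return (vecs * np.log(evals)) @ vecs.conj().T

def rel_ent(rho, sigma):
    return float(np.trace(rho @ (logm_h(rho) - logm_h(sigma))).real)

def rand_dm(d, rng):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def tr2(rho, d1, d2):
    """Partial trace over the second tensor factor."""
    return np.einsum('ikjk->ij', rho.reshape(d1, d2, d1, d2))

rng = np.random.default_rng(2)
rho12, sigma12 = rand_dm(4, rng), rand_dm(4, rng)
full = rel_ent(rho12, sigma12)
reduced = rel_ent(tr2(rho12, 2, 2), tr2(sigma12, 2, 2))
print(reduced <= full + 1e-10)   # True: S(rho^1 || sigma^1) <= S(rho^12 || sigma^12)
```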
To see how this follows from the joint convexity of relative entropy, observe that $\rho^{1} \otimes \frac{\mathbf{1}}{\dim \mathcal{H}^{2}}$ can be written in Uhlmann's representation as
$$\rho^{1} \otimes \frac{\mathbf{1}}{\dim \mathcal{H}^{2}} \;=\; \frac{1}{N} \sum_{j=1}^{N} U_j\, \rho^{12}\, U_j^{*}$$
for some finite $N$ and some collection of unitaries $U_j$ acting only on $\mathcal{H}^{2}$ (alternatively, integrate over Haar measure). Since the trace (and hence the relative entropy) is unitarily invariant, and tensoring both arguments with the same maximally mixed state leaves the relative entropy unchanged, inequality (3) now follows from (2). This theorem is due to Lindblad [18] and Uhlmann,[17] whose proof is the one given here.
SSA is obtained from (3) with $\mathcal{H}^{1}$ replaced by $\mathcal{H}^{12}$ and $\mathcal{H}^{2}$ replaced by $\mathcal{H}^{3}$. Take $\rho = \rho^{123}$ and $\sigma = \rho^{1} \otimes \rho^{23}$. Then (3) becomes
$$S(\rho^{12} \,\|\, \rho^{1} \otimes \rho^{2}) \;\le\; S(\rho^{123} \,\|\, \rho^{1} \otimes \rho^{23}).$$
Therefore,
$$S(\rho^{1}) + S(\rho^{2}) - S(\rho^{12}) \;\le\; S(\rho^{1}) + S(\rho^{23}) - S(\rho^{123}),$$
which is SSA. Thus, the monotonicity of quantum relative entropy (which follows from (1)) implies SSA.
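This specialization can itself be verified numerically: for a random three-qubit state, the relative entropy to $\rho^{1} \otimes \rho^{23}$ can only decrease when system 3 is traced out. A sketch (helper names are illustrative):

```python
import numpy as np

def logm_h(m):
    evals, vecs = np.linalg.eigh(m)
    return (vecs * np.log(evals)) @ vecs.conj().T

def rel_ent(rho, sigma):
    return float(np.trace(rho @ (logm_h(rho) - logm_h(sigma))).real)

def rand_dm(d, rng):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def ptrace(rho, dims, keep):
    n = len(dims)
    t = rho.reshape(tuple(dims) * 2)
    removed = 0
    for ax in sorted(set(range(n)) - set(keep), reverse=True):
        t = np.trace(t, axis1=ax, axis2=ax + n - removed)
        removed += 1
    d = int(np.prod([dims[k] for k in keep]))
    return t.reshape(d, d)

rng = np.random.default_rng(3)
dims = [2, 2, 2]
rho123 = rand_dm(8, rng)
rho1 = ptrace(rho123, dims, [0])
rho2 = ptrace(rho123, dims, [1])
rho12 = ptrace(rho123, dims, [0, 1])
rho23 = ptrace(rho123, dims, [1, 2])

lhs = rel_ent(rho12, np.kron(rho1, rho2))    # equals S1 + S2 - S12
rhs = rel_ent(rho123, np.kron(rho1, rho23))  # equals S1 + S23 - S123
print(lhs <= rhs + 1e-10)                    # True, equivalent to SSA
```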
All of the above important inequalities are equivalent to each other, and can also be proved directly. The following are equivalent:
- Monotonicity of quantum relative entropy (MONO);
- Monotonicity of quantum relative entropy under partial trace (MPT);
- Strong subadditivity (SSA);
- Joint convexity of quantum relative entropy (JC).
The following implications show the equivalence between these inequalities.
the map $\rho^{12} \mapsto S(\rho^{1}) - S(\rho^{12})$ is convex. In [3] it was observed that this convexity yields MPT;
Moreover, if $\rho^{12}$ is pure, then $S(\rho^{12}) = 0$ and $S(\rho^{1}) = S(\rho^{2})$, so the equality holds in the above inequality. Since the extreme points of the convex set of density matrices are pure states, SSA follows from JC;
In [23][24] D. Petz showed that the only case of equality in the monotonicity relation is to have a proper "recovery" channel:
For all states $\rho$ and $\sigma$ on a Hilbert space $\mathcal{H}$ and all quantum operations $T: \mathcal{B}(\mathcal{H}) \to \mathcal{B}(\mathcal{K})$,
$$S(T\rho \,\|\, T\sigma) \;=\; S(\rho \,\|\, \sigma)$$
if and only if there exists a quantum operation $\hat{T}$ such that
$$\hat{T} T \sigma = \sigma \quad \text{and} \quad \hat{T} T \rho = \rho.$$
Moreover, $\hat{T}$ can be given explicitly by the formula
$$\hat{T}(\omega) \;=\; \sigma^{\frac{1}{2}}\, T^{*}\!\left( (T\sigma)^{-\frac{1}{2}}\, \omega\, (T\sigma)^{-\frac{1}{2}} \right) \sigma^{\frac{1}{2}},$$
where $T^{*}$ is the adjoint map of $T$.
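For the partial trace channel $T = \operatorname{Tr}_{\mathcal{H}^{2}}$ the adjoint is $T^{*}(X) = X \otimes \mathbf{1}$, and the formula above can be implemented directly; by construction $\hat{T}(T\sigma) = \sigma$ exactly. A sketch (helper names are illustrative, not from the article):

```python
import numpy as np
from scipy.linalg import fractional_matrix_power as mpow

def rand_dm(d, rng):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def tr2(rho, d1, d2):
    return np.einsum('ikjk->ij', rho.reshape(d1, d2, d1, d2))

def petz_recovery(sigma, d1, d2):
    """Petz map for T = Tr_{H2}: omega -> sigma^{1/2} T*((T sigma)^{-1/2} omega
    (T sigma)^{-1/2}) sigma^{1/2}, with adjoint T*(X) = X (x) 1."""
    s_half = mpow(sigma, 0.5)
    c = mpow(tr2(sigma, d1, d2), -0.5)
    def recover(omega):
        return s_half @ np.kron(c @ omega @ c, np.eye(d2)) @ s_half
    return recover

rng = np.random.default_rng(4)
sigma = rand_dm(4, rng)
recover = petz_recovery(sigma, 2, 2)
print(np.allclose(recover(tr2(sigma, 2, 2)), sigma))   # True: sigma is recovered
```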
D. Petz also gave another condition [23] for when the equality holds in Monotonicity of quantum relative entropy: the first statement below. Differentiating it at $t = 0$ we have the second condition. Moreover, M. B. Ruskai gave another proof of the second statement.
For all states $\rho$ and $\sigma$ on $\mathcal{H}$ and all quantum operations $T$,
$$S(T\rho \,\|\, T\sigma) \;=\; S(\rho \,\|\, \sigma)$$
if and only if the following equivalent conditions are satisfied:
- $T^{*}\!\left( (T\rho)^{it}\, (T\sigma)^{-it} \right) = \rho^{it}\, \sigma^{-it}$ for all real $t$;
- $\log \rho - \log \sigma = T^{*}\!\left( \log T\rho - \log T\sigma \right)$,
where $T^{*}$ is the adjoint map of $T$.
P. Hayden, R. Jozsa, D. Petz and A. Winter described the states for which the equality holds in SSA.[25]
A state $\rho^{ABC}$ on a Hilbert space $\mathcal{H}^{A} \otimes \mathcal{H}^{B} \otimes \mathcal{H}^{C}$ satisfies strong subadditivity with equality if and only if there is a decomposition of the second system as
$$\mathcal{H}^{B} \;=\; \bigoplus_j \mathcal{H}^{b_j^{L}} \otimes \mathcal{H}^{b_j^{R}}$$
into a direct sum of tensor products, such that
$$\rho^{ABC} \;=\; \bigoplus_j q_j\, \rho^{A b_j^{L}} \otimes \rho^{b_j^{R} C},$$
with states $\rho^{A b_j^{L}}$ on $\mathcal{H}^{A} \otimes \mathcal{H}^{b_j^{L}}$ and $\rho^{b_j^{R} C}$ on $\mathcal{H}^{b_j^{R}} \otimes \mathcal{H}^{C}$, and a probability distribution $\{q_j\}$.
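States of this form can be generated and tested numerically. The sketch below (helper names are illustrative) takes a single block $\mathcal{H}^{B} = \mathcal{H}^{b^{L}} \otimes \mathcal{H}^{b^{R}}$ and checks that SSA is saturated:

```python
import numpy as np

def entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

def rand_dm(d, rng):
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m).real

def ptrace(rho, dims, keep):
    n = len(dims)
    t = rho.reshape(tuple(dims) * 2)
    removed = 0
    for ax in sorted(set(range(n)) - set(keep), reverse=True):
        t = np.trace(t, axis1=ax, axis2=ax + n - removed)
        removed += 1
    d = int(np.prod([dims[k] for k in keep]))
    return t.reshape(d, d)

rng = np.random.default_rng(5)
rho_AbL = rand_dm(4, rng)           # correlated state on H_A (x) H_{b^L}
rho_bRC = rand_dm(4, rng)           # correlated state on H_{b^R} (x) H_C
rho = np.kron(rho_AbL, rho_bRC)     # tensor ordering: A, b^L, b^R, C
dims = [2, 4, 2]                    # A, B = b^L (x) b^R, C
S = lambda keep: entropy(ptrace(rho, dims, keep))
gap = S([0, 1]) + S([1, 2]) - S([0, 1, 2]) - S([1])
print(abs(gap) < 1e-8)              # True: SSA holds with equality
```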
E. H. Lieb and E. A. Carlen have found an explicit error term in the SSA inequality,[10] namely,
$$S(\rho^{12}) + S(\rho^{23}) - S(\rho^{123}) - S(\rho^{2}) \;\ge\; 2\max\left\{0,\; S(\rho^{1}) - S(\rho^{13}),\; S(\rho^{3}) - S(\rho^{13})\right\}.$$
If $S(\rho^{1}) \le S(\rho^{13})$ and $S(\rho^{3}) \le S(\rho^{13})$, as is always the case for the classical Shannon entropy, this inequality has nothing to say. For the quantum entropy, on the other hand, it is quite possible that $S(\rho^{13}) < S(\rho^{1})$ or $S(\rho^{13}) < S(\rho^{3})$ (but never both!). Then, in this "highly quantum" regime, this inequality provides additional information.
The constant 2 is optimal, in the sense that for any constant larger than 2, one can find a state for which the inequality is violated with that constant.
In his paper,[26] I. Kim studied an operator extension of strong subadditivity, proving the following inequality:
For a tri-partite state (density matrix) $\rho^{123}$ on $\mathcal{H}^{1} \otimes \mathcal{H}^{2} \otimes \mathcal{H}^{3}$,
$$\operatorname{Tr}_{12}\!\left[\rho^{123}\left(\log \rho^{123} + \log \rho^{2} - \log \rho^{12} - \log \rho^{23}\right)\right] \;\ge\; 0,$$
an operator inequality on $\mathcal{H}^{3}$ whose full trace recovers SSA.
The proof of this inequality is based on Effros's theorem,[27] for which particular functions and operators are chosen to derive the inequality above. M. B. Ruskai describes this work in detail in [28] and discusses how to prove a large class of new matrix inequalities in the tri-partite and bi-partite cases by taking a partial trace over all but one of the spaces.
A significant strengthening of strong subadditivity was proved in 2014,[29] and was subsequently improved in [30] and [31]. In 2017,[32] it was shown that the recovery channel can be taken to be the original Petz recovery map. These improvements of strong subadditivity have physical interpretations in terms of recoverability, meaning that if the conditional mutual information $I(A;B|E)$ of a tripartite quantum state $\rho^{ABE}$ is nearly equal to zero, then it is possible to perform a recovery channel $\mathcal{R}_{E \to AE}$ (from system $E$ to $AE$) such that $\mathcal{R}_{E \to AE}(\rho^{BE}) \approx \rho^{ABE}$. These results thus generalize the exact equality conditions mentioned above.