In quantum information theory, quantum relative entropy is a measure of the distinguishability of two quantum states. It is the quantum mechanical analog of relative entropy.
For simplicity, it will be assumed that all objects in the article are finite-dimensional.
Suppose the probabilities of a finite sequence of events are given by the probability distribution P = {p_1, ..., p_n}, but somehow we mistakenly assumed them to be Q = {q_1, ..., q_n}.
For instance, we can mistake an unfair coin for a fair one.
According to this erroneous assumption, our uncertainty about the j-th event, or equivalently, the amount of information provided after observing the j-th event, is

−log q_j.

The (assumed) average uncertainty of all possible events is then

−Σ_j p_j log q_j.

On the other hand, the Shannon entropy of the probability distribution p, defined by

−Σ_j p_j log p_j,

is the real amount of uncertainty before observation.
Therefore the difference between these two quantities is a measure of the distinguishability of the two probability distributions p and q.
This is precisely the classical relative entropy, or Kullback–Leibler divergence:

D_KL(P‖Q) = Σ_j p_j log(p_j / q_j).

As with many other objects in quantum information theory, quantum relative entropy is defined by extending the classical definition from probability distributions to density matrices.
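As a concrete illustration of the coin example above, the following sketch (using NumPy, with an arbitrarily chosen bias of 0.7 purely for illustration) computes the assumed average uncertainty, the true Shannon entropy, and their difference, which is the Kullback–Leibler divergence:

```python
import numpy as np

# True distribution P of an unfair coin vs. the (mistakenly assumed) fair coin Q.
p = np.array([0.7, 0.3])   # hypothetical bias, chosen only for illustration
q = np.array([0.5, 0.5])

assumed_uncertainty = -np.sum(p * np.log(q))    # average surprise under the wrong model
true_uncertainty    = -np.sum(p * np.log(p))    # Shannon entropy H(P)
kl_divergence       = np.sum(p * np.log(p / q)) # D_KL(P || Q)

print(assumed_uncertainty - true_uncertainty)   # about 0.0823 nats
print(kl_divergence)                            # the same value
```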
Let ρ be a density matrix. The von Neumann entropy of ρ, which is the quantum mechanical analog of the Shannon entropy, is given by

S(ρ) = −tr(ρ log ρ).

For two density matrices ρ and σ, the quantum relative entropy of ρ with respect to σ is defined by

S(ρ‖σ) = −tr(ρ log σ) − S(ρ) = tr(ρ log ρ) − tr(ρ log σ) = tr ρ (log ρ − log σ).

We see that, when the states are classically related, i.e. ρσ = σρ, the definition coincides with the classical case, in the sense that if ρ and σ are diagonalized in a common orthonormal basis, ρ = Σ_i p_i |i⟩⟨i| and σ = Σ_i q_i |i⟩⟨i|, then S(ρ‖σ) is just the ordinary Kullback–Leibler divergence of the probability vector (p_1, ..., p_n) with respect to (q_1, ..., q_n).
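A minimal numerical sketch of this definition, working in the eigenbases of ρ and σ (the helper name quantum_relative_entropy is ours, not a standard API), might look like:

```python
import numpy as np

def quantum_relative_entropy(rho, sigma, tol=1e-12):
    """S(rho||sigma) = tr(rho log rho) - tr(rho log sigma), in nats.
    Returns np.inf when the support of rho is not contained in the support of sigma."""
    p, U = np.linalg.eigh(rho)       # eigenvalues p_i, eigenvectors as columns of U
    q, V = np.linalg.eigh(sigma)     # eigenvalues q_j, eigenvectors as columns of V
    P = np.abs(U.conj().T @ V) ** 2  # overlaps P_ij = |<u_i, v_j>|^2
    # support check: does rho put weight on eigenvectors of sigma with zero eigenvalue?
    if np.any(P[p > tol][:, q <= tol] > tol):
        return np.inf
    s_rho = np.sum(p[p > tol] * np.log(p[p > tol]))            # tr(rho log rho)
    log_q = np.where(q > tol, np.log(np.where(q > tol, q, 1.0)), 0.0)
    cross = np.sum(p[:, None] * P * log_q[None, :])            # tr(rho log sigma)
    return s_rho - cross

# Commuting (diagonal) states reproduce the classical coin example above.
rho = np.diag([0.7, 0.3])
sigma = np.diag([0.5, 0.5])
print(quantum_relative_entropy(rho, sigma))   # ~0.0823 nats
```

Evaluating in the eigenbases avoids taking the logarithm of a singular matrix and makes the support convention discussed below easy to enforce.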
In general, the support of a matrix M is the orthogonal complement of its kernel, i.e. supp(M) = ker(M)^⊥.
When considering the quantum relative entropy, we assume the convention that −s · log 0 = ∞ for any s > 0. Consequently, S(ρ‖σ) = ∞ whenever the support of ρ is not contained in the support of σ.
Informally, the quantum relative entropy is a measure of our ability to distinguish two quantum states where larger values indicate states that are more different.
Being orthogonal represents the most different quantum states can be.
This is reflected in the quantum relative entropy being infinite for orthogonal quantum states. Following the argument given in the Motivation section, if we erroneously assume the state σ while the true state ρ has support outside the support of σ, we may observe an outcome that the assumed state declares impossible; this is an error from which we cannot recover, and the relative entropy is correspondingly infinite. However, one should be careful not to conclude that the divergence of the quantum relative entropy S(ρ‖σ) means that the two states are far apart in every sense: they can differ by a vanishingly small amount as measured by some norm. For example, if σ is a pure state and ρ_ε mixes in weight ε on a vector orthogonal to σ, then the support of ρ_ε is not contained in that of σ, so S(ρ_ε‖σ) = ∞ for every ε > 0, even though the difference ρ_ε − σ as measured by the trace norm is vanishingly small as ε → 0.
This property of the quantum relative entropy represents a serious shortcoming if not treated with care.
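The sketch below illustrates this caveat with the one-parameter family just described: σ is a pure state and ρ_ε mixes in a small orthogonal component of weight ε (the value of ε is arbitrary), so the trace norm of the difference is tiny while the relative entropy is infinite.

```python
import numpy as np

eps = 1e-6                        # illustrative small parameter
sigma = np.diag([1.0, 0.0])       # a pure state
rho_eps = np.diag([1.0 - eps, eps])

# Trace norm ||rho_eps - sigma||_1 = sum of the absolute eigenvalues = 2*eps.
trace_norm = np.sum(np.abs(np.linalg.eigvalsh(rho_eps - sigma)))
print(trace_norm)                 # 2e-06, vanishingly small

# Yet supp(rho_eps) is not contained in supp(sigma), so with the convention
# -s * log 0 = inf, the relative entropy S(rho_eps || sigma) is infinite for every eps > 0;
# the quantum_relative_entropy sketch above returns np.inf here.
```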
For the classical Kullback–Leibler divergence, it can be shown that

D_KL(P‖Q) = Σ_i p_i log(p_i / q_i) ≥ 0,

and the equality holds if and only if P = Q. Colloquially, this means that the uncertainty calculated using erroneous assumptions is always greater than the real amount of uncertainty.
To show the inequality, we rewrite

−D_KL(P‖Q) = Σ_i p_i log(q_i / p_i).

Notice that log is a concave function. Applying Jensen's inequality, we obtain

−D_KL(P‖Q) = Σ_i p_i log(q_i / p_i) ≤ log(Σ_i p_i (q_i / p_i)) = log(Σ_i q_i) = 0.

Jensen's inequality also states that equality holds if and only if, for all i, q_i = (Σ_j q_j) p_i, i.e. p = q. Klein's inequality states that the quantum relative entropy

S(ρ‖σ) = tr(ρ log ρ) − tr(ρ log σ)

is non-negative in general, and is zero if and only if ρ = σ.
Proof: Let ρ and σ have spectral decompositions

ρ = Σ_i p_i v_i v_i*,  σ = Σ_i q_i w_i w_i*.

So

log ρ = Σ_i (log p_i) v_i v_i*,  log σ = Σ_i (log q_i) w_i w_i*.

Direct calculation gives

S(ρ‖σ) = Σ_i p_i log p_i − Σ_{i,j} p_i P_ij log q_j = Σ_i p_i (log p_i − Σ_j P_ij log q_j),

where P_ij = |v_i* w_j|^2.
Since the matrix (P_ij)_{ij} is a doubly stochastic matrix and −log is a convex function, the above expression is

≥ Σ_i p_i (log p_i − log(Σ_j q_j P_ij)).

Define r_i = Σ_j q_j P_ij. Then (r_i) is a probability distribution.
From the non-negativity of classical relative entropy, we have

S(ρ‖σ) ≥ Σ_i p_i log(p_i / r_i) ≥ 0.

The second part of the claim follows from the fact that, since −log is strictly convex, equality is achieved in the inequality above if and only if (P_ij) is a permutation matrix, which implies ρ = σ, after a suitable labeling of the eigenvectors {v_i} and {w_i}.
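As a sanity check rather than a proof, the sketch below samples a few random full-rank density matrices (the sampling scheme and helper names are ours) and verifies Klein's inequality numerically, using scipy.linalg.logm since full-rank states pose no support issues:

```python
import numpy as np
from scipy.linalg import logm

def rand_density_matrix(n, rng):
    """Random full-rank density matrix, for illustration only."""
    A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    M = A @ A.conj().T
    return M / np.trace(M).real

def qre(rho, sigma):
    """S(rho||sigma) for full-rank states, in nats."""
    return np.trace(rho @ (logm(rho) - logm(sigma))).real

rng = np.random.default_rng(0)
for _ in range(5):
    rho, sigma = rand_density_matrix(3, rng), rand_density_matrix(3, rng)
    assert qre(rho, sigma) >= 0          # Klein's inequality
    assert abs(qre(rho, rho)) < 1e-10    # zero when the states coincide
print("Klein's inequality holds on these samples")
```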
The relative entropy decreases monotonically under completely positive trace preserving (CPTP) operations N on density matrices:

S(N(ρ)‖N(σ)) ≤ S(ρ‖σ).
This inequality is called monotonicity of quantum relative entropy and was first proved by Göran Lindblad.
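A small numerical illustration is sketched below, taking the partial trace over a subsystem as the CPTP map (any other channel would do); the helper names are ours:

```python
import numpy as np
from scipy.linalg import logm

def qre(rho, sigma):
    return np.trace(rho @ (logm(rho) - logm(sigma))).real

def partial_trace_B(rho_AB, dA=2, dB=2):
    """Trace out subsystem B; the partial trace is a CPTP map."""
    return np.einsum('ikjk->ij', rho_AB.reshape(dA, dB, dA, dB))

def rand_state(n, rng):
    A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    M = A @ A.conj().T
    return M / np.trace(M).real

rng = np.random.default_rng(1)
rho_AB, sigma_AB = rand_state(4, rng), rand_state(4, rng)
print(qre(rho_AB, sigma_AB))                                    # relative entropy on AB
print(qre(partial_trace_B(rho_AB), partial_trace_B(sigma_AB)))  # never larger, by monotonicity
```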
Let a composite quantum system have state space

H = ⊗_k H_k

and let ρ be a density matrix acting on H. The relative entropy of entanglement of ρ is defined by

D_REE(ρ) = min_σ S(ρ‖σ),

where the minimum is taken over the family of separable states σ.
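Computing the minimum over all separable states is hard in general, but for pure bipartite states the relative entropy of entanglement is known to reduce to the entropy of entanglement, i.e. the von Neumann entropy of a reduced state. The sketch below evaluates it for a Bell pair under that assumption:

```python
import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)                   # (|00> + |11>)/sqrt(2)
rho_AB = np.outer(bell, bell.conj())
rho_A = np.einsum('ikjk->ij', rho_AB.reshape(2, 2, 2, 2))     # trace out B

evals = np.linalg.eigvalsh(rho_A)
evals = evals[evals > 1e-12]
entanglement = -np.sum(evals * np.log(evals))
print(entanglement)   # log 2 ~ 0.693 nats: the relative entropy of entanglement of a Bell pair
```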
One reason the quantum relative entropy is useful is that several other important quantum information quantities are special cases of it.
Often, theorems are stated in terms of the quantum relative entropy, and these lead to immediate corollaries concerning the other quantities.
Let ρ_AB be the joint state of a bipartite system with subsystem A of dimension n_A and subsystem B of dimension n_B.
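One such special case is the quantum mutual information, S(ρ_AB‖ρ_A ⊗ ρ_B) = S(ρ_A) + S(ρ_B) − S(ρ_AB). The sketch below (helper names ours) checks this identity numerically on a random two-qubit state:

```python
import numpy as np
from scipy.linalg import logm

def vn_entropy(rho):
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return -np.sum(ev * np.log(ev))

def qre(rho, sigma):
    return np.trace(rho @ (logm(rho) - logm(sigma))).real

def ptrace(rho_AB, keep, dA=2, dB=2):
    r = rho_AB.reshape(dA, dB, dA, dB)   # indices (i, k, j, l) = <i k| rho |j l>
    return np.einsum('ikjk->ij', r) if keep == 'A' else np.einsum('ikil->kl', r)

rng = np.random.default_rng(2)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho_AB = A @ A.conj().T
rho_AB /= np.trace(rho_AB).real
rho_A, rho_B = ptrace(rho_AB, 'A'), ptrace(rho_AB, 'B')

mutual_info = vn_entropy(rho_A) + vn_entropy(rho_B) - vn_entropy(rho_AB)
print(mutual_info, qre(rho_AB, np.kron(rho_A, rho_B)))   # the two values agree
```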