The joint quantum entropy generalizes the classical joint entropy to the context of quantum information theory.
Intuitively, given two quantum states $\rho$ and $\sigma$, represented as density operators that are subparts of a quantum system, the joint quantum entropy is a measure of the total uncertainty or entropy of the joint system. It is written $S(\rho,\sigma)$ or $H(\rho,\sigma)$, depending on the notation being used for the von Neumann entropy.
In information theory, for any classical random variable $X$, the Shannon entropy $H(X)$ measures how uncertain we are about the outcome of $X$. For example, if $X$ is a probability distribution concentrated at one point, the outcome of $X$ is certain, and therefore its entropy is $H(X) = 0$. At the other extreme, if $X$ is a uniform probability distribution over $n$ possible values, we are maximally uncertain about its outcome. Indeed, such uniform probability distributions have maximum possible entropy $H(X) = \log_2(n)$; for a fair coin, $n = 2$ and $H(X) = 1$ bit.
In quantum information theory, the notion of entropy is extended from probability distributions to quantum states, or density matrices.
For a state $\rho$, the von Neumann entropy is defined by $S(\rho) = -\operatorname{Tr}(\rho \log \rho)$. Applying the spectral theorem (or, for infinite-dimensional systems, the Borel functional calculus), we see that it generalizes the classical entropy: if $\rho$ has eigenvalues $\lambda_i$, then $S(\rho) = -\sum_i \lambda_i \log \lambda_i$, the Shannon entropy of the spectrum of $\rho$.
A maximally mixed state, the quantum analog of the uniform probability distribution, has maximum von Neumann entropy.
On the other hand, a pure state, or a rank one projection, will have zero von Neumann entropy.
We write the von Neumann entropy $S(\rho)$ (or sometimes $H(\rho)$).
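As a concrete numerical sketch (the helper name and the eigenvalue cutoff below are our own choices, not any library's standard API), the von Neumann entropy can be computed from the spectrum of $\rho$, exactly as the spectral theorem suggests:

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray, base: float = 2.0) -> float:
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues of rho."""
    eigenvalues = np.linalg.eigvalsh(rho)           # rho is Hermitian
    eigenvalues = eigenvalues[eigenvalues > 1e-12]  # drop zeros: 0 log 0 := 0
    return float(-np.sum(eigenvalues * np.log(eigenvalues)) / np.log(base))

# A pure state (rank-one projection) has zero entropy ...
pure = np.array([[1, 0], [0, 0]], dtype=complex)
print(von_neumann_entropy(pure))   # 0.0

# ... while the maximally mixed qubit state I/2 attains the maximum, 1 bit.
mixed = np.eye(2, dtype=complex) / 2
print(von_neumann_entropy(mixed))  # 1.0
```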
Given a quantum system with two subsystems A and B, the term joint quantum entropy simply refers to the von Neumann entropy of the combined system.
In symbols, if the combined system is in state $\rho^{AB}$, the joint quantum entropy is then $S(\rho^{AB}) = -\operatorname{Tr}\left(\rho^{AB} \log \rho^{AB}\right)$. Each of the subsystems has an entropy of its own. The states of the subsystems, $\rho^{A} = \operatorname{Tr}_B\, \rho^{AB}$ and $\rho^{B} = \operatorname{Tr}_A\, \rho^{AB}$, are given by the partial trace operation.
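For finite-dimensional systems, the partial trace can be computed by reshaping the joint density matrix into a tensor and summing over the traced-out indices. Below is a minimal sketch for two qubits, assuming the standard Kronecker-product index ordering; `partial_trace` is an illustrative helper, not a library function:

```python
import numpy as np

def partial_trace(rho_ab: np.ndarray, keep: str = "A") -> np.ndarray:
    """Reduced state of one qubit of a two-qubit density matrix."""
    t = rho_ab.reshape(2, 2, 2, 2)       # indices: a, b, a', b'
    if keep == "A":
        return np.einsum("abcb->ac", t)  # trace out B
    return np.einsum("abad->bd", t)      # trace out A

# Example: for the product state |0><0| (x) I/2 the reductions factor cleanly.
rho = np.kron(np.array([[1, 0], [0, 0]]), np.eye(2) / 2)
print(partial_trace(rho, "A"))  # |0><0|
print(partial_trace(rho, "B"))  # I/2
```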
The classical joint entropy is always at least as large as the entropy of each individual system; this is not the case for the joint quantum entropy. If the quantum state $\rho^{AB}$ exhibits entanglement, the entropy of a subsystem may exceed the joint entropy.
This is equivalent to the fact that the conditional quantum entropy may be negative, while the classical conditional entropy may never be.
If $\rho^{AB}$ is a Bell state, say $\left|\Phi^{+}\right\rangle = \tfrac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right)$, then the total system is a pure state, with entropy 0, while each individual subsystem is a maximally mixed state, with maximum von Neumann entropy $\log_2 2 = 1$ bit. Thus the joint entropy of the combined system is less than that of its subsystems. This is possible because, for entangled states, no definite pure state can be assigned to either subsystem alone.
This phenomenon cannot occur if the state is a separable pure state. In that case, the reduced states of the subsystems are also pure, and therefore all entropies are zero.
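Using the illustrative helpers sketched above, these entropies can be checked numerically for the Bell state $\left|\Phi^{+}\right\rangle$:

```python
import numpy as np

phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # (|00> + |11>)/sqrt(2)
rho_ab = np.outer(phi_plus, phi_plus.conj())                   # pure joint state

print(von_neumann_entropy(rho_ab))                      # 0.0: joint entropy
print(von_neumann_entropy(partial_trace(rho_ab, "A")))  # 1.0: subsystem A
print(von_neumann_entropy(partial_trace(rho_ab, "B")))  # 1.0: subsystem B
```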
The joint quantum entropy $S(\rho^{AB})$ can be used to define the conditional quantum entropy, $S(A|B) = S(\rho^{AB}) - S(\rho^{B})$, and the quantum mutual information, $I(A:B) = S(\rho^{A}) + S(\rho^{B}) - S(\rho^{AB})$. These definitions parallel the use of the classical joint entropy to define the conditional entropy and mutual information.
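With the same illustrative helpers, both quantities can be evaluated for the Bell state of the previous example; the negative conditional entropy makes the quantum-classical contrast explicit:

```python
s_ab = von_neumann_entropy(rho_ab)
s_a = von_neumann_entropy(partial_trace(rho_ab, "A"))
s_b = von_neumann_entropy(partial_trace(rho_ab, "B"))

print(s_ab - s_b)        # S(A|B) = -1.0, impossible classically
print(s_a + s_b - s_ab)  # I(A:B) =  2.0
```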