Entropy measures how much uncertainty there is in the state of a physical system.

Shannon Entropy

A key concept in classical information theory is the Shannon entropy. Suppose we learn the value of a random variable $X$. The Shannon entropy of $X$ quantifies how much information we gain, on average, when we learn the value of $X$. Equivalently, the entropy of $X$ measures the amount of uncertainty about $X$ before we learn its value.

The entropy of $X$ is a function of the probabilities $p_1, \ldots, p_n$ of the different possible values the random variable takes. The Shannon entropy associated with this probability distribution is defined by

$$H(X) \equiv H(p_1, \ldots, p_n) \equiv -\sum_x p_x \log p_x,$$

where the logarithm is taken to base $2$, meaning the entropy is measured in 'bits'.
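As a quick numerical illustration, here is a minimal sketch in Python with NumPy (the helper name `shannon_entropy` is ours, not from any particular library): a fair coin carries one bit of entropy, a biased coin less, and a certain outcome none.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) in bits; terms with p_x = 0 contribute nothing."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 log 0 = 0
    return -np.sum(p * np.log2(p))

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.47 bits
print(shannon_entropy([1.0, 0.0]))   # certain outcome: 0 bits
```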

Conditional Entropy and Mutual Information

The joint entropy of a pair of random variables $X$ and $Y$ is defined by

$$H(X, Y) \equiv -\sum_{x, y} p(x, y) \log p(x, y).$$

The joint entropy measures the total uncertainty about the pair $(X, Y)$. Suppose we know the value of $Y$, so we have acquired $H(Y)$ bits of information about the pair. The remaining uncertainty about the pair is associated with the remaining lack of knowledge about $X$; the entropy of $X$ conditional on knowing $Y$ is therefore

$$H(X|Y) \equiv H(X, Y) - H(Y).$$

The mutual information content of $X$ and $Y$ measures how much information $X$ and $Y$ have in common. Suppose we add the information content of $X$, $H(X)$, to the information content of $Y$, $H(Y)$. Information common to $X$ and $Y$ will have been counted twice in this sum, while information not common to both will have been counted exactly once. Subtracting off the joint information of the pair, $H(X, Y)$, we obtain the mutual information of $X$ and $Y$:

$$H(X : Y) \equiv H(X) + H(Y) - H(X, Y).$$

Note the useful equality $H(X : Y) = H(X) - H(X|Y)$ relating the conditional entropy and mutual information.
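To make the bookkeeping concrete, the following sketch (NumPy again, with our own variable names and an illustrative $2 \times 2$ joint distribution) computes the joint, marginal, conditional, and mutual quantities and checks the identities above.

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]                                 # convention: 0 log 0 = 0
    return -np.sum(p * np.log2(p))

# Joint distribution p(x, y): rows index X, columns index Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

H_XY = shannon_entropy(p_xy)                     # joint entropy H(X, Y)
H_X  = shannon_entropy(p_xy.sum(axis=1))         # marginal entropy H(X)
H_Y  = shannon_entropy(p_xy.sum(axis=0))         # marginal entropy H(Y)

H_X_given_Y = H_XY - H_Y                         # H(X|Y) = H(X,Y) - H(Y)
I_XY        = H_X + H_Y - H_XY                   # H(X:Y) = H(X)+H(Y)-H(X,Y)

print(H_X_given_Y, I_XY)                         # ~0.722, ~0.278
assert np.isclose(I_XY, H_X - H_X_given_Y)       # H(X:Y) = H(X) - H(X|Y)
```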

Von Neumann Entropy

Quantum states are described in a similar fashion, with density operators replacing probability distributions. Von Neumann defined the entropy of a quantum state $\rho$ by the formula

$$S(\rho) \equiv -\operatorname{tr}(\rho \log \rho),$$

where the logarithm is again taken to base $2$. If $\lambda_x$ are the eigenvalues of $\rho$, then the von Neumann definition can be re-expressed as

$$S(\rho) = -\sum_x \lambda_x \log \lambda_x,$$

where we define $0 \log 0 \equiv 0$, as for the Shannon entropy.
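The eigenvalue form gives a direct way to compute $S(\rho)$ in code: diagonalize $\rho$ and apply the Shannon formula to its spectrum. A minimal sketch (our helper `von_neumann_entropy`) checks the two extreme qubit cases: a pure state has zero entropy, while the maximally mixed state $I/2$ has one bit.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)     # rho is Hermitian
    evals = evals[evals > 1e-12]        # drop zero eigenvalues (0 log 0 = 0)
    return -np.sum(evals * np.log2(evals))

pure  = np.array([[1.0, 0.0], [0.0, 0.0]])   # |0><0|, a pure state
mixed = np.eye(2) / 2                        # maximally mixed qubit, I/2

print(von_neumann_entropy(pure))    # 0 bits
print(von_neumann_entropy(mixed))   # 1.0 bit
```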

Quantum Relative Entropy

Suppose $\rho$ and $\sigma$ are two density operators. The relative entropy of $\rho$ to $\sigma$ is defined by

$$S(\rho \,\|\, \sigma) \equiv \operatorname{tr}(\rho \log \rho) - \operatorname{tr}(\rho \log \sigma).$$
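A sketch of the definition using SciPy's matrix logarithm (`relative_entropy` is our name; for simplicity it assumes both states are full rank, since $S(\rho \,\|\, \sigma)$ is defined to be $+\infty$ when the support of $\rho$ is not contained in the support of $\sigma$):

```python
import numpy as np
from scipy.linalg import logm

def relative_entropy(rho, sigma):
    """S(rho||sigma) = tr(rho log rho) - tr(rho log sigma), in bits.
    For simplicity, assumes rho and sigma are full rank."""
    log2m = lambda m: logm(m) / np.log(2)    # matrix logarithm in base 2
    return np.real(np.trace(rho @ (log2m(rho) - log2m(sigma))))

rho   = np.diag([0.75, 0.25])   # a mixed qubit state
sigma = np.eye(2) / 2           # the maximally mixed state
print(relative_entropy(rho, sigma))   # ~0.189, non-negative
print(relative_entropy(rho, rho))     # 0.0 when rho == sigma
```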

Klein’s Inequality

Theorem (Klein's inequality): The quantum relative entropy is non-negative,

$$S(\rho \,\|\, \sigma) \geq 0,$$

with equality if and only if $\rho = \sigma$.
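As a (non-rigorous) sanity check rather than a proof, the sketch below evaluates the relative entropy on randomly generated density matrices and verifies it never dips below zero; `random_density_matrix` is our own construction.

```python
import numpy as np
from scipy.linalg import logm

def relative_entropy(rho, sigma):
    log2m = lambda m: logm(m) / np.log(2)
    return np.real(np.trace(rho @ (log2m(rho) - log2m(sigma))))

def random_density_matrix(d, rng):
    """A A†, normalized to unit trace: a random full-rank density matrix."""
    a = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = a @ a.conj().T
    return m / np.trace(m).real

rng = np.random.default_rng(0)
for _ in range(100):
    rho = random_density_matrix(2, rng)
    sigma = random_density_matrix(2, rng)
    assert relative_entropy(rho, sigma) >= -1e-9   # Klein: S(rho||sigma) >= 0
print("Klein's inequality held on 100 random pairs")
```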

Measurements and Entropy

The effect of a measurement on the entropy depends on the type of measurement we make. Suppose, for example, that a projective measurement described by the projectors $P_i$ is performed on a quantum system, but we never learn the result of the measurement. If the state of the system before the measurement was $\rho$, then the state after is given by

$$\rho' = \sum_i P_i \rho P_i.$$

Theorem (projective measurements increase entropy): Suppose $P_i$ is a complete set of orthogonal projectors and $\rho$ is a density operator. Then the entropy of the state $\rho' = \sum_i P_i \rho P_i$ of the system after the measurement is at least as great as the original entropy,

$$S(\rho') \geq S(\rho),$$

with equality if and only if $\rho = \rho'$.
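A small sketch of the theorem in action (our own construction): measuring the pure state $|+\rangle$ in the computational basis, without recording the outcome, takes the entropy from zero bits to one.

```python
import numpy as np

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return -np.sum(evals * np.log2(evals))

# Pure state |+> = (|0> + |1>)/sqrt(2), so S(rho) = 0.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus)

# Projective measurement in the computational basis, result not recorded:
# rho' = sum_i P_i rho P_i, with P_0 = |0><0| and P_1 = |1><1|.
P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
rho_after = sum(Pi @ rho @ Pi for Pi in P)

print(von_neumann_entropy(rho))         # 0 bits before the measurement
print(von_neumann_entropy(rho_after))   # 1.0 bit after: entropy increased
```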