A Hermitian matrix \(M \in \C^{d \times d}\) is said to be positive semidefinite, denoted \(M \succeq 0\), if it holds that \(\bra{u}M\ket{u} \geq 0\) for all \(\ket{u} \in \C^d\).
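As an informal numerical illustration of this definition (not part of the problem set), the sketch below builds a positive semidefinite matrix by shifting a random Hermitian matrix so its smallest eigenvalue is zero, then checks that the quadratic form \(\bra{u}M\ket{u}\) is nonnegative on random vectors. It assumes NumPy and the spectral characterization used later in the problem set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a Hermitian matrix M = C + C^* from a random complex matrix C.
C = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
M = C + C.conj().T

# Shift by a multiple of the identity so all eigenvalues are >= 0,
# which makes the matrix positive semidefinite.
M_psd = M - np.linalg.eigvalsh(M).min() * np.eye(4)

# The quadratic form <u|M_psd|u> is real and nonnegative for any u.
quads = []
for _ in range(100):
    u = rng.standard_normal(4) + 1j * rng.standard_normal(4)
    quads.append((u.conj() @ M_psd @ u).real)
min_quad = min(quads)
```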
[3 pts] Prove that \(M \succeq 0\) if and only if \(\bra{u}M\ket{u} \geq 0\) holds for all unit vectors \(\ket{u} \in \C^d\).
[3 pts] Let \(M \in \C^{d \times d}\) be a diagonal matrix. Verify that \(M\) is Hermitian if and only if all its diagonal entries are real, and \(M \succeq 0\) if and only if all its diagonal entries are nonnegative.
[3 pts] Consider any matrix \(A \in \C^{d \times d'}\). Show that \(A^* A\) is Hermitian and \(A^*A \succeq 0\).
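A quick numerical check of the claim in this problem (illustrative only, using NumPy): for a random rectangular \(A\), the Gram matrix \(A^*A\) should be Hermitian with nonnegative eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(1)

# A in C^{d x d'} with d = 5, d' = 3, so A^* A is a 3 x 3 matrix.
A = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))
G = A.conj().T @ A

hermitian_gap = np.abs(G - G.conj().T).max()  # 0 iff G is Hermitian
min_eig = np.linalg.eigvalsh(G).min()         # >= 0 iff G is PSD
```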
[3 pts] Recall the Frobenius inner product on matrices \(A,B \in \C^{d\times d}\):
\[\langle A,B\rangle \seteq \tr(A^*B).\]Prove that if \(A \succeq 0\) and \(B \succeq 0\), then \(\langle A,B\rangle \geq 0\).
You may use the fact that every Hermitian matrix \(M \in \C^{d \times d}\) can be written as \(M = \sum_{i=1}^d \lambda_i \ket{u_i}\bra{u_i}\) where \(\lambda_1,\ldots,\lambda_d\) are the real eigenvalues of \(M\) and \(\ket{u_1},\ldots,\ket{u_d}\) is an orthonormal basis of eigenvectors for \(M\).
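The following sketch (not a proof, just a NumPy spot check) generates two positive semidefinite matrices via the Gram construction from the previous problem and confirms that their Frobenius inner product is real and nonnegative.

```python
import numpy as np

rng = np.random.default_rng(2)

def random_psd(d):
    # X^* X is always positive semidefinite (Gram-matrix construction).
    X = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    return X.conj().T @ X

A, B = random_psd(4), random_psd(4)

# Frobenius inner product <A, B> = tr(A^* B).
ip = np.trace(A.conj().T @ B)
val = ip.real  # the imaginary part vanishes up to roundoff
```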
Suppose that \(\rho^A\) is a $d \times d$ density matrix. Write $\rho^A$ in its eigenbasis:
\[\rho^A = \sum_{j=1}^d \lambda_j \ket{v_j}\bra{v_j}\,.\]Let \(\mathbb{C}^B\) denote a \(d\)-dimensional Hilbert space with basis \(\ket{1},\ket{2},\ldots,\ket{d}\) and define
\[\begin{align*} \ket{u^{AB}} & \seteq \sum_{j=1}^d \sqrt{\lambda_j} \ket{v_j} \ket{j} \\ \rho^{AB} &\seteq \ket{u^{AB}} \bra{u^{AB}} \end{align*}\]Show that \(\rho^A = \tr_B(\rho^{AB})\). In other words, the mixed state \(\rho^A\) can be seen as arising from taking a joint system in the pure state \(\ket{u^{AB}}\) and then discarding the \(B\)-part of the space.
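The purification construction above can be checked numerically. The sketch below (illustrative, assuming NumPy) builds a random density matrix, forms \(\ket{u^{AB}} = \sum_j \sqrt{\lambda_j}\ket{v_j}\ket{j}\), and verifies that the partial trace over \(B\) recovers \(\rho^A\).

```python
import numpy as np

rng = np.random.default_rng(3)
d = 3

# A random density matrix: positive semidefinite with unit trace.
X = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
rho_A = X @ X.conj().T
rho_A /= np.trace(rho_A).real

# Purification |u^{AB}> = sum_j sqrt(lambda_j) |v_j> |j>.
lam, V = np.linalg.eigh(rho_A)  # columns of V are eigenvectors |v_j>
u_AB = sum(np.sqrt(max(lam[j], 0.0)) * np.kron(V[:, j], np.eye(d)[j])
           for j in range(d))
rho_AB = np.outer(u_AB, u_AB.conj())

# Partial trace over B: (tr_B rho)_{ik} = sum_j rho_{(i,j),(k,j)}.
rho_A_rec = np.einsum('ijkj->ik', rho_AB.reshape(d, d, d, d))

err = np.abs(rho_A_rec - rho_A).max()
```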
Recall that if $X$ is a classical random variable such that \(p_i \seteq \Pr[X=i]\) for \(i=1,2,\ldots,d\), then the Shannon entropy of $X$ is defined by
\[H(X) \seteq \sum_{i=1}^d p_i \log \frac{1}{p_i}.\]This is a measure of the uncertainty of \(X\) measured in bits (or measured in “nats” if we use the natural logarithm).
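The Shannon entropy above is short enough to compute directly; here is a small, illustrative implementation in plain Python (with the usual convention that outcomes of probability zero contribute nothing).

```python
import math

def shannon_entropy(p, base=2):
    """H(X) = sum_i p_i log(1/p_i); terms with p_i = 0 contribute 0."""
    return sum(pi * math.log(1.0 / pi, base) for pi in p if pi > 0)

h_fair = shannon_entropy([0.5, 0.5])  # a fair coin: one bit of uncertainty
h_sure = shannon_entropy([1.0, 0.0])  # a certain outcome: zero entropy
```

With the natural logarithm as base, the same sum gives the entropy in nats rather than bits.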
Define the von Neumann entropy of a \(d \times d\) density matrix \(\rho\) by
\[\mathcal{S}(\rho) \seteq \sum_{j=1}^d \lambda_j \log \frac{1}{\lambda_j}\,,\]where \(\lambda_1,\ldots,\lambda_d\) are the eigenvalues of \(\rho\) (with the convention \(0 \log \frac{1}{0} = 0\)). Suppose that \(\rho = \ket{u^{AB}} \bra{u^{AB}}\) is a pure state and \(\rho^A = \tr_B(\rho), \rho^B = \tr_A(\rho)\). Show that
\[\mathcal{S}(\rho^A) = \mathcal{S}(\rho^B)\,.\]Note that the two states don’t necessarily have the same dimension, so they could each have a different number of eigenvalues.
[ Hint: Show first that if \(U\) is a \(d \times d\) matrix, then \(UU^*\) and \(U^*U\) have the same non-zero eigenvalues. ]
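The claim can be spot-checked numerically even when the two subsystems have different dimensions. The sketch below (illustrative, assuming NumPy) writes a random pure bipartite state as a coefficient matrix \(U\), so that \(\rho^A = UU^*\) and \(\rho^B = (U^*U)^{\mathsf T}\), and compares the two entropies.

```python
import numpy as np

rng = np.random.default_rng(4)
dA, dB = 2, 5  # deliberately different dimensions

# Random unit vector |u> = sum_{i,j} U_{ij} |i>|j>, stored as a dA x dB matrix.
U = rng.standard_normal((dA, dB)) + 1j * rng.standard_normal((dA, dB))
U /= np.linalg.norm(U)

# Reduced states: rho^A = U U^*, and rho^B = (U^* U)^T.
rho_A = U @ U.conj().T
rho_B = (U.conj().T @ U).T

def vn_entropy(rho):
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]  # zero eigenvalues contribute nothing
    return float(-(lam * np.log2(lam)).sum())

gap = abs(vn_entropy(rho_A) - vn_entropy(rho_B))
```

The agreement reflects exactly the hint: \(UU^*\) and \(U^*U\) share their nonzero eigenvalues.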
[5 pts] In classical probability theory, if \(A\) and \(B\) are two random variables, one defines the entropy of \(A\) conditioned on \(B\) by the formula
\[H(A \mid B) \seteq \sum_{x} \Pr[B=x] \cdot H(A \mid \{B=x\})\,,\]where \(A \mid \{B=x\}\) is the random variable \(A\) conditioned on the event \(\{B=x\}\). This quantity is nonnegative because the entropy \(H(A \mid \{B=x\})\) is always nonnegative.
Prove that
\[H(A \mid B) = H(A,B) - H(B)\,.\]In particular, the right-hand side is always nonnegative, and therefore
\[H(B) \leq H(A,B)\,.\]This asserts the relatively obvious fact that the pair of random variables \(\{A,B\}\) has at least as much uncertainty as the single random variable \(B\). In other words, it is easier to predict $B$ than to simultaneously predict both $A$ and $B$.
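The chain rule \(H(A \mid B) = H(A,B) - H(B)\) can be verified on a small joint distribution; the values below are an arbitrary example, not part of the problem.

```python
import math

def H(dist):
    """Shannon entropy (bits) of a probability dictionary."""
    return sum(p * math.log2(1.0 / p) for p in dist.values() if p > 0)

# A small joint distribution Pr[A=a, B=x].
joint = {('a', 0): 0.3, ('a', 1): 0.1, ('b', 0): 0.2, ('b', 1): 0.4}

# Marginal distribution of B.
pB = {}
for (a, x), p in joint.items():
    pB[x] = pB.get(x, 0.0) + p

# H(A|B) = sum_x Pr[B=x] * H(A | B=x), from the conditional distributions.
h_cond = 0.0
for x, px in pB.items():
    cond = {a: p / px for (a, y), p in joint.items() if y == x}
    h_cond += px * H(cond)

gap = abs(h_cond - (H(joint) - H(pB)))
```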
[5 pts] You will show that this fails dramatically in the quantum setting, where the conditional entropy can be negative! Using Problem 2, show that there is a state \(\rho^{AB}\) with
\[\mathcal{S}(\rho^B) > \mathcal{S}(\rho^{AB})\,,\]where \(\rho^B \seteq \tr_A(\rho^{AB})\). In other words, the entropy of the subsystem is actually bigger than the entropy of the full system.
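Without giving the proof away, the phenomenon itself is easy to see numerically. The sketch below (illustrative, assuming NumPy) takes the Bell state \(\ket{u^{AB}} = (\ket{00}+\ket{11})/\sqrt{2}\): the joint state is pure, so its entropy is zero, while the reduced state on \(B\) is maximally mixed with one full bit of entropy.

```python
import numpy as np

# Bell state |u^{AB}> = (|00> + |11>)/sqrt(2): a pure two-qubit state.
u = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_AB = np.outer(u, u)

# Partial trace over A: (tr_A rho)_{jl} = sum_i rho_{(i,j),(i,l)}.
rho_B = np.einsum('ijil->jl', rho_AB.reshape(2, 2, 2, 2))

def vn_entropy(rho):
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-(lam * np.log2(lam)).sum())

S_AB = vn_entropy(rho_AB)  # 0: a pure state has the single eigenvalue 1
S_B = vn_entropy(rho_B)    # 1: rho_B = I/2 has eigenvalues 1/2, 1/2
```

So \(\mathcal{S}(\rho^B) - \mathcal{S}(\rho^{AB}) = 1\) bit, i.e., the "conditional entropy" \(\mathcal{S}(\rho^{AB}) - \mathcal{S}(\rho^B)\) is negative for this state.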