The matrix,
\[C = \begin{pmatrix} -1 & -5 \\ 4 & 7 \end{pmatrix}\]does not diagonalize over \(\mathbb R\), but it does over \(\mathbb C\) as,
\[C = \begin{pmatrix}-2-i & -2+i \\ 2 & 2\end{pmatrix} \begin{pmatrix}3-2i & 0 \\ 0 & 3+2i\end{pmatrix} \begin{pmatrix}-2-i & -2+i \\ 2 & 2\end{pmatrix}^{-1}\]
The matrix,
\[B = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix}\]diagonalizes over \(\mathbb R\) as,
\[B = \begin{pmatrix}-1&-1&1\\0&1&1\\1&0&1\end{pmatrix} \begin{pmatrix}0&0&0\\0&0&0\\0&0&3\end{pmatrix} \begin{pmatrix}-1&-1&1\\0&1&1\\1&0&1\end{pmatrix}^{-1}\]
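Both factorizations can be confirmed numerically; here is a quick sketch with NumPy (the variable names for the eigenvector and eigenvalue matrices are mine):

```python
import numpy as np

# B = the all-ones 3x3 matrix, diagonalized as B = P D P^{-1}
B = np.ones((3, 3))
P = np.array([[-1, -1, 1],
              [ 0,  1, 1],
              [ 1,  0, 1]], dtype=float)
D = np.diag([0.0, 0.0, 3.0])
assert np.allclose(P @ D @ np.linalg.inv(P), B)

# C diagonalizes over the complex numbers as C = S E S^{-1}
C = np.array([[-1, -5],
              [ 4,  7]], dtype=complex)
S = np.array([[-2 - 1j, -2 + 1j],
              [ 2,       2     ]])
E = np.diag([3 - 2j, 3 + 2j])
assert np.allclose(S @ E @ np.linalg.inv(S), C)
```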
Given the following information about the \(4 \times 4\) real matrix \(A\), answer whether: (I) \(A\) diagonalizes, (II) \(A\) does not diagonalize, or (III) there is not enough information to tell.
\(A\) is invertible.
\(A\) has 4 distinct real eigenvalues.
The two eigenspaces of \(A\) are 3-dimensional and 1-dimensional.
\(A^{5} = 0\).
There is not enough information (III). For example,
\[\begin{pmatrix}2 & 0 \\ 0 & 2\end{pmatrix}\]diagonalizes (in fact, it is diagonal), but
\[\begin{pmatrix}2 & 1 \\ 0 & 2\end{pmatrix}\]does not, although both of these matrices are invertible.
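The difference between these two matrices can be checked numerically by computing eigenspace dimensions (a sketch, not part of the original solution; the matrix names are mine):

```python
import numpy as np

# Both matrices are invertible (determinant 4), but only the first diagonalizes.
A1 = np.array([[2.0, 0.0],
               [0.0, 2.0]])
A2 = np.array([[2.0, 1.0],
               [0.0, 2.0]])

# Each has the single eigenvalue 2; its eigenspace is the null space of A - 2I,
# whose dimension is 2 - rank(A - 2I).
dim1 = 2 - np.linalg.matrix_rank(A1 - 2 * np.eye(2))  # 2: enough eigenvectors
dim2 = 2 - np.linalg.matrix_rank(A2 - 2 * np.eye(2))  # 1: too few to diagonalize
assert (dim1, dim2) == (2, 1)
```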
Yes, it diagonalizes (I). If a \(4 \times 4\) matrix has 4 distinct eigenvalues, it has 4 linearly independent eigenvectors (one for each eigenvalue, since eigenvectors for distinct eigenvalues are linearly independent), which is exactly what a \(4 \times 4\) matrix needs to be diagonalizable.
Yes, it diagonalizes (I). A 3-dimensional eigenspace contributes 3 linearly independent eigenvectors; combined with one eigenvector from the 1-dimensional eigenspace (eigenvectors from different eigenspaces are linearly independent), that gives 4 linearly independent eigenvectors, which is all that is needed for a \(4 \times 4\) matrix to diagonalize.
There is not enough information (III). All that we can conclude from \(A^5 = 0\) is that every eigenvalue of \(A\) is zero: if \(\lambda \ne 0\) were an eigenvalue with eigenvector \(\vec v\), then \(A\vec v = \lambda \vec v\), so \(A^5 \vec v = \lambda^5 \vec v \ne \vec 0\), contradicting \(A^5 = 0\). But this tells us nothing about the eigenvectors, so we cannot tell whether or not \(A\) diagonalizes.
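Both outcomes really do occur for matrices with \(A^5 = 0\); here is a hedged numerical sketch (the example matrices are mine): the \(4 \times 4\) zero matrix diagonalizes trivially, while a \(4 \times 4\) nilpotent Jordan block does not.

```python
import numpy as np

# Two 4x4 matrices satisfying A^5 = 0:
Z = np.zeros((4, 4))               # the zero matrix (already diagonal)
N = np.diag([1.0, 1.0, 1.0], k=1)  # nilpotent Jordan block (ones above the diagonal)
assert np.allclose(np.linalg.matrix_power(N, 5), 0)

# The only eigenvalue of each is 0, with eigenspace dimension 4 - rank(A):
assert 4 - np.linalg.matrix_rank(Z) == 4  # Z has 4 independent eigenvectors: diagonalizes
assert 4 - np.linalg.matrix_rank(N) == 1  # N has only 1: does not diagonalize
```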
Show that the following matrix is orthogonal:
\[M = \begin{pmatrix} \tfrac{1}{\sqrt 3} & \tfrac{1}{\sqrt 3} & -\tfrac{1}{\sqrt 3} \\ \tfrac{1}{\sqrt 2} & -\tfrac{1}{\sqrt 2} & 0 \\ \tfrac{1}{\sqrt 6} & \tfrac{1}{\sqrt 6} & \tfrac{\sqrt 2}{\sqrt 3} \end{pmatrix}\]Note: A prior version of this question had a typo—if you prove that the old matrix was NOT orthogonal, you are also eligible for full credit!
To check that a matrix is orthogonal, you just have to check that its inverse is its transpose. It is hard to find an inverse in general (Gaussian elimination), but it is easy to check whether a given matrix \(B\) is the inverse of the matrix \(M\): just check whether \(MB = I\).
So, we check if \(M^T\) is the inverse of \(M\) by computing \(MM^T\); we find that \(MM^T = I\), so \(M^T = M^{-1}\), meaning that \(M\) is orthogonal.
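The computation \(MM^T = I\) is easy to confirm numerically; a quick NumPy check (not required for the solution):

```python
import numpy as np

s2, s3, s6 = np.sqrt(2), np.sqrt(3), np.sqrt(6)
M = np.array([[1/s3,  1/s3, -1/s3 ],
              [1/s2, -1/s2,  0.0  ],
              [1/s6,  1/s6,  s2/s3]])

# M is orthogonal iff M M^T = I (equivalently, its rows are orthonormal).
assert np.allclose(M @ M.T, np.eye(3))
```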
A matrix \(Q\) has characteristic polynomial \(\det(\lambda I - Q) = p_Q(\lambda) = \lambda^3 - 2\lambda^2 + \lambda + 5\).
Calculate \(\det(Q)\).
\(Q\) has exactly one real eigenvalue. Does \(Q\) diagonalize over \(\mathbb C\)?
Since the largest power of \(\lambda\) is 3, we deduce that \(Q\) is a \(3 \times 3\) matrix. We plug \(\lambda = 0\) into the characteristic polynomial to compute,
\[5 = \det(0I - Q) = \det(-Q) = (-1)^3\det(Q),\]so \(\det(Q) = -5\).
Because the characteristic polynomial has real coefficients, its non-real roots come in complex-conjugate pairs, so \(Q\) has one real eigenvalue and a pair of distinct complex-conjugate eigenvalues. So \(Q\) has three distinct eigenvalues, which means that it must diagonalize over \(\mathbb C\).
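Both conclusions can be checked from the characteristic polynomial alone; a numerical sketch (the roots below are computed, not given in the problem):

```python
import numpy as np

# Roots of p_Q(lambda) = lambda^3 - 2 lambda^2 + lambda + 5
roots = np.roots([1, -2, 1, 5])

# det(Q) is the product of the eigenvalues
assert np.isclose(np.prod(roots), -5)

# Exactly one real root; the other two form a complex-conjugate pair
real = [r for r in roots if np.isclose(r.imag, 0)]
assert len(real) == 1
```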
A matrix \(R\) has characteristic polynomial \(\det(\lambda I - R) = p_R(\lambda) = \lambda^2 - 7\lambda + 12\).
Calculate \(\det(R)\).
Calculate \({\rm tr}(R)\).
Unlike in question 4, we can actually factor this characteristic polynomial, as \((\lambda -3)(\lambda - 4)\), whence the eigenvalues of \(R\) are \(\lambda = 3,4\). The determinant is the product of the eigenvalues, \(\det(R) = 3(4) = 12\).
The trace is the sum of the eigenvalues, \({\rm tr}(R) = 3+4 = 7\).
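As with the previous question, a quick NumPy check of both answers (a sketch, not part of the solution):

```python
import numpy as np

# Roots of p_R(lambda) = lambda^2 - 7 lambda + 12 are the eigenvalues 3 and 4
roots = np.sort(np.roots([1, -7, 12]))
assert np.allclose(roots, [3, 4])

# Determinant = product of eigenvalues; trace = sum of eigenvalues
assert np.isclose(np.prod(roots), 12)
assert np.isclose(np.sum(roots), 7)
```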