Find the eigenvalues and their corresponding eigenvectors for the matrix
(Hint. One of the eigenvalues is , and you can use polynomial long division to factor the characteristic polynomial of . You are also welcome to use a calculator to find the roots of the characteristic polynomial, but all other work must be shown.)
This is a pretty tricky characteristic polynomial to compute, but the rest is the same work as always. To find the eigenvalues, we’re trying to solve,
So we get the eigenvalues .
By solving the equation for our eigenvalues, we find the eigenvector
for the eigenvalue , and we find the linearly independent eigenvectors
for the eigenvalue .
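Since the matrix itself isn't reproduced here, the following sketch uses a placeholder 3×3 matrix (my own choice, not the one from the problem) to illustrate the workflow the hint describes: compute the characteristic polynomial's coefficients, divide out the known eigenvalue by polynomial long division, and take the roots of the quotient.

```python
import numpy as np

# Placeholder 3x3 matrix chosen for illustration; substitute the
# actual matrix from the problem.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# Coefficients of det(lambda*I - A), highest degree first.
p = np.poly(A)

# Suppose (as in the hint) we are told lambda = 3 is an eigenvalue:
# divide the characteristic polynomial by (lambda - 3).
quotient, remainder = np.polydiv(p, np.array([1.0, -3.0]))
assert np.allclose(remainder, 0.0)   # confirms 3 really is a root

# The remaining eigenvalues are the roots of the quotient.
other_eigs = np.roots(quotient)
print(sorted(np.concatenate([[3.0], other_eigs])))
```

Eigenvectors then come from solving the null-space equation for each eigenvalue, as in the written solution.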
Find the eigenvalues and their corresponding eigenvectors for the matrix
This matrix has a nice characteristic polynomial. To find the eigenvalues, we solve
So we get the eigenvalues .
By solving the equation for our eigenvalues, we find the eigenvector
for the eigenvalue , we find the eigenvector
for the eigenvalue , we find the eigenvector
for the eigenvalue , and we find the eigenvector
for the eigenvalue .
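The same workflow can be checked numerically. The matrix below is a placeholder with four distinct eigenvalues (the actual matrix from the problem isn't shown here); each column of the matrix returned by `np.linalg.eig` is an eigenvector for the corresponding eigenvalue.

```python
import numpy as np

# Placeholder 4x4 matrix with four distinct eigenvalues; substitute
# the actual matrix from the problem.
A = np.diag([1.0, 2.0, 3.0, 4.0]) + np.triu(np.ones((4, 4)), k=1)

eigenvalues, eigenvectors = np.linalg.eig(A)

# Column i of `eigenvectors` is an eigenvector for eigenvalues[i].
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(np.sort(eigenvalues))
```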
Find the eigenvalues and their corresponding eigenvectors for the matrix
(Hint. They may be complex.)
To find the eigenvalues for this matrix, we have to allow for complex solutions. We’re solving the equation,
So we get the eigenvalues .
By solving the equation for our eigenvalues, we find the eigenvector
for the eigenvalue , and we find the eigenvector
for the eigenvalue .
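A standard example of this phenomenon (not the matrix from the problem, which isn't reproduced here) is a 90-degree rotation matrix: it moves every real vector off its own line, so its eigenvalues and eigenvectors are necessarily complex.

```python
import numpy as np

# A 90-degree rotation of the plane; its characteristic polynomial is
# lambda^2 + 1, so the eigenvalues are +i and -i.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column of `eigenvectors` is a (complex) eigenvector.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print(eigenvalues)  # the two eigenvalues i and -i (order may vary)
```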
Any $2 \times 2$ matrix looks like
$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix},$$
and has characteristic polynomial
$$p(\lambda) = \det(\lambda I - A) = \lambda^2 - (a + d)\lambda + (ad - bc).$$
If we plug in $A$ for $\lambda$, we find:
$$p(A) = A^2 - (a + d)A + (ad - bc)I = \begin{pmatrix} a^2 + bc & ab + bd \\ ac + cd & bc + d^2 \end{pmatrix} - \begin{pmatrix} a^2 + ad & ab + bd \\ ac + cd & ad + d^2 \end{pmatrix} + \begin{pmatrix} ad - bc & 0 \\ 0 & ad - bc \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}.$$
So, surprisingly, $p(A)$ is the zero matrix.
(Bonus) Let be a matrix, and its characteristic polynomial. Based on your answer to part (a), what do you think might be? Can you think of reasons your answer makes sense?
Surprisingly, our answer for (a) is true for any matrix (this fact is known as the Cayley–Hamilton theorem):
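As a numerical sanity check of this claim, the sketch below evaluates the characteristic polynomial at a random matrix (using Horner's rule, with the constant term multiplied by the identity) and confirms the result is the zero matrix, up to floating-point error.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))

# Coefficients of the characteristic polynomial, highest degree first.
coeffs = np.poly(A)

# Evaluate p(A) by Horner's rule; the constant term multiplies I.
p_of_A = np.zeros((n, n))
for c in coeffs:
    p_of_A = p_of_A @ A + c * np.eye(n)

# Cayley-Hamilton: p(A) is the zero matrix (up to rounding).
assert np.allclose(p_of_A, 0.0, atol=1e-8)
```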
Two linear transformations, and , have the same eigenvalues with multiplicities. Additionally, every eigenvector of for eigenvalue is also an eigenvector of for eigenvalue .
Are and the same linear transformation? Explain or give a counterexample.
This question was trickier than intended, so I didn’t count its points in the total for this assignment. Also, I gave credit if you wrote something down for this problem indicating that you thought about it.
Certainly, this is true if and both have a full set of eigenvectors. In that case there is a single basis of eigenvectors shared by and , so that both diagonalize to the same diagonal matrix of eigenvalues
If we were just thinking of and as matrices, and they don’t have enough eigenvectors, they might not be the same. For example, the matrices
have the same eigenvalues and the same eigenvector, but are different matrices. If they are matrix representations of and for the standard basis, then we could say and are different linear transformations. But what if we could change bases? If the second matrix were instead a representation for the linear transformation in the basis
then the two linear transformations are the same.
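The change-of-basis point can be sketched concretely. Since the matrices above aren't reproduced here, the example below uses a hypothetical pair of my own: a matrix M1 in the standard basis, and M2, the representation of the same linear transformation in the basis {(1, 0), (0, 2)}, whose vectors form the columns of P. The two matrices differ, but they share their eigenvalues and describe one transformation.

```python
import numpy as np

# Hypothetical illustration: M1 in the standard basis, and the same
# transformation written in the basis {(1, 0), (0, 2)} (columns of P).
M1 = np.array([[1.0, 1.0],
               [0.0, 1.0]])
P = np.array([[1.0, 0.0],
              [0.0, 2.0]])

# Representation of the same transformation in the new basis.
M2 = np.linalg.inv(P) @ M1 @ P
print(M2)  # [[1, 2], [0, 1]] -- a different matrix, same transformation

# Similar matrices share their eigenvalues.
assert np.allclose(np.linalg.eigvals(M1), np.linalg.eigvals(M2))
```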
In fact, we don’t even need to worry about bases: it is still not true that the two linear transformations must be the same, provided they do not have a full set of eigenvectors. Consider the linear transformations which, in the standard basis, have matrix representations:
and,
Then and both have the eigenvalues and have for eigenvectors only the standard vectors
However, while (in fact, , but the point remains), so they cannot represent the same linear transformation, even allowing for a change of basis.
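Since the matrices above aren't reproduced here, the sketch below uses one hypothetical pair (my own choice) with the stated properties: both 4×4 matrices are nilpotent with eigenvalue 0 of multiplicity four, both have exactly the standard vectors e1 and e2 as eigenvectors, yet one squares to zero while the other does not, so no change of basis can turn one into the other.

```python
import numpy as np

# Hypothetical pair: S sends e3 -> e1, e4 -> e2; T sends e3 -> e1, e4 -> e3.
S = np.zeros((4, 4)); S[0, 2] = S[1, 3] = 1.0
T = np.zeros((4, 4)); T[0, 2] = T[2, 3] = 1.0

# Same eigenvalues: both have only the eigenvalue 0, with multiplicity 4.
assert np.allclose(np.linalg.eigvals(S), 0.0, atol=1e-6)
assert np.allclose(np.linalg.eigvals(T), 0.0, atol=1e-6)

# Same eigenvectors: each kernel is spanned by e1 and e2
# (columns 1 and 2 are zero, and the rank is 2).
for M in (S, T):
    assert np.linalg.matrix_rank(M) == 2
    assert np.allclose(M[:, :2], 0.0)

# But S^2 = 0 while T^2 != 0, so S and T are not similar: they cannot
# represent the same transformation in any pair of bases.
assert np.allclose(S @ S, 0.0)
assert not np.allclose(T @ T, 0.0)
```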