3

I know that a matrix is diagonalizable if and only if it has a full set of linearly independent eigenvectors. However, I am unable to prove the theorem in the title.

  • 2
    Hint: Show that eigenvectors for distinct eigenvalues are linearly independent. Then write the matrix in the basis of eigenvectors. – Jair Taylor Jul 10 '19 at 20:18

3 Answers

1

It is clear that the eigenvalues of this matrix are its diagonal entries. To see this, consider the matrix $\lambda I-A$, which is also triangular, so its determinant is the product of its diagonal entries.
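As a small illustration (a $3\times 3$ example, not part of the original answer, with $*$ denoting arbitrary entries):

```latex
% Characteristic polynomial of a 3x3 upper triangular matrix
\det(\lambda I - A) =
\det\begin{pmatrix}
\lambda - a_1 & * & * \\
0 & \lambda - a_2 & * \\
0 & 0 & \lambda - a_3
\end{pmatrix}
= (\lambda - a_1)(\lambda - a_2)(\lambda - a_3)
```

so the eigenvalues are exactly the diagonal entries $a_1, a_2, a_3$.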

Now this matrix has $n$ distinct eigenvalues, each with a corresponding eigenvector, giving $n$ eigenvectors in total. Their linear independence can be proved using the linearity of matrix multiplication.

Book
  • 625
1

To clarify: since the matrix is upper triangular, it is already in Schur form, and thus its eigenvalues are on the diagonal. Every square matrix $A$ is unitarily similar to an upper triangular matrix $U$ with the eigenvalues on the diagonal, i.e. $A = QUQ^*$ (with $Q^* = Q^T$ for real orthogonal $Q$), which is known as the Schur form of the matrix. Then, as mentioned in another comment, you just need to prove that eigenvectors of distinct eigenvalues are linearly independent (see here: How to prove that eigenvectors from different eigenvalues are linearly independent). Diagonalizability follows from this.
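For reference, here is a sketch of that independence argument in the two-eigenvector case (the general case follows by induction on the number of eigenvectors):

```latex
% Suppose a dependence relation among eigenvectors for distinct eigenvalues:
\alpha_1 v_1 + \alpha_2 v_2 = 0, \qquad A v_i = \lambda_i v_i, \qquad \lambda_1 \neq \lambda_2 .
% Apply A to both sides:
\alpha_1 \lambda_1 v_1 + \alpha_2 \lambda_2 v_2 = 0 .
% Subtract \lambda_2 times the first relation:
\alpha_1 (\lambda_1 - \lambda_2) v_1 = 0
\;\implies\; \alpha_1 = 0 \;\implies\; \alpha_2 = 0 .
```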

  • Hey Zach, I don't understand what T represents in the best proof for the question in the link you gave me. – Sanjay Chintapally Jul 10 '19 at 21:17
  • Sorry about that Sanjay, perhaps I should have written the proof in my own words. $T$ is the matrix with eigenvectors $v_1$ and $v_2$ and corresponding eigenvalues $\lambda_1$ and $\lambda_2$. This is why $T(\alpha_1 v_1) = \alpha_1 \lambda_1 v_1$ and $T(\alpha_2 v_2) = \alpha_2 \lambda_2 v_2$. These steps are combined in the proof in the link. I hope this helps, and let me know if you are still stuck. – Zach Favakeh Jul 10 '19 at 21:21
  • How could you prove that $T(\alpha_1 v_1) = \alpha_1 \lambda_1 v_1$? – Sanjay Chintapally Jul 10 '19 at 21:43
  • I'm assuming you mean $T$ looks like this: $[v_1 \; v_2]$, where $v_1$ and $v_2$ are 2-vectors? – Sanjay Chintapally Jul 10 '19 at 21:44
  • That's the definition of an eigenvalue/eigenvector: $Ax=\lambda x$. You can throw a scalar in front of it if you want: $T \alpha x = \alpha \lambda x$, where $\lambda$ is the eigenvalue and $x$ is the eigenvector of $T$. No, $T=[v_1 \; v_2]$ is not necessarily what I meant. I just meant that $v_1$ and $v_2$ are eigenvectors of $T$ such that $T \alpha_1 v_1 = \alpha_1 \lambda_1 v_1$ and $T \alpha_2 v_2 = \alpha_2 \lambda_2 v_2$. The lengths of $v_1$ and $v_2$ and the size of $T$ are arbitrary. – Zach Favakeh Jul 10 '19 at 21:45
  • Ah... so you mean that $\alpha_1 v_1$ is an eigenvector of the matrix $T$ whose eigenvalue is $\lambda_1$? – Sanjay Chintapally Jul 10 '19 at 21:53
  • Almost. $v_1$ itself is the eigenvector of the matrix $T$, and the eigenvalue corresponding to $v_1$ is $\lambda_1$. $\alpha_1$ is simply a constant, a scalar by which both sides of the equation may be multiplied. Remember, every nonzero scalar multiple of an eigenvector is still an eigenvector for the same eigenvalue. Usually one normalizes an eigenvector so that its 2-norm equals 1. – Zach Favakeh Jul 10 '19 at 21:56
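The identity discussed in these comments is just the linearity of matrix multiplication combined with the eigenvector equation:

```latex
T(\alpha_1 v_1) = \alpha_1 \, T v_1 = \alpha_1 \lambda_1 v_1 .
```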
-1

Suppose the matrix $A$ is a $k \times k$ upper triangular matrix with distinct diagonal entries $a_1, ~a_2, \cdots, ~a_k$.

The determinant of an upper triangular matrix is the product of its diagonal entries.

So $$f(t)= \det(A-tI)=(a_1-t)(a_2-t)\cdots(a_k-t) $$ Setting this to $~0~$, the $~k~$ eigenvalues are all distinct, each with algebraic multiplicity $~1~$.

The dimension of each eigenspace must then be $~1~$, since it is at least $~1~$ and at most the algebraic multiplicity.

So by the test for diagonalizability (the characteristic polynomial splits and, for each eigenvalue, the dimension of the eigenspace equals its algebraic multiplicity), $A$ is diagonalizable.
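This can be checked numerically. The following is a small sketch (not part of the original answer) using numpy: it builds an upper triangular matrix with distinct diagonal entries, confirms the eigenvalues are the diagonal entries, and verifies the diagonalization $A = P D P^{-1}$.

```python
# Hypothetical numerical check: an upper triangular matrix with
# distinct diagonal entries is diagonalizable.
import numpy as np

# Upper triangular with distinct diagonal entries 1, 2, 3.
A = np.array([[1.0, 4.0, 5.0],
              [0.0, 2.0, 6.0],
              [0.0, 0.0, 3.0]])

eigvals, P = np.linalg.eig(A)  # columns of P are eigenvectors
D = np.diag(eigvals)

# The eigenvalues are exactly the diagonal entries of A.
assert np.allclose(sorted(eigvals), [1.0, 2.0, 3.0])

# P is invertible (eigenvectors for distinct eigenvalues are
# linearly independent), so A = P D P^{-1}.
assert np.allclose(A, P @ D @ np.linalg.inv(P))
print("A is diagonalizable")
```

The choice of entries above the diagonal is arbitrary; only the distinctness of the diagonal entries matters for the argument.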

nmasanta
  • 9,222