
If $A$ is a square matrix with eigenvalues $\lambda_1,\dots,\lambda_n$ and $f$ is any polynomial, show that the matrix $f(A)$ has the eigenvalues $f(\lambda_i)$, $i=1,\dots, n$.

I'm not seeing how to show this. First, I am confused by the notation $f(A)$: if $f$ is any polynomial, is $f(A) = f(a_{ij})$, i.e. $f$ applied entrywise? Even so, I am not sure how to get $f(A)x = f(\lambda)x$ from $Ax = \lambda x$.

asked by Burgundy

Comments:

  • If, e.g., $f(z) = z^2 + 3z$ then $f(A) = A^2 + 3A$ (see the numerical sketch after these comments). Note that $f(z) - f(\lambda_i) = q(z)(z - \lambda_i)$ for some polynomial $q$. – Friedrich Philipp Apr 02 '17 at 22:28
  • Possible duplicate; see http://math.stackexchange.com/questions/492070/how-to-prove-eigenvalues-of-polynomial-of-matrix-a-polynomial-of-eigenvalue – Justine Apr 02 '17 at 23:07
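To make the first comment's example concrete, here is a minimal numerical sketch; the particular matrix and the use of NumPy are illustrative assumptions, not part of the question. With $f(z) = z^2 + 3z$, the eigenvalues of $f(A) = A^2 + 3A$ match $\lambda^2 + 3\lambda$ for each eigenvalue $\lambda$ of $A$:

```python
import numpy as np

# Illustrative 2x2 matrix (an assumed example, not from the question).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# f(z) = z^2 + 3z, so f(A) = A^2 + 3A -- matrix products, not entrywise powers.
fA = A @ A + 3 * A

lam = np.linalg.eigvals(A)
print(np.sort(np.linalg.eigvals(fA)))   # eigenvalues of f(A)
print(np.sort(lam**2 + 3 * lam))        # f applied to each eigenvalue of A
```

Both lines print the same values (up to floating-point error), which also shows why $f(A)$ is not $f(a_{ij})$: the entrywise interpretation would give a different matrix entirely.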

1 Answer


Since $f$ is a polynomial, we can write $f(z) = \sum_{j=0}^{m} a_{j} z^{j}$ for some $m \in \mathbb{N}$ and constants $\{ a_{j} \}_{j=0}^{m}$, so that $f(A) = \sum_{j=0}^{m} a_{j} A^{j}$, with the convention $A^{0} = I$.

If $\lambda$ is an eigenvalue of $A$ and $\mathbf{v}$ is a corresponding eigenvector, then $A\mathbf{v} = \lambda \mathbf{v}$. Notice then that: $$ f(A) \mathbf{v} = \left( \sum_{j=0}^{m} a_{j} A^{j} \right) \mathbf{v} = \sum_{j=0}^{m} a_{j} \left( A^{j} \mathbf{v} \right) = \sum_{j=0}^{m} a_{j} \left( \lambda^{j} \mathbf{v} \right) = \left( \sum_{j=0}^{m} a_{j} \lambda^{j} \right) \mathbf{v} = f(\lambda) \mathbf{v} $$

This means that $\mathbf{v}$ is an eigenvector of the matrix $f(A)$ (it is nonzero, being an eigenvector of $A$), and that the corresponding eigenvalue is $f(\lambda)$.
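As a quick sanity check of the identity $f(A)\mathbf{v} = f(\lambda)\mathbf{v}$, here is a small sketch; the matrix and the polynomial $f(z) = 2 + z + 5z^{2}$ are assumptions chosen for illustration:

```python
import numpy as np

# Assumed example: f(z) = 2 + z + 5 z^2 and a 2x2 matrix with eigenvalues 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, V = np.linalg.eig(A)              # columns of V are eigenvectors of A
v = V[:, 0]                            # eigenvector belonging to lam[0]

fA = 2 * np.eye(2) + A + 5 * (A @ A)   # f(A), with the identity standing in for A^0
print(np.allclose(fA @ v, (2 + lam[0] + 5 * lam[0]**2) * v))   # True
```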

NOTE: Since $A\mathbf{v} = \lambda \mathbf{v}$, we have $A^{2} \mathbf{v} = A(A\mathbf{v}) = A ( \lambda \mathbf{v} ) = \lambda A \mathbf{v} = \lambda^{2} \mathbf{v}$, and by induction $A^{j} \mathbf{v} = \lambda^{j} \mathbf{v}$ for every $j \geq 0$ (the case $j = 0$ being $I\mathbf{v} = \mathbf{v}$).
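The same check can be run for an arbitrary polynomial by building $f(A) = \sum_{j} a_{j} A^{j}$ directly; the coefficients and the companion-style matrix below are illustrative choices, not part of the answer:

```python
import numpy as np

# Assumed polynomial f(z) = 1 - 2z + 0.5 z^2 + 4 z^3.
coeffs = [1.0, -2.0, 0.5, 4.0]

# Companion-style matrix of (z - 1)(z - 2)(z - 3), so its eigenvalues are 1, 2, 3.
A = np.array([[0.0,   1.0, 0.0],
              [0.0,   0.0, 1.0],
              [6.0, -11.0, 6.0]])

# f(A) = sum_j a_j A^j, with A^0 = I supplied by matrix_power(A, 0).
fA = sum(a * np.linalg.matrix_power(A, j) for j, a in enumerate(coeffs))

lam = np.linalg.eigvals(A)
f_lam = sum(a * lam**j for j, a in enumerate(coeffs))

print(np.sort(np.linalg.eigvals(fA)))   # eigenvalues of f(A)
print(np.sort(f_lam))                   # f(lambda_i) for each eigenvalue of A
```

The two printed lists agree, matching the statement that the eigenvalues of $f(A)$ are exactly $f(\lambda_i)$, $i = 1, \dots, n$.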