5

Question:

Let $$A = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}$$

Find all eigenvalues and eigenvectors of the matrix:

$$\sum_{n=1}^{100} A^n = A^{100} +A^{99} +...+A^2+A$$

I know that the eigenvectors of $A$ are $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$ and $\begin{bmatrix} 1 \\ -1 \end{bmatrix}$, but I do not see any connection between the sum and the eigenvectors of $A$.

  • 3
    Try evaluating $(\sum_{n=1}^{100} A^n) \begin{bmatrix} 1 \\ 1 \end{bmatrix}$ and do the same with the other eigenvector. What happens? – Giuseppe Negro Dec 11 '18 at 19:24

6 Answers

3

By linearity, given any polynomial $p$ and matrix $A$, the eigenvectors of $p(A)$ are the same as the eigenvectors of $A$, and the associated eigenvalues are $p(\lambda)$; see this question.

For instance, in this case, if $Av=\lambda v$, then $A^nv=\lambda^nv$, and $$\left(\sum_{n=1}^{100}A^n\right)v=\sum_{n=1}^{100}A^nv=\sum_{n=1}^{100}\lambda^nv=\left(\sum_{n=1}^{100}\lambda^n\right)v.$$ Thus, $v$ is an eigenvector with eigenvalue $\sum_{n=1}^{100}\lambda^n$. Here $A$ has eigenvector $v=\begin{bmatrix} 1 \\ 1 \end{bmatrix}$ with eigenvalue $\lambda=2$ and eigenvector $v=\begin{bmatrix} 1 \\ -1 \end{bmatrix}$ with eigenvalue $\lambda=0$. Writing $p(\lambda)=\sum_{n=1}^{100}\lambda^n$, the value $p(2)$ is a geometric series, so it equals $2^{101}-2$, and $p(0)$ is just zero. So $p(A)$ has eigenvector $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$ with eigenvalue $2^{101}-2$ and eigenvector $\begin{bmatrix} 1 \\ -1 \end{bmatrix}$ with eigenvalue $0$.
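As a sanity check of the final numbers, here is a short sympy computation in exact integer arithmetic (the entries reach size $2^{100}$, so floating point would round them); the variable names below are illustrative, not part of the answer.

```python
from sympy import Matrix, eye

A = Matrix([[1, 1], [1, 1]])

# Accumulate S = A + A^2 + ... + A^100 exactly.
S = Matrix.zeros(2, 2)
P = eye(2)
for _ in range(100):
    P = P * A            # P runs through A, A^2, ..., A^100
    S = S + P

evals = S.eigenvals()                     # {eigenvalue: multiplicity}
print(sorted(evals) == [0, 2**101 - 2])   # True

# The eigenvectors carried over from A:
print(S * Matrix([1, 1]) == (2**101 - 2) * Matrix([1, 1]))   # True
print(S * Matrix([1, -1]) == Matrix([0, 0]))                 # True
```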

Acccumulation
  • 12,210
2

Hint: If $$A = \begin{bmatrix} 1 & 1 \\ 1 & 1 \\ \end{bmatrix}$$then we have $$A^2 = \begin{bmatrix} 1 & 1 \\ 1 & 1 \\ \end{bmatrix}\begin{bmatrix} 1 & 1 \\ 1 & 1 \\ \end{bmatrix}=\begin{bmatrix} 2 & 2 \\ 2 & 2 \\ \end{bmatrix}\\A^3=\begin{bmatrix} 1 & 1 \\ 1 & 1 \\ \end{bmatrix}\begin{bmatrix} 2&2 \\ 2&2 \\ \end{bmatrix}=\begin{bmatrix} 4&4 \\ 4&4 \\ \end{bmatrix}\\A^4=\begin{bmatrix} 1 & 1 \\ 1 & 1 \\ \end{bmatrix}\begin{bmatrix} 4&4 \\ 4&4 \\ \end{bmatrix}=\begin{bmatrix}8&8 \\ 8&8 \\ \end{bmatrix}\\.\\.\\.\\.$$and you can prove by induction that $$A^k=\begin{bmatrix} 2^{k-1}&2^{k-1} \\ 2^{k-1}&2^{k-1}\\ \end{bmatrix}$$can you finish now?
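If it helps, the induction claim can be checked mechanically; a small sympy sketch (identifiers are mine, not the answer's):

```python
from sympy import Matrix, ones

A = Matrix([[1, 1], [1, 1]])

# A^k = 2^(k-1) * (all-ones matrix); spot-check the first few powers.
for k in range(1, 10):
    assert A**k == 2**(k - 1) * ones(2, 2)

# So every entry of the sum equals 1 + 2 + ... + 2^99 = 2^100 - 1.
print(sum(2**(k - 1) for k in range(1, 101)) == 2**100 - 1)   # True
```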

Mostafa Ayaz
  • 31,924
1

Hint:

Recall the Cayley–Hamilton theorem (quoting Wikipedia):

For a general $n\times n$ invertible matrix $A$, i.e., one with nonzero determinant, $A^{-1}$ can be written as an $(n-1)$-th order polynomial expression in $A$: the Cayley–Hamilton theorem amounts to the identity $$p(A) = A^n + c_{n-1}A^{n-1} + \dots + c_1A + (-1)^n\det(A)I_n = O.$$ The coefficients $c_i$ are given by the elementary symmetric polynomials of the eigenvalues of $A$. Using Newton's identities, the elementary symmetric polynomials can in turn be expressed in terms of the power sum symmetric polynomials of the eigenvalues: $$s_k = \sum_{i=1}^n \lambda_i^k = \operatorname{tr}(A^k).$$
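To connect the quoted theorem back to the question: for this $A$ the characteristic polynomial is $\lambda^2-2\lambda$, so Cayley–Hamilton gives $A^2-2A=O$, i.e. $A^2=2A$ and hence $A^k=2^{k-1}A$, which is the closed form used in the other answers. Below is a small sympy check of both displayed identities for this particular $A$ (names are mine, not from the answer):

```python
from sympy import Matrix, eye, zeros

A = Matrix([[1, 1], [1, 1]])

# Cayley-Hamilton for n = 2: A^2 - tr(A) A + det(A) I = O.
assert A**2 - A.trace() * A + A.det() * eye(2) == zeros(2, 2)

# Power sums of the eigenvalues (2 and 0): s_k = tr(A^k) = 2^k.
for k in range(1, 6):
    assert (A**k).trace() == 2**k
```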

Rebellos
  • 21,324
1

You can explicitly compute $\sum_{i=1}^{100}A^i$. First diagonalize $A$, that is, write $A=PDP^{-1}$.

Now \begin{align} \sum_{i=1}^{100}A^i&=\sum_{i=1}^{100}PD^iP^{-1}\\&=P\left(\sum_{i=1}^{100} D^i\right)P^{-1} \end{align}

Notice that $$(D-I)\left(\sum_{i=1}^{100}D^i\right)=D^{101}-D.$$ Since $D-I$ is invertible (you can check it), $$\sum_{i=1}^{100}D^i=(D-I)^{-1}(D^{101}-D).$$ Therefore $$\sum_{i=1}^{100}A^i=P(D-I)^{-1}(D^{101}-D)P^{-1}.$$
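If it helps, here is an exact sympy sketch of this computation (every identifier below is mine, not the answer's):

```python
from sympy import Matrix, eye

A = Matrix([[1, 1], [1, 1]])
P, D = A.diagonalize()          # A = P D P^{-1}, D diagonal with eigenvalues 0 and 2

# Closed form from the telescoping identity above.
S_closed = P * (D - eye(2)).inv() * (D**101 - D) * P.inv()

# Direct summation, for comparison.
S_direct = sum((A**i for i in range(1, 101)), Matrix.zeros(2, 2))

print(S_closed == S_direct)     # True
```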

user9077
  • 1,811
1

It is easy to prove that for $k\in \Bbb{N},$ $$A^k=\begin{bmatrix} 2^{k-1} & 2^{k-1} \\ 2^{k-1} & 2^{k-1} \\ \end{bmatrix}.$$ The sum is $$\Sigma=\begin{bmatrix} 2^{100}-1 & 2^{100}-1 \\ 2^{100}-1 & 2^{100}-1 \\ \end{bmatrix},$$ from which we get the eigenvalues $0$ and $2^{101}-2.$

Each matrix $A^k,$ $k=1,\dots,100,$ has eigenvalues $0$ and $2^k;$ the corresponding eigenvectors are those of $A:$ $(1,-1)^T$ and $(1,1)^T.$
Thus $(1,-1)^T, (1,1)^T$ are eigenvectors of $\Sigma.$
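An exact spot-check of these statements with sympy (names are mine):

```python
from sympy import Matrix

A = Matrix([[1, 1], [1, 1]])
v_plus, v_minus = Matrix([1, 1]), Matrix([1, -1])

for k in range(1, 8):
    assert A**k * v_plus == 2**k * v_plus      # eigenvalue 2^k
    assert A**k * v_minus == Matrix([0, 0])    # eigenvalue 0

# Adding the eigenvalues on (1,1)^T over k = 1..100: 2 + 4 + ... + 2^100 = 2^101 - 2, matching Sigma.
print(sum(2**k for k in range(1, 101)) == 2**101 - 2)          # True
```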

user376343
  • 8,311
0

Since you have two linearly independent eigenvectors, $A$ is diagonalizable. You may find it useful to replace $A$ in your polynomial expression by its diagonalization, because this will simplify the operations you need to do.
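For concreteness, a minimal sympy sketch of that simplification (all names below are mine): once $A=PDP^{-1}$, a polynomial $p(A)$ becomes $P\,p(D)\,P^{-1}$, and $p(D)$ only requires applying $p$ to each diagonal entry.

```python
from sympy import Matrix, diag

A = Matrix([[1, 1], [1, 1]])
P, D = A.diagonalize()          # A = P D P^{-1}

# p(D) is diagonal: apply p(x) = x + x^2 + ... + x^100 to each eigenvalue.
p_D = diag(*[sum(D[i, i]**n for n in range(1, 101)) for i in range(2)])

# p(A) = P p(D) P^{-1} matches summing the powers of A directly.
print(P * p_D * P.inv() == sum((A**n for n in range(1, 101)), Matrix.zeros(2, 2)))  # True
```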

Javi
  • 576