6

So, I'm reading a linear algebra book (Fraleigh's - self learning) and one of the exercises about Matrix product is to power the matrix:

$$
\left( \begin{matrix}
0 & 1 \\ -1 & 1
\end{matrix} \right)
$$

to the 2001st power. But I can't see how to do it without calculating all those products. Any tip?

I have the "feeling" that, since the $1$ only changes its sign, the 2001st power is the same as the 3rd power, but I'm not sure. Am I correct?

  • 7
    Have you tried to calculate the first few powers to see if there's a pattern? – Qiaochu Yuan Aug 26 '11 at 02:53
  • Well, indeed I have the "feeling" that, since the 1 only changes its sign, the 2001st power is the same as the 3rd power, but I'm not sure. Am I correct? – Randolf Rincón Fadul Aug 26 '11 at 02:59
  • 2
    The characteristic polynomial of the matrix is $x(x-1)+1 = x^2-x+1$. That means that $A^2 = A-I$ by Cayley-Hamilton, so $A^3 = A(A-I) = A^2-A = A-I-A= -I$. So $A^{2001} = (A^3)^{667}$, hence... – Arturo Magidin Aug 26 '11 at 03:20

3 Answers

8

Let $A$ be your matrix. It can be calculated that $A^3=-I$, where $I$ is the identity matrix. Then $A^{2001}=(A^3)^{667}=(-I)^{667}=-I$.
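As a sanity check, the claim can be verified numerically; here is a short sketch using NumPy (the library choice is mine, not part of the answer):

```python
# Numerical check of the claim: A^3 = -I, hence A^2001 = (A^3)^667 = -I.
import numpy as np

A = np.array([[0, 1],
              [-1, 1]])
I = np.eye(2, dtype=int)

print(np.linalg.matrix_power(A, 3))                         # -I
print(np.array_equal(np.linalg.matrix_power(A, 2001), -I))  # True
```

Since all intermediate powers of $A$ have entries in $\{-1,0,1\}$, the integer arithmetic here is exact.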

Shiyu
  • 5,228
6

If $A^n=I$ for a matrix $A$ and a natural number $n$, then $A^{kn+r}=A^{kn}A^r=(A^n)^kA^r=I^kA^r=A^r$. Thus, $A^l=A^m$ if $l\equiv m\pmod n$. In the present case, $A^6=I$, and thus $A^{2001}=A^3$, since $2001\equiv3\pmod6$.
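The same reduction is easy to check numerically; a small sketch with NumPy (my choice of tooling, not part of the answer):

```python
# If A^n = I, then A^m depends only on m mod n. Here A^6 = I and
# 2001 = 3 (mod 6), so A^2001 = A^3.
import numpy as np

A = np.array([[0, 1],
              [-1, 1]])

assert np.array_equal(np.linalg.matrix_power(A, 6), np.eye(2, dtype=int))
r = 2001 % 6                                  # r = 3
print(np.array_equal(np.linalg.matrix_power(A, 2001),
                     np.linalg.matrix_power(A, r)))  # True
```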

joriki
  • 238,052
5

Actually, it's not harder to compute the $n$th power of any $2$ by $2$ matrix:

Let $a,b,c,d$ be complex numbers and consider the matrix $$A=\begin{pmatrix}a&b\\ c&d\end{pmatrix}.$$ Let $n\ge2$ be an integer.

How to compute $A^n$?

Here is a recipe:

Assume first that the roots $u$ and $v$ of the polynomial $$f:=X^2-(a+d)\,X+ad-bc$$ are distinct. The equation of the secant line to the curve $y=x^n$ through the points $(u,u^n)$ and $(v,v^n)$ is $$y=\frac{u^n-v^n}{u-v}\ \ x-uv\ \ \frac{u^{n-1}-v^{n-1}}{u-v}\quad,$$ and we have $$A^n=\frac{u^n-v^n}{u-v}\ \ A-uv\ \ \frac{u^{n-1}-v^{n-1}}{u-v}\ \ I,$$ where $I$ is the identity matrix.

For each nonnegative integer $k$, put $$s_k:=u^k+u^{k-1}v+u^{k-2}v^2+\cdots+v^k.$$ [In particular $s_0=1$.] As $$s_k=\frac{u^{k+1}-v^{k+1}}{u-v}\quad,$$ we have $$A^n=s_{n-1}\,A-u\,v\,s_{n-2}\,I.$$ This formula still makes sense, and is still true, if $u=v$, in which case it reads $$A^n=n\,u^{n-1}\,A-(n-1)\,u^n\,I.$$
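For a concrete check of the recipe, here is a quick NumPy sketch with an example matrix of my own choosing (its roots $u$, $v$ are distinct):

```python
# Check A^n = s_{n-1} A - u v s_{n-2} I, with s_k = (u^{k+1} - v^{k+1})/(u - v)
# and u, v the roots of f = X^2 - (a+d) X + (ad - bc).
import numpy as np

a, b, c, d = 1.0, 2.0, 3.0, 4.0               # sample entries (my own choice)
A = np.array([[a, b], [c, d]])
n = 5

u, v = np.roots([1.0, -(a + d), a*d - b*c])   # roots of f, distinct here
s = lambda k: (u**(k + 1) - v**(k + 1)) / (u - v)
An = s(n - 1) * A - u * v * s(n - 2) * np.eye(2)

print(np.allclose(An, np.linalg.matrix_power(A, n)))  # True
```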

Why does this recipe work?

The key point is the easily checked equality $$A^2-(a+d)\,A+(ad-bc)\,I=0,\quad(1)$$ which enables us to compute $$g(A):=a_n\,A^n+\cdots+a_2\,A^2+a_1\,A+a_0\,I$$ for any polynomial $$g=a_n\,X^n+\cdots+a_2\,X^2+a_1\,X+a_0\in\mathbb C[X]$$ as follows.

Assume again $u\not=v$, and let $h\in\mathbb C[X]$ be the unique polynomial of degree $\le1$ which agrees with $g$ at $u$ and $v$: $$h=g(u)\ \frac{X-v}{u-v}+g(v)\ \frac{X-u}{v-u}\quad.$$ [In particular, the secant line to $y=g(x)$ through $(u,g(u))$ and $(v,g(v))$ is $y=h(x)$.] Then our polynomial $f$, which can be written as $$f=(X-u)(X-v),$$ divides $g-h$. That is, we have $$g(X)-h(X)=f(X)q(X)$$ for some polynomial $q$. On substituting $A$ for $X$, and remembering that $f(A)=0$ by (1), we get $g(A)=h(A)$, or $$g(A)=g(u)\ \frac{A-vI}{u-v}+g(v)\ \frac{A-uI}{v-u}\quad.$$ If $u=v$ we use the tangent line instead of the secant line: $$h:=g(u)+g'(u)\,(X-u),$$ and we get $$g(A)=g(u)\,I+g'(u)\,(A-u\,I).$$
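The interpolation identity $g(A)=h(A)$ can likewise be checked numerically; a sketch (the sample matrix and polynomial are my own choices):

```python
# Check g(A) = g(u)(A - vI)/(u - v) + g(v)(A - uI)/(v - u) for a sample
# polynomial g and a 2x2 matrix with distinct roots u, v of f.
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
I = np.eye(2)
u, v = np.roots([1.0, -np.trace(A), np.linalg.det(A)])  # roots of f

g = lambda x: x**4 - 2 * x**2 + 7      # scalar form of a sample g
gA = (np.linalg.matrix_power(A, 4)
      - 2 * np.linalg.matrix_power(A, 2) + 7 * I)

hA = g(u) * (A - v * I) / (u - v) + g(v) * (A - u * I) / (v - u)
print(np.allclose(gA, hA))  # True
```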

This is the case of $2$ by $2$ matrices. For an arbitrary square matrix, see for instance the last part of this answer.

EDIT. The $n$th power of an $r$ by $r$ diagonalizable matrix $A$ is given by the beautiful Lagrange Interpolation Formula: $$A^n=\sum_{i=1}^k\ u_i^n\ \prod_{j\not=i}\ \frac{A-u_j\,I}{u_i-u_j}\quad,$$ where the $u_i$ are the distinct eigenvalues.

Here is the justification (which is essentially the same as above). The polynomial $$f:=(X-u_1)\cdots(X-u_k)$$ satisfies $f(A)=0$. Note that $$g:=\sum_{i=1}^k\ u_i^n\ \prod_{j\not=i}\ \frac{X-u_j}{u_i-u_j}$$ is the unique polynomial of degree less than $k$ which agrees with $X^n$ on all the $u_i$. Hence, $X^n-g$ is divisible by $f$, and this implies, as above, $A^n=g(A)$.
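A numerical sketch of the Lagrange formula (the $3\times3$ example matrix is my own; it is triangular with distinct eigenvalues, hence diagonalizable):

```python
# Check A^n = sum_i u_i^n * prod_{j != i} (A - u_j I)/(u_i - u_j)
# for a diagonalizable matrix with distinct eigenvalues u_i.
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])    # triangular, eigenvalues 2, 3, 5
n = 4
eigs = np.linalg.eigvals(A)        # the distinct u_i

An = np.zeros_like(A)
for i, ui in enumerate(eigs):
    # P = prod_{j != i} (A - u_j I) / (u_i - u_j)
    P = np.eye(3)
    for j, uj in enumerate(eigs):
        if j != i:
            P = P @ (A - uj * np.eye(3)) / (ui - uj)
    An = An + ui**n * P

print(np.allclose(An, np.linalg.matrix_power(A, n)))  # True
```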

  • Wouldn't it be easier just to diagonalize the matrix and be done with it? – Christian Mann Aug 26 '11 at 06:40
  • 1
    @Christian: That's the simplest way I know to compute the $n$th power of $$\begin{pmatrix}a&b\\ c&d\end{pmatrix}$$ for arbitrary $n,a,b,c,d$. If you know a simpler one, I'd be very interested to see it. – Pierre-Yves Gaillard Aug 26 '11 at 06:57