6

Given that $C=I+A+A^2+A^3+\ldots$, prove that $I-A$ is the inverse of $C$.

Hint: Use the infinite series technique for finding inverse of a matrix.

Now I know that for an infinite geometric series with a limiting sum:

$A+AR+AR^2+AR^3+...=A/(1-R)$ where $|R|<1$

And if we let $A=1$ and $R=x$ then the above expression becomes:

$1+x+x^2+x^3+...=1/(1-x)=(1-x)^{-1}$

This suggests that for matrices we can develop a corresponding formula, provided that $A$ is “sufficiently small”:

$I+A+A^2+A^3+...=(I-A)^{-1}$

This means that if we are asked to find the inverse of a matrix B, we find A such that $B = I - A$. We can then approximate $B^{-1}$ by summing as many terms as we like of the geometric series $I+A+A^2+A^3+...$
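To convince myself numerically, here is a rough sketch in Python/NumPy (the matrix `B`, the number of terms, and the helper name `partial_sum_inverse` are just illustrative choices, not part of the problem):

```python
import numpy as np

# Illustrative only: B is chosen so that A = I - B is "sufficiently small",
# i.e. its spectral norm is below 1.
B = np.array([[1.2, 0.3],
              [0.1, 0.9]])
I = np.eye(2)
A = I - B
assert np.linalg.norm(A, 2) < 1   # the "sufficiently small" condition

def partial_sum_inverse(A, n_terms):
    """Approximate (I - A)^{-1} by the partial sum I + A + ... + A^{n_terms}."""
    S = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for _ in range(n_terms):
        term = term @ A           # next power of A
        S += term
    return S

approx = partial_sum_inverse(A, 50)
print(np.allclose(approx, np.linalg.inv(B)))   # True: partial sums approach B^{-1}
```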

But we're dealing with matrices, so I don't know how to go about proving this. I appreciate any help, guys, but it'd be great if there were a way to prove it with minimal series/sequences knowledge. Thanks.

Fraïssé
  • 11,275
Ali
  • 71

4 Answers

11

This only works if the series converges.

If $\|A\|<1$ and $\|\cdot\|$ is a submultiplicative norm, then the result follows from a geometric series approach.

We have $\left\| \sum_{k=0}^n A^k \right\| \le \sum_{k=0}^n \|A\|^k \le \frac{1}{1 - \|A\|}$, and since the scalar series $\sum_k \|A\|^k$ converges, the series $\sum_k A^k$ converges absolutely, hence converges.

We have $(I-A) \sum_{k=0}^n A^k = I-A^{n+1}$ and $\|A^{n+1}\| \le \|A\|^{n+1} \to 0$, hence letting $n \to \infty$, we have $(I-A) C = (I-A) \sum_{k=0}^\infty A^k = I$.
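As a numerical sanity check (not a substitute for the argument above; the random matrix, the rescaling to spectral norm $0.5$, and the number of terms are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
A *= 0.5 / np.linalg.norm(A, 2)   # rescale so that ||A||_2 = 0.5 < 1
I = np.eye(4)

S = np.eye(4)                     # partial sum  sum_{k=0}^{n} A^k
power = np.eye(4)                 # current power A^n
for n in range(1, 60):
    power = power @ A
    S += power
    # telescoping identity: (I - A) * (partial sum) = I - A^{n+1}
    assert np.allclose((I - A) @ S, I - power @ A)

print(np.linalg.norm(power @ A, 2))      # ~0.5^60: A^{n+1} -> 0
print(np.allclose((I - A) @ S, I))       # True: (I - A) C ≈ I
```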

copper.hat
  • 172,524
3

Hint: What is $AC$? What happens if you subtract $AC$ from $C$? Note that $C-AC=(I-A)C$.
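Spelled out (the term-by-term manipulation is formal here; it is justified once the series is known to converge, as in the answer above): $AC = A + A^2 + A^3 + ...$, so $C - AC = I$, i.e. $(I-A)C = I$; the same computation with the factors reversed gives $C(I-A) = I$.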

N. S.
  • 132,525
1

Okay, I'm not sure if this can be called a 'proof', but here is what I have tried. Correct me if I am wrong anywhere, guys. Thanks:

Take the sum of an infinite geometric series; let's call it $S$, where the initial term is $a$ and the common ratio between terms is $r$. Hence we have $S=a+ar+ar^2+ar^3+...$

Therefore: $rS=ar+ar^2+ar^3+ar^4+...$

Therefore: $S-rS=a$, i.e. $S=\frac{a}{1-r}$

Similarly, in $C=I+A+A^2+A^3+...$ we have the initial term $I$ and the common ratio between terms $A$.

Therefore, applying the earlier derived formula, we get $C=(I-A)^{-1}$ (which is only well defined if the determinant of $I-A$ is not zero).

Therefore, if $C=(I-A)^{-1}$ then the inverse of $C$, $C^{-1}$, is $((I-A)^{-1})^{-1}=I-A$, as required.
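To back this last step up numerically, here is a rough Python/NumPy sketch (the matrix $A$ and the number of terms are arbitrary illustrative choices, with the spectral norm of $A$ below 1 so the series converges):

```python
import numpy as np

# Illustrative matrix with spectral norm < 1.
A = np.array([[0.2, -0.1],
              [0.3,  0.4]])
assert np.linalg.norm(A, 2) < 1

# C approximated by the partial sum I + A + ... + A^60.
C = np.eye(2)
term = np.eye(2)
for _ in range(60):
    term = term @ A
    C += term

# Inverting the (approximate) sum recovers I - A, i.e. C^{-1} = I - A.
print(np.allclose(np.linalg.inv(C), np.eye(2) - A))   # True
```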

Ali
  • 71
0

If $\|A\|<1$ then the series $C=I+A+A^2+A^3+...$ converges, and its sum satisfies $C=(I-A)^{-1}$, i.e. $I-A$ is the inverse of $C$.

It is kind of like a geometric series.