
If $A$ is a non-singular $n\times n$ matrix, $B$ is an $n\times p$ matrix, and $C$ is a $p\times n$ matrix (where $1\le p \ll n$), how does one prove that the complexity of $$D=A^{-1}(BC)$$ is $\frac{8}{3}n^3+2pn^2+O(n)$ flops?

I have no idea where to start. We are told to assume that we do not compute $A^{-1}$ directly; instead, we compute the $LU$-factorization of $A$, and use $L$ and $U$ to compute the product $A^{-1}(BC)$ by the method of solving with multiple right-hand sides (whatever that means).
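For concreteness, "solving with multiple right-hand sides" can be illustrated as follows (Python with NumPy is my choice for the sketch; the question names no language, and the sizes below are made up): the matrix $A$ is factored once, and that single factorization is reused to solve $Ax = b$ for every column $b$ of $BC$, so $A^{-1}$ is never formed explicitly.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2  # illustrative sizes; in the question, 1 <= p << n

A = rng.standard_normal((n, n)) + n * np.eye(n)  # non-singular n x n
B = rng.standard_normal((n, p))
C = rng.standard_normal((p, n))

# Form the right-hand-side matrix once: BC is n x n.
BC = B @ C

# "Multiple right-hand sides": solve A X = BC in a single call.
# Internally np.linalg.solve (LAPACK gesv) factors A into LU once,
# then does one forward/back substitution per column of BC;
# A^{-1} is never computed.
D = np.linalg.solve(A, BC)

# Same result as the explicit-inverse formula, which is avoided in practice.
D_ref = np.linalg.inv(A) @ BC
assert np.allclose(D, D_ref)
```

The point of the technique is that the expensive factorization of $A$ is paid for once, after which each additional right-hand side costs only a cheap pair of triangular solves.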

I would appreciate some help with this.


1 Answer


I'm afraid you can't. As I mentioned in my response to your previous question:

The complexity of a problem is the running time of the fastest algorithm for that problem.

There exist algorithms for this problem whose running time is asymptotically much less than $n^3$. As I described in my previous answer, there are much more efficient algorithms for matrix inversion (and for matrix multiplication). I linked to algorithms for both.

Therefore, you cannot prove that the complexity is $\frac{8}{3}n^3 + \dots$ flops, because that statement is not true.

Given your pair of questions, and since you said you're not sure where to start, I would suggest you begin by reviewing our reference material here on this site.

Then, read some material on fast algorithms for matrix multiplication and other matrix operations. That'll be a starting point, at least.
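As one illustration of why the cubic bound above is not the complexity of the problem: here is a minimal recursive sketch of Strassen's matrix multiplication (Python/NumPy is my choice, not part of the original answer), which uses 7 recursive half-size products instead of 8, giving $O(n^{\log_2 7}) \approx O(n^{2.807})$ arithmetic operations. It assumes, for simplicity, square matrices whose side length is a power of two.

```python
import numpy as np

def strassen(X, Y, cutoff=32):
    """Multiply square matrices (side length a power of two) with
    Strassen's algorithm: 7 recursive multiplications per level
    instead of 8, hence O(n^log2(7)) ~ O(n^2.807) operations."""
    n = X.shape[0]
    if n <= cutoff:                  # fall back to the classical product
        return X @ Y
    h = n // 2
    A, B = X[:h, :h], X[:h, h:]      # X = [[A, B], [C, D]]
    C, D = X[h:, :h], X[h:, h:]
    E, F = Y[:h, :h], Y[:h, h:]      # Y = [[E, F], [G, H]]
    G, H = Y[h:, :h], Y[h:, h:]

    p1 = strassen(A, F - H, cutoff)
    p2 = strassen(A + B, H, cutoff)
    p3 = strassen(C + D, E, cutoff)
    p4 = strassen(D, G - E, cutoff)
    p5 = strassen(A + D, E + H, cutoff)
    p6 = strassen(B - D, G + H, cutoff)
    p7 = strassen(A - C, E + F, cutoff)

    top = np.hstack([p5 + p4 - p2 + p6, p1 + p2])
    bot = np.hstack([p3 + p4, p1 + p5 - p3 - p7])
    return np.vstack([top, bot])
```

The `cutoff` switches to the classical product on small blocks, since Strassen's recursion only pays off for large matrices; asymptotically faster (but less practical) algorithms exist beyond this one.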

If what you want is to understand the running time of LU-factorization of a matrix, that's a different question. If that's your question, you should probably research that, and then post a new question, showing what research you've done.

D.W.