
Homework question: given a regular (invertible) matrix $A\in M^{n\times n}$ and a linear system of equations $A\vec{x} = \vec{b}$, state the preferred solution method and its computational cost in each of the following cases:

  1. $A$ has no special properties.
  2. $A$ has positive eigenvalues.
  3. $A$ is orthogonal.

Solutions that I've been able to find so far:

  1. LU-Decomposition, $\approx \frac{2n^3}{3}$. (Source)
  2. Cholesky-Decomposition, $\approx \frac{n^3}{3}$. (Source)
  3. Orthogonal Diagonalization, depends on matrix multiplication $\approx n^{2.376}.$ (Source)
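For concreteness, here is a sketch of what those three approaches look like with NumPy/SciPy (my reading of the list above; note that the Cholesky route additionally assumes $A$ is symmetric, and the orthogonal case only needs $\vec{x} = A^{T}\vec{b}$):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve, cho_factor, cho_solve

rng = np.random.default_rng(0)
n = 4
b = rng.standard_normal(n)

# 1. No special structure: LU decomposition with partial pivoting, ~(2/3) n^3 flops.
A = rng.standard_normal((n, n))
x1 = lu_solve(lu_factor(A), b)

# 2. Positive eigenvalues *and* symmetry (i.e. SPD): Cholesky, ~(1/3) n^3 flops.
M = rng.standard_normal((n, n))
A_spd = M @ M.T + n * np.eye(n)        # guaranteed symmetric positive definite
x2 = cho_solve(cho_factor(A_spd), b)

# 3. Orthogonal: A^{-1} = A^T, so solving is one matrix-vector product, O(n^2).
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
x3 = Q.T @ b

print(np.allclose(A @ x1, b), np.allclose(A_spd @ x2, b), np.allclose(Q @ x3, b))
```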

Are these answers correct, or are there better methods / are the best methods slower than these?

Zyx
  • 786

1 Answer


The general principle is that solving linear equations is no harder than multiplying matrices, regardless of what you know about $A$ (see Bürgisser and Clausen, "Algebraic Complexity Theory", Chapter 15). So the bound is $O(n^{2+\epsilon})$ for an ever-changing value of $\epsilon$ (by ever-changing I mean the best known value improves every few years; we don't know whether $\epsilon$ can be made arbitrarily small). For example, at minimum, using vanilla Strassen you always get $O(n^{\log_2 7})$ complexity.
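As a minimal sketch of that reduction (my own illustration, not taken from the book): invert the matrix by recursing on $2\times 2$ blocks via Schur complements, so that all of the heavy lifting is matrix multiplication; whatever multiplication routine you plug in, the inversion inherits its exponent. The sketch assumes the dimension is a power of two and that every block being inverted is itself invertible (guaranteed here by using a symmetric positive definite test matrix), and it ignores stability entirely:

```python
import numpy as np

def block_inverse(A):
    """Invert A by recursing on 2x2 blocks via Schur complements.

    The only expensive operations are matrix multiplications, so the
    asymptotic cost matches whatever multiplication routine is used
    (classical, Strassen, ...). Assumes n is a power of two and every
    block that gets inverted is invertible; no pivoting, so this is an
    illustration of the complexity reduction, not a stable solver.
    """
    n = A.shape[0]
    if n == 1:
        return np.array([[1.0 / A[0, 0]]])
    k = n // 2
    A11, A12 = A[:k, :k], A[:k, k:]
    A21, A22 = A[k:, :k], A[k:, k:]
    A11_inv = block_inverse(A11)
    S = A22 - A21 @ A11_inv @ A12              # Schur complement of A11
    S_inv = block_inverse(S)
    top_left = A11_inv + A11_inv @ A12 @ S_inv @ A21 @ A11_inv
    top_right = -A11_inv @ A12 @ S_inv
    bottom_left = -S_inv @ A21 @ A11_inv
    return np.block([[top_left, top_right], [bottom_left, S_inv]])

# Solve A x = b through the block inverse (SPD test matrix so every
# Schur complement along the recursion is invertible).
rng = np.random.default_rng(0)
n = 8
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
b = rng.standard_normal(n)
x = block_inverse(A) @ b
print(np.allclose(A @ x, b))    # True
```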

But to be honest, a lot of that is not quite accurate in practice. Missing from this is the issue of numerical stability: if you solve problems over the reals, floating-point precision becomes a problem. So the complexity of "practical" algorithms varies widely.
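A small experiment that makes the stability point concrete (my own illustration): even LAPACK's LU solver with partial pivoting, which is backward stable, loses most of its digits in the computed solution on the notoriously ill-conditioned Hilbert matrix.

```python
import numpy as np

# Hilbert matrices H[i, j] = 1 / (i + j + 1) are notoriously ill-conditioned:
# the computed solution loses roughly log10(cond(H)) digits even though the
# solver itself (LAPACK LU with partial pivoting) is backward stable.
for n in (5, 8, 12):
    H = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
    x_true = np.ones(n)
    b = H @ x_true
    x = np.linalg.solve(H, b)
    err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
    print(f"n={n:2d}  cond={np.linalg.cond(H):.1e}  relative error={err:.1e}")
```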

Algeboy
  • 747