
This is the problem I'm dealing with:

Let $A \in \mathbb{R}^{n\times n}$, let $\sigma_1,\dots,\sigma_n \in \mathbb{R}$, and let $b_1,\dots,b_n$ be column vectors of length $n$. Consider the systems $$ (A - \sigma_jI)x_j = b_j, \quad (j=1,\dots,n).$$ Show these can be solved in $2n^3$ flops, neglecting terms of order $n^2$ and lower.

My progress: Since an LU decomposition takes $2n^3/3$ flops in general, we can't afford a fresh factorization for each of the $n$ systems — that would cost $\sim 2n^4/3$ flops. My first idea is to somehow link the LU decomposition of $A$ with that of $A - \sigma_jI$, via the relationship $$ LU - \sigma_j I = L_jU_j, $$ where $L_jU_j$ denotes the LU decomposition of $A - \sigma_j I$. This is where I'm stuck. What we have is the product of $L$ and $U$, which makes it difficult to derive an expression for either factor of the shifted matrix. If $L$ and $U$ are given, how can we find $L_j$ and $U_j$ without doing another decomposition?
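To see concretely why the shift can't just be pushed into one factor, here is a small check of my own (a $2\times 2$ Doolittle factorization written out by hand; not part of the exercise): shifting $A$ changes $L$ itself, not only $U$.

```python
def lu_2x2(a, b, c, d):
    """Doolittle LU of [[a, b], [c, d]] with unit lower-triangular L."""
    l21 = c / a
    return [[1.0, 0.0], [l21, 1.0]], [[a, b], [0.0, d - l21 * b]]

# LU of A = [[4, 2], [2, 3]]
L, U = lu_2x2(4.0, 2.0, 2.0, 3.0)

# LU of A - sigma*I with sigma = 1
sigma = 1.0
Ls, Us = lu_2x2(4.0 - sigma, 2.0, 2.0, 3.0 - sigma)

# L[1][0] = 0.5 but Ls[1][0] = 2/3: the shift changes the L factor too,
# so L_j U_j cannot be obtained by adjusting U alone.
```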

BallzofFury
  • 1,154
  • Is anything known about $A$? Is it just real non-symmetric? – Kirill Sep 26 '14 at 10:57

2 Answers

2

If you find an orthogonal similarity transformation $A=QTQ^t$, then the problem would become $$ Q(T-\sigma_jI)Q^tx_j = b_j, $$ which reduces to solving equations with the matrix $T-\sigma_jI$.

If $T$ comes from the Schur decomposition, then it is upper-triangular (although complex), and solving $(T-\sigma_jI)y_j=c_j$ can be done with back-substitution.
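To spell out the back-substitution step, here is a minimal Python sketch (names are my own, not from any library): once $T$ is upper-triangular, each shifted solve costs only $O(n^2)$.

```python
def backsolve_shifted(T, sigma, c):
    """Solve (T - sigma*I) y = c for upper-triangular T (a list of rows)
    by back-substitution; cost is O(n^2) per shift."""
    n = len(c)
    y = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = c[i] - sum(T[i][k] * y[k] for k in range(i + 1, n))
        y[i] = s / (T[i][i] - sigma)
    return y

# Tiny check with T = [[3, 1], [0, 2]] and sigma = 1
y = backsolve_shifted([[3.0, 1.0], [0.0, 2.0]], 1.0, [5.0, 2.0])
# (T - I) y = [2*1.5 + 1*2, 2*2] = [5, 2], recovering the right-hand side
```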

You can also compute $T$ with the real Schur decomposition, which gives a real quasi-triangular matrix $T$: $T$ is zero below the first subdiagonal, and its diagonal consists of $1\times1$ and $2\times 2$ blocks. The $1\times1$ blocks correspond to real eigenvalues, and the $2\times 2$ blocks correspond to complex-conjugate pairs of eigenvalues. However, computing the real Schur decomposition costs a lot more than $2n^3$ flops, although it is still $O(n^3)$. For example, Eigen's documentation estimates a complexity of roughly $25n^3$.

Also note that the orthogonal matrix $Q$ should be kept in a form that allows you to compute $Qx$ and $Q^tx$ efficiently; forming it explicitly is not really necessary. For example, if $Q$ is a product of Householder reflections, you can keep a list of individual reflections.
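As an illustration of keeping $Q$ in factored form (the function name and storage convention below are my own): with $Q = H_1 H_2 \cdots H_k$ a product of Householder reflections $H_v = I - 2vv^t/(v^tv)$, applying $Q$ or $Q^t$ to a vector costs $O(n)$ per stored reflector, and the $n\times n$ matrix $Q$ is never formed.

```python
def apply_reflectors(vs, x, transpose=False):
    """Apply Q (or Q^t) to x, where Q = H_1 H_2 ... H_k is stored as the
    list vs of Householder vectors, H_v = I - 2 v v^t / (v^t v).
    Each reflector is symmetric, so Q^t just reverses the application order."""
    seq = vs if transpose else list(reversed(vs))
    y = list(x)
    for v in seq:
        coef = 2.0 * sum(vi * yi for vi, yi in zip(v, y)) / sum(vi * vi for vi in v)
        y = [yi - coef * vi for yi, vi in zip(y, v)]
    return y

# Single reflector v = e_1 gives H = diag(-1, 1):
# apply_reflectors([[1.0, 0.0]], [3.0, 4.0]) → [-3.0, 4.0]
```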

None of this gives $2n^3$ flops, although it is $O(n^3)$. In fact, at least in this paper and this paper, the authors are unaware of any especially efficient technique for solving shifted systems of linear equations of the kind you have here, and discuss Krylov-type methods. Krylov-type methods can take advantage of the fact that the basis of the Krylov space is invariant under shifts.
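The shift invariance is easy to verify directly: $(A-\sigma I)^k b$ expands binomially into a combination of $b, Ab, \dots, A^kb$, so the Krylov spaces of $A$ and $A-\sigma I$ coincide. A quick numerical check of my own (matrix and shift chosen arbitrarily):

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[2.0, 1.0], [0.0, 3.0]]
b = [1.0, 1.0]
sigma = 0.5

# (A - sigma*I)^2 b, computed directly
shifted = [[A[i][j] - (sigma if i == j else 0.0) for j in range(2)] for i in range(2)]
lhs = matvec(shifted, matvec(shifted, b))

# Binomial expansion: A^2 b - 2*sigma*A b + sigma^2 b,
# a linear combination of the unshifted Krylov vectors b, Ab, A^2 b
Ab = matvec(A, b)
A2b = matvec(A, Ab)
rhs = [A2b[i] - 2.0 * sigma * Ab[i] + sigma ** 2 * b[i] for i in range(2)]
# lhs and rhs agree
```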

Kirill
  • 14,494
0

Here, the LU decomposition is of no use. You must diagonalize $A$ or, at least, triangularize $A$ with an orthonormal change of basis. This can be done in $O(n^3)$. The rest is easy: $n$ solves, each in $O(n^2)$, that is, $O(n^3)$ again.
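To make the "rest is easy" step explicit: if a diagonalization $A = VDV^{-1}$ is available, each shifted solve is $x_j = V(D-\sigma_jI)^{-1}V^{-1}b_j$, i.e. two matrix-vector products plus a diagonal scaling, $O(n^2)$ per shift. A small sketch of my own (the factors here are written down by hand, not computed, so $A = VDV^{-1} = [[1,3],[0,4]]$):

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# Hand-built factors: A = V D V^{-1} with D = diag(1, 4)
V = [[1.0, 1.0], [0.0, 1.0]]
Vinv = [[1.0, -1.0], [0.0, 1.0]]
d = [1.0, 4.0]

def solve_shifted(sigma, b):
    """x = V (D - sigma*I)^{-1} V^{-1} b : O(n^2) once V, V^{-1}, d are known."""
    c = matvec(Vinv, b)                                  # O(n^2)
    y = [ci / (di - sigma) for ci, di in zip(c, d)]      # O(n)
    return matvec(V, y)                                  # O(n^2)

# With sigma = 2 and b = [1, 2]: (A - 2I) x = b is solved by x = [2, 1]
```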

Clearly, you must work in $\mathbb{C}$, using numerical approximations.

EDIT: @Kirill, a complexity of $\sim 2n^3$ seems hopeless to me. Indeed, computing the eigenvalues of a random $n\times n$ matrix costs (I think) at least $\sim 5n^3$. With Matlab on a PC, the eigenvalues of a $1000\times 1000$ random matrix are obtained in about 5 seconds; cf. fanfan's post in

https://stackoverflow.com/questions/713878/how-expensive-is-it-to-compute-the-eigenvalues-of-a-matrix

It remains to compare this with the time needed to compute $A^{-1}$, which is $O(n^3)$.