
This question had a great answer for efficiently updating the matrix expression

$$(A+gB)^{-1}$$

for new values of the scalar $g$. I have a similar expression, only with an additional $g^2C$ term that I would like to efficiently update for new values of $g$.

$$(A+gB+g^2C)^{-1}$$

My immediate thought was to make the substitution $B_1 = B + gC$ and use the same tactic in a chain, i.e.:

$$(A+gB_1)^{-1} = B_1^{-1} P_1(D_1+gI)^{-1}P_1^{-1}$$

where $P_1$ and $D_1$ come from the eigenvalue decomposition

$$AB_1^{-1} = P_1 D_1 P_1^{-1}$$

and

$$B_1^{-1} = (B + gC)^{-1} = C^{-1}P(D+gI)^{-1}P^{-1}$$

where $P$ and $D$ come from the eigenvalue decomposition

$$BC^{-1} = P D P^{-1}$$

This ended up not being a great route to go down, because $B_1 = B + gC$ itself depends on $g$, so it requires recomputing the eigenvalue decomposition of $AB_1^{-1}$ for every new value of $g$.
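For context, the single-term update from the linked question does work well on its own. A minimal numerical sketch (NumPy, with random test matrices of my own choosing) of the one-time eigendecomposition followed by a cheap per-$g$ diagonal inversion:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + n * np.eye(n)  # shift to keep B invertible

# One-time work: eigendecomposition of A B^{-1} = P D P^{-1}
Binv = np.linalg.inv(B)
D, P = np.linalg.eig(A @ Binv)
Pinv = np.linalg.inv(P)

# Per-g work: only a diagonal inversion, since
# (A + gB)^{-1} = B^{-1} P (D + gI)^{-1} P^{-1}
g = 0.7
fast = Binv @ P @ np.diag(1.0 / (D + g)) @ Pinv
direct = np.linalg.inv(A + g * B)
assert np.allclose(fast, direct)
```

The problem is purely that the $g^2C$ term drags $g$ into the matrix being decomposed.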

Is there another approach that might be more fruitful, or am I just trying to chase something that isn't there?


EDIT:

$A$, $B$, and $C$ are all symmetric and complex. $C$ is invertible, but $A$ and $B$ aren't necessarily. If you have a solution that needs to violate any of these constraints, I would still definitely be interested in hearing it.

Jonnie
  • Do you really need to invert this matrix, or do you want to solve an equation system? The solution in the other question suggests explicitly calculating $B^{-1}$ and then doing a full eigenvalue decomposition of $AB^{-1}$. So you need to compute all eigenvectors, which is a huge chunk of work. So solving for $m$ different $g$'s directly might be a lot faster (e.g. if $m<n$, with $n$ the dimension of the matrices). – P. Siehr Sep 07 '17 at 11:36
  • @p-siehr good point, I probably should have said something about the size of the matrices, the number of $g$s, and why I'm looking for an inverse rather than solving the system of equations. The size of the matrices tend to be around 175x175 and the number of $g$s is in the range of 10,000-20,000. So there is quite a bit to be gained from doing the heavy work of the eigenvalue decomposition up front so I only need to invert a diagonal matrix for each successive $g$ value. – Jonnie Sep 07 '17 at 13:16

1 Answer


Let $P = \begin{bmatrix}C & 0\\0 & I\end{bmatrix}$ and $Q = \begin{bmatrix}B & A\\-I & 0\end{bmatrix}$, and consider the linear system

$$(gP+Q)\begin{bmatrix}x_1\\x_2\end{bmatrix} = \begin{bmatrix}y\\0\end{bmatrix} \tag{1}$$

or equivalently

$$\begin{bmatrix}gCx_1+Bx_1 + Ax_2\\gx_2-x_1\end{bmatrix} = \begin{bmatrix}y\\0\end{bmatrix}$$

The second row gives $x_1 = gx_2$; substituting this into the first row yields

$$(g^2C+gB + A)x_2 = y \tag{2}$$

Therefore, in order to solve (2) we may instead solve (1), and we already know how to solve (1) efficiently for many different values of $g$.
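One way to make this concrete (a sketch with random test matrices of my own choosing, not part of the original answer): compute the generalized Schur (QZ) decomposition of the pencil $(P,Q)$ once, so that each new $g$ costs only one triangular solve.

```python
import numpy as np
from scipy.linalg import qz, solve_triangular

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n)) + n * np.eye(n)  # shift to keep C invertible
I, Z0 = np.eye(n), np.zeros((n, n))

# Block matrices from the answer
Pb = np.block([[C, Z0], [Z0, I]])
Qb = np.block([[B, A], [-I, Z0]])

# One-time work: QZ decomposition Pb = U S V^H, Qb = U T V^H
# with S, T upper triangular and U, V unitary, so that
# g*Pb + Qb = U (g*S + T) V^H for every g.
S, T, U, V = qz(Pb, Qb, output='complex')

y = rng.standard_normal(n)
b = np.concatenate([y, np.zeros(n)])

# Per-g work: one triangular solve
for g in (0.3, 1.5):
    x = V @ solve_triangular(g * S + T, U.conj().T @ b)
    x2 = x[n:]  # x2 solves (g^2 C + g B + A) x2 = y; x1 = g*x2
    assert np.allclose((A + g * B + g**2 * C) @ x2, y)
```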

Pawel Kowal
  • 2,252
  • "We already know how to solve (1) efficiently for different $g$." How is (1) efficient in contrast to (2) for varying $g$? – Algebraic Pavel Sep 09 '17 at 00:00
  • @Algebraic Pavel For example, by performing a generalized Schur decomposition of $(P,Q)$. Then in order to solve (1) for a different $g$ one only needs to solve a triangular linear system, whereas (2) requires recomputing an LU decomposition. Assuming that $A$, $C$ are positive definite we may even reduce (2) to (1) with positive definite $P$ and symmetric $Q$. Then we may perform a symmetric generalized decomposition, and in order to solve (1) for a different $g$ we only need to solve a linear system with a diagonal matrix. – Pawel Kowal Sep 09 '17 at 07:54
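The symmetric-definite variant in this comment can also be sketched. One concrete symmetric linearization (my own choice, not spelled out in the comment) is $P_s = \begin{bmatrix}C & 0\\0 & -A\end{bmatrix}$, $Q_s = \begin{bmatrix}B & A\\A & 0\end{bmatrix}$, which reproduces (2) exactly and makes $P_s$ positive definite when $C$ is positive definite and $A$ is negative definite. After one generalized symmetric eigendecomposition, each new $g$ costs only a diagonal solve:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)
n = 4
# Hypothetical symmetric-definite instance: C pos. def., A neg. def., B symmetric
M = rng.standard_normal((n, n)); C = M @ M.T + n * np.eye(n)
M = rng.standard_normal((n, n)); A = -(M @ M.T + n * np.eye(n))
B = rng.standard_normal((n, n)); B = B + B.T
Z0 = np.zeros((n, n))

# Symmetric linearization: row 2 gives A x1 - g A x2 = 0, i.e. x1 = g x2,
# and row 1 then reads (g^2 C + g B + A) x2 = y.
Ps = np.block([[C, Z0], [Z0, -A]])
Qs = np.block([[B, A], [A, Z0]])

# One-time work: Qs V = Ps V diag(w), normalized so V.T @ Ps @ V = I,
# hence (g*Ps + Qs)^{-1} = V (gI + diag(w))^{-1} V.T
w, V = eigh(Qs, Ps)

y = rng.standard_normal(n)
b = np.concatenate([y, np.zeros(n)])

# Per-g work: one diagonal solve
for g in (0.2, 2.0):
    x = V @ ((V.T @ b) / (g + w))
    x2 = x[n:]
    assert np.allclose((A + g * B + g**2 * C) @ x2, y)
```

Note that the question's $A$ and $B$ are complex symmetric rather than Hermitian, so this definite-pencil shortcut only applies in the special case the comment hypothesizes; the QZ route works without it.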