I have two matrices $A$ and $B$ with quite a few notable properties.
They are both square.
They are both symmetric.
They are the same size.
$A$ has $1$'s along the diagonal and real numbers in $(0, 1)$ on the off-diagonal.
$B$ has real numbers along the diagonal and $0$'s on the off-diagonal.
So, they look like this:
$$ A= \left[\begin{matrix} 1 & b & \dots & z\\ b & 1 & \dots & y\\ \vdots & \vdots & \ddots & \vdots \\ z & y & \dots & 1 \end{matrix}\right] $$ and $$ B = \left[ \begin{matrix} \alpha & 0 & \dots & 0\\ 0 & \beta & \dots & 0\\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & \omega \end{matrix}\right] $$
I need to calculate $(A+\delta B)^{-1}$ many times, with a different value of $\delta$ each time. This can be done directly, but it may be time-consuming, depending on the number of $\delta$'s and the size of $A$ and $B$.
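For concreteness, the direct approach looks like the following sketch (the sizes, the random entries, and the $\delta$ values are just placeholders for illustration); each $\delta$ costs a full $O(n^3)$ inversion:

```python
import numpy as np

# Hypothetical instances of A and B matching the stated structure:
# A symmetric with unit diagonal, B diagonal.
rng = np.random.default_rng(2)
n = 6
X = rng.uniform(0.0, 1.0, size=(n, n))
A = (X + X.T) / 2
np.fill_diagonal(A, 1.0)
B = np.diag(rng.uniform(0.5, 2.0, size=n))

# Direct baseline: invert A + delta*B from scratch for every delta.
deltas = [0.1, 0.5, 1.0]
inverses = [np.linalg.inv(A + d * B) for d in deltas]

# Sanity check: each computed inverse times its matrix gives the identity.
print(all(np.allclose((A + d * B) @ inv, np.eye(n))
          for d, inv in zip(deltas, inverses)))
```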
If the values along the diagonal of $B$ were all $1$, it would be the identity matrix, and $A + \delta I$ would share $A$'s eigenvectors, so after one eigendecomposition of $A$ the inverse of the sum could be computed for each $\delta$ by shifting and inverting the eigenvalues. But, alas, that is not the case.
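To make the identity-matrix special case concrete, here is a minimal sketch (the random $A$ and the particular $\delta$ are hypothetical): if $A = Q\Lambda Q^T$, then $(A+\delta I)^{-1} = Q(\Lambda+\delta I)^{-1}Q^T$, so only the one-time eigendecomposition is expensive.

```python
import numpy as np

# Hypothetical symmetric A with unit diagonal and off-diagonal in (0, 1).
rng = np.random.default_rng(0)
n = 5
X = rng.uniform(0.1, 0.9, size=(n, n))
A = (X + X.T) / 2
np.fill_diagonal(A, 1.0)

# One-time cost: eigendecomposition A = Q diag(lam) Q^T.
lam, Q = np.linalg.eigh(A)

def inv_A_plus_delta_I(delta):
    # Per-delta cost: n scalar divisions plus two matrix multiplies.
    # Q / (lam + delta) scales column i of Q by 1/(lam_i + delta).
    return (Q / (lam + delta)) @ Q.T

delta = 0.7
direct = np.linalg.inv(A + delta * np.eye(n))
fast = inv_A_plus_delta_I(delta)
print(np.allclose(direct, fast))
```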
My intuition is that no such matrix algebra shortcut can exist in the scenario under consideration, but I am hopeful that someone can prove me wrong.
edit: I should have provided more information about that. What I really want is a matrix $M$ such that $MM^{T} = (A + \delta B)^{-1}$. If I can eigendecompose $A+\delta B = Q\Lambda Q^T$ quickly, then I need only take the reciprocal square roots of the eigenvalues ($n$ scalar operations) and scale the eigenvector columns by them ($n$ scalar-vector multiplications) to get $M = Q\Lambda^{-1/2}$.
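The construction of $M$ from an eigendecomposition can be sketched as follows, assuming $A + \delta B$ is positive definite so the eigenvalue square roots are real (the particular $A$, $B$, and $\delta$ below are hypothetical):

```python
import numpy as np

# Hypothetical A (symmetric, unit diagonal) and B (positive diagonal).
rng = np.random.default_rng(1)
n = 4
X = rng.uniform(0.0, 0.5, size=(n, n))
A = (X + X.T) / 2
np.fill_diagonal(A, 1.0)
B = np.diag(rng.uniform(0.5, 2.0, size=n))
delta = 0.3

S = A + delta * B
lam, Q = np.linalg.eigh(S)           # S = Q diag(lam) Q^T
assert np.all(lam > 0), "this sketch assumes S is positive definite"

# M = Q diag(lam^(-1/2)): n reciprocal square roots, n column scalings.
M = Q / np.sqrt(lam)

# Check the defining property M M^T = S^{-1}.
print(np.allclose(M @ M.T, np.linalg.inv(S)))
```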