Given a real symmetric matrix $A$, say we already know its spectral decomposition $A = UDU^T$. We wish to compute the spectral decomposition of $A+E$, where $E$ is a small matrix (I am particularly interested in the case where $E$ is nonzero in only one entry).
I know this places us squarely in the domain of perturbation theory, which is of course a richly studied topic. However, I am interested in an algorithm that solves this numerically and efficiently, and which I could easily implement or find prepackaged in MATLAB or Python. I am sure such a thing exists, but I am having trouble finding it. Can anyone provide me with a reference to such an algorithm?
For context: I need to compute the eigenvectors of $A$, perturb it slightly, recompute the eigenvectors, and repeat many times. Recomputing the eigenvectors from scratch each time is slow and seems wasteful.
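Here is a minimal sketch (Python/NumPy, with a made-up size and perturbation, and a random matrix as a placeholder for my actual $A$) of the loop I am currently running, where everything is recomputed from scratch; the perturbed entry is placed on the diagonal here just so the matrix stays symmetric:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Placeholder for my actual matrix: a random symmetric A.
A = rng.standard_normal((n, n))
A = (A + A.T) / 2

for step in range(100):
    # Full spectral decomposition, recomputed from scratch every iteration.
    # eigh returns eigenvalues D and orthonormal U with A = U @ np.diag(D) @ U.T.
    D, U = np.linalg.eigh(A)

    # ... use the eigenvectors U here ...

    # Perturb a single entry; kept on the diagonal so A stays symmetric.
    i = rng.integers(n)
    A[i, i] += 1e-6 * rng.standard_normal()
```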
If $E$ has only one non-zero entry, it is a rank-one update. A Netlib/LAPACK routine for this rank-one modification of the symmetric eigenproblem is linked below; I don't think it would be right to copy the code here.
http://www.netlib.org/lapack/explore-html/d2/d24/group__aux_o_t_h_e_rcomputational_ga3c4a943599132aea3ac964c08392853a.html
– Ryan Howe Dec 19 '20 at 19:58
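For reference, the standard rank-one update can be sketched directly. Writing $E = \rho vv^T$ and $z = U^Tv$, we have $A + E = U(D + \rho zz^T)U^T$; the updated eigenvalues are the roots of the secular equation $1 + \rho\sum_i z_i^2/(d_i - \lambda) = 0$, and the eigenvectors of $D + \rho zz^T$ are proportional to $(D - \lambda I)^{-1}z$. Below is a naive Python/SciPy sketch of this idea (assuming $\rho > 0$, distinct $d_i$, and no zero components of $z$); it omits the deflation and careful root-finding that the LAPACK routine performs, so it is for illustration only, not a drop-in replacement:

```python
import numpy as np
from scipy.optimize import brentq

def rank_one_update(d, U, rho, v):
    """Eigendecomposition of U @ np.diag(d) @ U.T + rho * np.outer(v, v), rho > 0.

    Naive sketch: assumes the d_i are distinct and every component of
    z = U.T @ v is non-zero (i.e. no deflation), and does none of the
    careful root-finding that the LAPACK routine performs.
    """
    order = np.argsort(d)            # work with eigenvalues in ascending order
    d, U = d[order], U[:, order]
    z = U.T @ v
    znorm2 = z @ z

    # Secular equation: its roots are the updated eigenvalues.
    def secular(lam):
        return 1.0 + rho * np.sum(z**2 / (d - lam))

    n = d.size
    # For rho > 0 the i-th root lies in (d_i, d_{i+1}); the last in (d_n, d_n + rho*||z||^2].
    upper = np.append(d[1:], d[-1] + rho * znorm2)
    new_d = np.empty(n)
    for i in range(n):
        gap = upper[i] - d[i]
        lo = d[i] + 1e-10 * gap      # stay just off the pole at d_i
        hi = upper[i] if i == n - 1 else upper[i] - 1e-10 * gap
        new_d[i] = brentq(secular, lo, hi)

    # Eigenvectors of diag(d) + rho*z z^T are proportional to (d - lambda)^(-1) * z.
    Q = z[:, None] / (d[:, None] - new_d[None, :])
    Q /= np.linalg.norm(Q, axis=0)
    return new_d, U @ Q              # eigenvalues and eigenvectors of the updated matrix
```

For the single-entry case $E = \varepsilon\, e_k e_k^T$ with $\varepsilon > 0$, this would be called as `rank_one_update(D, U, eps, e_k)`, where `e_k` is the $k$-th standard basis vector; the $\rho < 0$ case is analogous, except the root brackets lie below each $d_i$ instead of above.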