Here is the TLDR:
Conjecture. Fix $\lambda_*>0$ and $d \in \mathbb{N}$. Define $c_* : \mathbb{R}^{d \times d} \to \mathbb{R}^{d \times d}$ to be the orthogonal projection (in Frobenius norm) onto the set $\{M = M^T \in \mathbb{R}^{d \times d} : \forall_{v \in \mathbb{R}^d} v^T M v \ge \lambda_* \|v\|^2\}$. Let $A \in \mathbb{R}^{d \times d}$ be positive semidefinite and let $x \in \mathbb{R}^d$ be a unit vector. Then $$\left\| c_*(A+xx^T)^{-1} - c_*(A)^{-1} \right\| \le \frac{\lambda_*^{-2}}{1+\lambda_*^{-1}},$$ where $\|\cdot\|$ denotes the operator norm.
The best bound I can prove so far is (assuming $\lambda_*>1$) $$\left\| c_*(A+xx^T)^{-1} - c_*(A)^{-1} \right\| \le \frac{\lambda_*^{-2}}{1-\lambda_*^{-1}}.$$
Now I'll explain where this is coming from:
Let $A \in \mathbb{R}^{d \times d}$ be a (symmetric) positive semidefinite matrix with minimum eigenvalue $\lambda_\min =\inf_{v \in \mathbb{R}^d : \|v\|=1} v^TAv$. Let $x \in \mathbb{R}^d$ be a unit vector (in Euclidean norm).
Suppose $\lambda_\min >0$, which implies $A$ and $A+xx^T$ are both invertible. I'm interested in bounding $(A+xx^T)^{-1}-A^{-1}$ in operator norm. Using the Sherman-Morrison formula we have $$\left\|(A+xx^T)^{-1}-A^{-1}\right\| = \frac{\|A^{-1} x x^T A^{-1}\|}{1+x^T A^{-1} x} = \frac{\|A^{-1}x\|^2}{1+x^TA^{-1}x} \le \frac{\lambda_\min^{-2}}{1+\lambda_\min^{-1}},$$ where the final inequality follows by noting that the expression is maximized when $x$ is an eigenvector corresponding to the minimum eigenvalue $\lambda_\min$ -- i.e., when $A^{-1}x=x/\lambda_\min$.
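For concreteness, here is a minimal numerical sanity check of this bound (a sketch using numpy; the names `A`, `x`, and `lam_min` are mine and just mirror the notation above):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5

B = rng.standard_normal((d, d))
A = B @ B.T + 0.5 * np.eye(d)          # random symmetric positive definite A
x = rng.standard_normal(d)
x /= np.linalg.norm(x)                 # random unit vector x

lam_min = np.linalg.eigvalsh(A)[0]     # eigvalsh returns eigenvalues in ascending order

# Operator (spectral) norm of the difference of inverses vs. the bound above.
lhs = np.linalg.norm(np.linalg.inv(A + np.outer(x, x)) - np.linalg.inv(A), ord=2)
rhs = lam_min ** -2 / (1 + lam_min ** -1)
assert lhs <= rhs + 1e-12
```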
This bound is tight, but it's not very good if $\lambda_\min$ is small (or zero), and I need to control this quantity in my application. I want to fix this by forcibly ensuring $\lambda_\min \ge \lambda_*$, where $\lambda_*>0$ is some value I pick.
Define $c_* : \mathbb{R}^{d \times d} \to \mathbb{R}^{d \times d}$ as follows. Let $A = \sum_i \lambda_i v_i v_i^T$ be the eigendecomposition of $A$ (i.e., $v_1, \cdots, v_d \in \mathbb{R}^d$ are orthonormal). Then $c_*(A) = \sum_i \max\{\lambda_i,\lambda_*\} v_i v_i^T$. Equivalently, $c_*$ is the projection (in Frobenius norm) onto the set of all symmetric matrices with minimum eigenvalue $\ge \lambda_*$.
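In code, $c_*$ is just eigenvalue clipping. A minimal sketch (assuming numpy; `c_star` and `lam_star` are hypothetical names standing in for $c_*$ and $\lambda_*$):

```python
import numpy as np

def c_star(A: np.ndarray, lam_star: float) -> np.ndarray:
    """Frobenius-norm projection of the symmetric matrix A onto
    {M = M^T : M >= lam_star * I}: clip the eigenvalues at lam_star."""
    lam, V = np.linalg.eigh(A)                     # A = V @ diag(lam) @ V.T
    return (V * np.maximum(lam, lam_star)) @ V.T   # sum_i max(lam_i, lam_star) v_i v_i^T
```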
Now I want to bound $\left\| c_*(A+xx^T)^{-1} - c_*(A)^{-1} \right\|$. Ideally, I want to get the same bound as before but with the enforced minimum eigenvalue -- i.e., $\frac{\lambda_*^{-2}}{1+\lambda_*^{-1}}$. Unfortunately, I cannot apply the Sherman-Morrison formula directly because $c_*(A+xx^T) - c_*(A)$ may not be rank-1 anymore. There are generalizations of Sherman-Morrison, but I don't know how they interact with the enforced minimum eigenvalue.
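Absent a proof, one can at least search for counterexamples numerically. A rough sketch (reusing the hypothetical `c_star` helper from above; the distribution over $A$ is an arbitrary choice of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
d, lam_star, trials = 4, 0.1, 10_000
conjectured = lam_star ** -2 / (1 + lam_star ** -1)

worst = 0.0
for _ in range(trials):
    B = rng.standard_normal((d, d)) * rng.uniform(0, 2)
    A = B @ B.T                                   # PSD, possibly with tiny eigenvalues
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)
    diff = np.linalg.inv(c_star(A + np.outer(x, x), lam_star)) \
         - np.linalg.inv(c_star(A, lam_star))
    worst = max(worst, np.linalg.norm(diff, ord=2))

print(f"largest observed value: {worst:.6f}  vs conjectured bound: {conjectured:.6f}")
```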
As a footnote, another way to enforce a minimum eigenvalue is to use $A+\lambda_*I$ instead of $c_*(A)$. In that case the Sherman-Morrison formula still applies and gives the desired bound. However, this doesn't work well in my application: often $\lambda_\min \ge \lambda_*$ already, in which case I don't want to perturb $A$ at all, and indeed $c_*(A)=A$. This nice property breaks if we add $\lambda_* I$ instead.
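To illustrate the difference (again using the hypothetical `c_star` from above):

```python
import numpy as np

lam_star = 1.0
A = np.diag([2.0, 3.0])                              # lam_min(A) = 2 >= lam_star

assert np.allclose(c_star(A, lam_star), A)           # c_*(A) = A: no perturbation
assert not np.allclose(A + lam_star * np.eye(2), A)  # A + lam_* I always moves A
```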