Definition
Let $d \in \mathbb{N}$ and let $A \in \mathbb{R}^{d\times d}$ be a PSD matrix. Define the operator $T:\mathbb{R}_{+} \times \mathbb{R}^{d\times d} \to \mathbb{R}^{d\times d}$ as follows. Let $\{u_1,\dots,u_d\} \subset \mathbb{R}^d$ be an orthonormal set of eigenvectors of $A$ with corresponding eigenvalues $\{\lambda_1,\dots,\lambda_d\}\subset \mathbb{R}$. Then \begin{equation} T_{\lambda_0}(A) = \sum_{i=1}^{d} \max\{\lambda_0,\lambda_i\}\, u_i u_i^T. \end{equation}
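For concreteness, here is a minimal NumPy sketch of how one might compute $T_{\lambda_0}$ via an eigendecomposition (the name `clip_eigenvalues` is just my choice for illustration):

```python
import numpy as np

def clip_eigenvalues(A, lam0):
    """Compute T_{lam0}(A): clip the eigenvalues of the symmetric PSD matrix A at lam0 from below."""
    # eigh returns eigenvalues in ascending order and orthonormal eigenvectors (columns of U)
    eigvals, U = np.linalg.eigh(A)
    clipped = np.maximum(eigvals, lam0)
    # Reassemble sum_i max{lam0, lambda_i} u_i u_i^T
    return (U * clipped) @ U.T
```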
It is not difficult to see that:
\begin{equation} T_{\lambda_0}(A) = \arg\min_{\hat{A}\in \mathcal{S}_{+}^{d} : ~\forall x \in \mathbb{R}^d ~ x^T \hat{A} x \ge \lambda_0 \|x\|_2^2}\|\hat{A}-A\|_F, \end{equation} where $\|\cdot\|_F$ is the Frobenius norm.
In other words, $T_{\lambda_0}(A)$ is the Frobenius-norm projection of $A$ onto the set of PSD matrices whose smallest eigenvalue is at least $\lambda_0$, i.e. the set $\{\hat{A} : \hat{A} \succeq \lambda_0 I\}$.
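One way to verify this characterization (a sketch, using the Hoffman–Wielandt inequality): let $\hat{A}$ be feasible with eigenvalues $\mu_1 \ge \dots \ge \mu_d \ge \lambda_0$, and order the eigenvalues of $A$ as $\lambda_1 \ge \dots \ge \lambda_d$. Then
\begin{equation}
\|\hat{A}-A\|_F^2 \;\ge\; \sum_{i=1}^{d} (\mu_i - \lambda_i)^2 \;\ge\; \sum_{i=1}^{d} \big(\max\{\lambda_0,\lambda_i\} - \lambda_i\big)^2 \;=\; \|T_{\lambda_0}(A)-A\|_F^2,
\end{equation}
where the first inequality is Hoffman–Wielandt and the second uses $\mu_i \ge \lambda_0$. The bound is attained by $\hat{A} = T_{\lambda_0}(A)$, which is feasible since its eigenvalues are $\max\{\lambda_0,\lambda_i\} \ge \lambda_0$.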
Conjecture
For every $\lambda_0>0$ and all PSD matrices $A, B \in \mathbb{R}^{d \times d}$, we have \begin{equation} \| T_{\lambda_0}(A+B) - T_{\lambda_0}(A) \| \leq \| B \|, \end{equation} where $\|\cdot\|$ denotes the $\ell_2 \to \ell_2$ operator norm (the spectral norm). Notice that we do not use the Frobenius norm here.
I have numerically verified this conjecture on random draws of $A$ and $B$, but I do not have a proof. Please let me know if you have any hints!
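In case it is useful, the kind of random check I have in mind looks roughly like the following sketch (Wishart-style draws $G G^T$ for $A$ and $B$ are just one convenient choice; `clip_eigenvalues` is the helper defined above):

```python
import numpy as np

def clip_eigenvalues(A, lam0):
    # T_{lam0}(A) via eigendecomposition
    eigvals, U = np.linalg.eigh(A)
    return (U * np.maximum(eigvals, lam0)) @ U.T

def random_psd(d, rng):
    # Random PSD matrix of the form G G^T
    G = rng.standard_normal((d, d))
    return G @ G.T

rng = np.random.default_rng(0)
d, lam0 = 5, 0.5
worst_ratio = 0.0
for _ in range(10_000):
    A, B = random_psd(d, rng), random_psd(d, rng)
    lhs = np.linalg.norm(clip_eigenvalues(A + B, lam0) - clip_eigenvalues(A, lam0), ord=2)
    rhs = np.linalg.norm(B, ord=2)  # spectral norm of B
    worst_ratio = max(worst_ratio, lhs / rhs)

print("largest observed ratio lhs/rhs:", worst_ratio)  # the conjecture predicts this stays <= 1
```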