I'm reading a paper in which the following fact is stated without proof, and I was hoping one of you smart folks could shed some light on it or provide a counterexample:
Consider an infinite-dimensional separable Hilbert space $\mathcal{H}$, and let $A$ and $L$ denote two compact linear operators on $\mathcal{H}$. Suppose further that $L$ is symmetric and positive definite, so that the spectral theorem gives
$$ L(\cdot) = \sum_{\ell=1}^\infty \lambda_\ell \langle \phi_\ell,\cdot\rangle \phi_\ell. $$
We define a pseudo-inverse of $L$ as
$$ L^{-1}\pi_n(\cdot) = \sum_{\ell=1}^n \frac{ \langle \phi_\ell,\cdot\rangle}{\lambda_\ell} \phi_\ell, $$ where $\pi_n$ is the orthogonal projection onto the span of $\phi_1,\dots,\phi_n$. The claim in the paper is that if it is assumed that $$ \sum_{\ell=1}^\infty \frac{ \|A(\phi_{\ell})\|^2}{\lambda_\ell} < \infty, $$ then
$$ \sup_{n\ge 1} \|AL^{-1}\pi_n\|_{op} < \infty. $$
Here $\|\cdot \|_{op}$ is the usual operator norm. I cannot see why this is true! The assumption looks like a bound on some trace-type norm of $AL^{-1}\pi_n$, but when I compute the squared Hilbert–Schmidt norm of $AL^{-1}\pi_n$ directly, I get $$ \|AL^{-1}\pi_n\|_{HS}^2 = \sum_{\ell=1}^n \frac{ \|A(\phi_{\ell})\|^2}{\lambda_\ell^2}. $$ Since $\|\cdot\|_{op} \le \|\cdot\|_{HS}$, a uniform bound would follow from assuming $\sum_{\ell=1}^\infty \frac{ \|A(\phi_{\ell})\|^2}{\lambda_\ell^2} < \infty$, but that is evidently a much stronger condition than the one in the paper. Am I missing something simple as to why the stated condition implies the operator norms are uniformly bounded?
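For what it's worth, here is a quick numerical probe I tried. I take a toy *diagonal* model (my own choice, not from the paper): $\phi_\ell = e_\ell$, $L e_\ell = \lambda_\ell e_\ell$ and $A e_\ell = a_\ell e_\ell$, so that $AL^{-1}\pi_n$ is diagonal with entries $a_\ell/\lambda_\ell$ for $\ell \le n$ and its operator norm is simply $\max_{\ell \le n} a_\ell/\lambda_\ell$. With $\lambda_\ell = \ell^{-2}$ and $a_\ell = \ell^{-1.6}$ the assumed series has terms $\ell^{-1.2}$ and converges, yet the ratios $a_\ell/\lambda_\ell = \ell^{0.4}$ seem to grow without bound:

```python
import numpy as np

# Toy diagonal model (my own choice of parameters, not from the paper):
# phi_l = e_l, L e_l = lambda_l e_l, A e_l = a_l e_l, so A L^{-1} pi_n
# is diagonal with entries a_l / lambda_l for l <= n, and its operator
# norm is max_{l <= n} a_l / lambda_l.
ell = np.arange(1, 2001, dtype=float)
lam = ell ** -2.0        # eigenvalues lambda_l = l^{-2} (compact, positive)
a = ell ** -1.6          # ||A(phi_l)|| = a_l = l^{-1.6} (A compact)

# The paper's assumption: sum ||A(phi_l)||^2 / lambda_l has terms l^{-1.2},
# so the partial sums level off (the series converges).
assumption_sum = np.sum(a**2 / lam)
print("sum ||A phi_l||^2 / lambda_l (first 2000 terms):", assumption_sum)

# Operator norm of A L^{-1} pi_n in this diagonal model: max ratio a_l/lambda_l.
ratios = a / lam         # = l^{0.4}
for n in (10, 100, 1000, 2000):
    print("n =", n, "  ||A L^{-1} pi_n||_op =", ratios[:n].max())
```

Unless I have set this up wrong, the operator norms in this toy model grow like $n^{0.4}$ even though the assumed series converges, which makes me suspect the paper is using some extra hypothesis I'm not seeing, or that I'm misreading the claim.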