It is well known (see here, for example) that we have
$$ \psi\left(\frac{1}{2},T\right)=\sum_{p\leq T}\frac{1}{p}=\log(\log T)+A+O\left(\frac{1}{\log T}\right), $$ where $\psi(\sigma,T)=\sum_{p\leq T}p^{-2\sigma}$, $p$ denotes a prime number, and $A$ is some constant.
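As a quick numerical sanity check of this asymptotic, here is a short sketch in Python. The cutoffs are arbitrary, and I'm taking $A$ to be the Meissel–Mertens constant (my identification; the statement above only calls it "some constant"):

```python
from math import log
from sympy import primerange

# Compare psi(1/2, T) = sum_{p <= T} 1/p against log(log T) + A,
# with A taken to be the Meissel-Mertens constant (truncated).
A = 0.2614972128

for T in (10**4, 10**5, 10**6, 10**7):
    s = sum(1.0 / p for p in primerange(2, T + 1))   # primes p <= T
    approx = log(log(T)) + A
    print(f"T = {T:>8}: sum 1/p = {s:.6f}, log(log T) + A = {approx:.6f}, diff = {s - approx:.2e}")
```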
I'm interested in determining for which $\sigma>\frac{1}{2}$ the sum has essentially the same order of magnitude (note that I'm not asking about convergence/divergence; for fixed $\sigma>\frac{1}{2}$ the sum clearly converges, but $\log(\log T)$ might still approximate it well).
What I've done
If we put $\sigma=\frac{1}{2}+\omega(T)$, we have $$ \psi(\sigma,T)=\sum_{p\leq T}\frac{p^{-2\omega(T)}}{p}=\sum_{p\leq T}\frac{1-2\omega(T)\log(p)+\cdots}{p}. $$ Since $\sum_{p\leq T}\frac{\log p}{p}=\log T+O(1)$ by Mertens' first theorem, if $\omega(T)$ is "nice" enough we have $$ \psi(\sigma,T)=\log(\log T)-2\omega(T)\log(T)+O(\omega(T)).\tag{$\dagger$} $$ Thus, it would seem that as long as $\omega(T)=O\left(\frac{\log(\log T)}{\log T}\right)$, the order of magnitude is essentially the same.
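For what it's worth, here is a rough numerical check of $(\dagger)$, again a Python sketch: the choice $\omega(T)=c\,\frac{\log(\log T)}{\log T}$ with $c=0.1$ and the cutoffs are arbitrary illustrative choices, and the comparison ignores the constant $A$ and the error terms.

```python
from math import log
from sympy import primerange

# Compare psi(1/2 + omega(T), T) with the main terms
# log(log T) - 2*omega(T)*log(T), for omega(T) = c*log(log T)/log(T).
c = 0.1  # arbitrary illustrative constant

for T in (10**5, 10**6, 10**7):
    omega = c * log(log(T)) / log(T)
    sigma = 0.5 + omega
    psi = sum(p ** (-2.0 * sigma) for p in primerange(2, T + 1))  # primes p <= T
    predicted = log(log(T)) - 2.0 * omega * log(T)
    print(f"T = {T:>8}: psi(sigma, T) = {psi:.4f}, log(log T) - 2*omega*log T = {predicted:.4f}")
```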
My Question
Is this correct? Can it be streamlined or improved? I'm staring at a paper of Atle Selberg that makes it seem as though $\omega(T)=O((\log T)^{-\delta})$ for some "fixed positive" $\delta$ should suffice. While at the moment I think my work is correct, I'm not convinced it is what he had in mind when he wrote the passage.
($\dagger$) I realize I am sweeping some details under the rug here; if necessary, I can post them, but it seems fruitless to do so at the moment.