An important problem is the online computation of the eigenvalues of a covariance matrix $\Sigma_T=\sum_{t=1}^T x_tx_t^\top$. Note that observing a new datapoint $x_{T+1}$ induces a rank-one update, $\Sigma_{T+1} = \Sigma_T + x_{T+1}x_{T+1}^\top$. As far as I can tell after some searching, it is not possible to efficiently compute the spectrum of $\Sigma_{T+1}$ from any decomposition of $\Sigma_{T}$. In other words, there is no exact "incremental SVD" or Woodbury-type identity for the SVD -- most things in the literature that go by the name "incremental SVD" appear (in my search) to be approximate algorithms, or approximate updates to the thin SVD.
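To make the setting concrete, here is the baseline I have in mind: after each rank-one update, recompute the spectrum from scratch, which costs $O(n^3)$ per datapoint (a minimal NumPy sketch; the variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
X = rng.standard_normal((10, n))
Sigma_T = X.T @ X                            # Sigma_T = sum_t x_t x_t^T

x_new = rng.standard_normal(n)
Sigma_T1 = Sigma_T + np.outer(x_new, x_new)  # rank-one update from x_{T+1}

# Baseline: recompute the full symmetric eigendecomposition, O(n^3).
# The question is whether this can be beaten exactly, given a
# decomposition of Sigma_T.
eigvals = np.linalg.eigvalsh(Sigma_T1)
```

The question is whether any decomposition of $\Sigma_T$ lets us beat this $O(n^3)$ recomputation exactly.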
What's weird is that there seems to be some barrier here: for instance, there is an efficient rank-one update to the Cholesky decomposition, but the spectrum of $\Sigma_{T+1}$ is not easy to read off from its Cholesky factor. The same goes for the QR decomposition. The spectrum would be easy to read off from the Schur decomposition, but there does not seem to exist an efficient low-rank update for the Schur decomposition.
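For reference, the efficient Cholesky update I mean is the standard $O(n^2)$ LINPACK-style rank-one recurrence; a hand-rolled sketch (my own implementation, not a library call) looks like this:

```python
import numpy as np

def chol_rank1_update(L, x):
    """Given lower-triangular L with L @ L.T = A, return the Cholesky
    factor of A + x x^T in O(n^2) (LINPACK-style recurrence)."""
    L = L.copy()
    x = x.astype(float).copy()
    n = x.size
    for k in range(n):
        r = np.hypot(L[k, k], x[k])          # new k-th diagonal entry
        c, s = r / L[k, k], x[k] / L[k, k]   # rotation coefficients
        L[k, k] = r
        if k + 1 < n:
            L[k+1:, k] = (L[k+1:, k] + s * x[k+1:]) / c
            x[k+1:] = c * x[k+1:] - s * L[k+1:, k]
    return L

rng = np.random.default_rng(0)
n = 6
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)                  # an SPD "covariance" matrix
L = np.linalg.cholesky(A)
x = rng.standard_normal(n)
L_new = chol_rank1_update(L, x)              # factor of A + x x^T
```

So the factorization itself updates cheaply; the obstruction is only that the eigenvalues are not visible in $L$.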
So, I have two questions: first, is my impression accurate that no exact "incremental SVD" exists? Second, is there an intuition for why this problem should be so hard?