Since I keep hinting that I want to see a certain answer, perhaps I had better just post it.
The OP has shown that for every $\lambda \in \mathbb{R}$, the $\lambda$-eigenspace $E_{\lambda}(T)$ is an $S$-invariant subspace: $S E_{\lambda}(T) \subset E_{\lambda}(T)$. (This holds vacuously if $\lambda$ is not an eigenvalue of $T$; henceforth let's assume it is.) Thus we may consider the restriction of $S$ to $E_{\lambda}(T)$, and if we can show that this restriction has an eigenvector, then that vector is an eigenvector for both $S$ and $T$.
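(For completeness, the invariance is a one-line computation: if $v \in E_{\lambda}(T)$, then since $S$ and $T$ commute,
$$T(Sv) = S(Tv) = S(\lambda v) = \lambda(Sv),$$
so $Sv \in E_{\lambda}(T)$.)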
Note that if the scalar field were algebraically closed (e.g. $\mathbb{C}$), then we would automatically have an eigenvector. But since the given scalar field, $\mathbb{R}$, is not algebraically closed, this is not automatic: over any non-algebraically closed field $K$, there are linear transformations of $K^n$ (with $0 < n < \infty$) without eigenvectors. Nevertheless, the OP's assertion holds over any scalar field $K$ whatsoever.
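For instance, the rotation matrix
$$R = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$
has characteristic polynomial $t^2 + 1$, which has no real roots, so $R$ has no eigenvectors as a transformation of $\mathbb{R}^2$.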
The key is the following claim:
> If $S: K^n \rightarrow K^n$ is a diagonalizable linear transformation and $W \subset K^n$ is an $S$-invariant subspace, then the restriction of $S$ to $W$ is diagonalizable.
For this I will use the following useful characterization of diagonalizable transformations.
Diagonalizability Theorem: A linear transformation $S: K^n \rightarrow K^n$ is diagonalizable iff its minimal polynomial is squarefree and split, i.e., factors as a product of distinct linear factors.
For a proof, see e.g. Theorem 4.14 of these notes.
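To see the criterion in action: the matrix
$$N = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$$
has minimal polynomial $(t-1)^2$, which is split but not squarefree, so $N$ is not diagonalizable over any field. On the other hand, a diagonal matrix whose distinct diagonal entries are $\lambda_1, \ldots, \lambda_k$ has minimal polynomial $(t - \lambda_1) \cdots (t - \lambda_k)$, which is squarefree and split.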
Now it is clear that the minimal polynomial of the restriction of $S$ to an invariant subspace divides the minimal polynomial of $S$ and that a monic polynomial which divides a squarefree split polynomial is itself squarefree and split. So applying the Diagonalizability Theorem in one direction and then the other, we see that $S|_W$ is diagonalizable.
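To spell out the divisibility claim: if $m_S$ denotes the minimal polynomial of $S$, then since $W$ is $S$-invariant,
$$m_S(S|_W) = m_S(S)\big|_W = 0,$$
so the minimal polynomial of $S|_W$ divides $m_S$.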
This completes the answer to the OP's question. But actually it proves something much stronger: since the restriction of $S$ to each eigenspace $E_{\lambda}(T)$ is diagonalizable, each $E_{\lambda}(T)$ decomposes as a direct sum of simultaneous eigenspaces for $S$ and $T$, and hence all of $K^n$, being the direct sum of the $E_{\lambda}(T)$ (as $T$ is diagonalizable), also decomposes as a direct sum of simultaneous eigenspaces for $S$ and $T$. Taking a basis of simultaneous eigenvectors diagonalizes both $S$ and $T$, so we've shown:
Theorem: Let $S$ and $T$ be commuting diagonalizable linear transformations on a finite-dimensional $K$-vector space (over any field $K$). Then $S$ and $T$ are simultaneously diagonalizable: there is an invertible linear transformation $P$ such that $P S P^{-1}$ and $P T P^{-1}$ are both diagonal.
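A toy example over $\mathbb{Q}$: the matrices
$$S = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad T = \begin{pmatrix} 2 & 3 \\ 3 & 2 \end{pmatrix} = 2I + 3S$$
commute (indeed $T$ is a polynomial in $S$), and taking $P = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$ gives
$$P S P^{-1} = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}, \qquad P T P^{-1} = \begin{pmatrix} 5 & 0 \\ 0 & -1 \end{pmatrix}.$$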
Finally, recall that a linear transformation is semisimple if every invariant subspace has an invariant complement. The following result shows that this is a "nonsplit version of diagonalizability".
Semisimplicity Theorem: A linear transformation $S: K^n \rightarrow K^n$ is semisimple iff its minimal polynomial is squarefree, i.e., factors as a product of distinct irreducible (but not necessarily linear) factors.
This is also part of Theorem 4.14 of these notes.
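The rotation matrix $R$ from above shows the two notions really differ: its minimal polynomial $t^2 + 1$ is squarefree but does not split over $\mathbb{R}$, so $R$ is semisimple but not diagonalizable over $\mathbb{R}$. (Over $\mathbb{C}$, where $t^2 + 1 = (t-i)(t+i)$, it becomes diagonalizable.)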
From this result we can prove (in exactly the same way) the cousin of the first boxed result:
> If $S: K^n \rightarrow K^n$ is a semisimple linear transformation and $W \subset K^n$ is an $S$-invariant subspace, then the restriction of $S$ to $W$ is semisimple.
In contrast to the first boxed result, I don't see how to prove this using the characteristic polynomial. And in fact, the analogous argument with the characteristic polynomial (that of $S|_W$ divides that of $S$, so if the latter splits, so does the former) shows only that the restriction of $S$ to an invariant subspace has an eigenvalue: it does not (directly) show that the restriction is diagonalizable. Indeed, you cannot always tell whether a transformation is diagonalizable just by looking at its characteristic polynomial: $\left(\begin{smallmatrix} 1 & 0 \\ 0 & 1 \end{smallmatrix}\right)$ and $\left(\begin{smallmatrix} 1 & 1 \\ 0 & 1 \end{smallmatrix}\right)$ have the same characteristic polynomial $(t-1)^2$, yet only the first is diagonalizable; their minimal polynomials, $t-1$ and $(t-1)^2$, do tell them apart. So in this sense the minimal polynomial is a "better invariant", and I think that I have now explained why I prefer this approach.