I'm writing a proof that the real numbers are uncountable. The proof proceeds by contradiction, assuming $f: \mathbb{N} \to \mathbb{R}$ is a bijection and writing each value as a decimal expansion $f(n) = a_0.a_1 a_2 a_3 \ldots$. For each position $n$ after the decimal point, it defines $S_n$ to be the set of digits that can occur at that position across all possible decimal expansions of $f(n)$. The claim is that $|S_n| \leq 2$.
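To check that I understand the definition, here is a small made-up example (the value $f(3) = 0.25$ is just for illustration): the two decimal expansions of $f(3)$ would be
$$f(3) = 0.25000\ldots = 0.24999\ldots,$$
so the digit in the third position after the decimal point is either $0$ or $9$, giving $S_3 = \{0, 9\}$ and $|S_3| = 2$.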
I cannot figure out how to prove this rigorously, and I feel my understanding of the issue is superficial. Some real numbers, like $\pi$, neither terminate nor repeat, so they have exactly one decimal expansion. Other numbers have two: for example, $0.99\overline{9} = 1$, so $1$ can be written as either $1.000\ldots$ or $0.99\overline{9}$. I can also take a terminating decimal like $0.234$ and rewrite it as $0.233\overline{9}$. (I'm not sure whether an infinite string of $1$'s similarly corresponds to a $2$, an infinite string of $3$'s to a $4$, and so forth, or whether this is particular to $9$'s.)
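For the $9$'s case I can at least verify the examples above with a geometric series (assuming that is a legitimate way to interpret the infinite decimal):
$$0.99\overline{9} = \sum_{k=1}^{\infty} \frac{9}{10^k} = 9 \cdot \frac{1/10}{1 - 1/10} = 1,
\qquad
0.233\overline{9} = 0.233 + \sum_{k=4}^{\infty} \frac{9}{10^k} = 0.233 + 10^{-3} = 0.234.$$
But this only handles specific examples; I don't see how to turn it into a general argument.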
How can I rigorously justify the claim that $|S_n| \leq 2$?