This problem is well documented here on MSE, e.g. Expansions of a Real Number $x$:
Let $p$ be a natural number greater than $1$, and let $x$ be a real number with $0\leq x\leq 1$. Show that there is a sequence of integers $\{a_{n}\}$ with $0\leq a_{n}<p$ for each $n$ such that $$x=\sum_{n=1}^{\infty}\frac{a_{n}}{p^{n}}$$ and that the sequence is unique except when $x$ is of the form $\frac{q}{p^{n}}$, $0<q<p^{n}$, in which case there are exactly two such sequences.
But I am struggling with something even more basic than those answers: I don't see why, if $x = \frac{q}{p^n}$, there are two expansions for $x$. In particular, I don't see how this accounts for the two decimal expansions of $1$ in base $10$, for example (Characterization of non-unique decimal expansions). So I am lost as to why taking $x = \frac{q}{p^n}$ generalizes that behaviour of the two possible expansions.
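To make the question concrete, here is a small sketch (my own illustration, not from the linked answers) checking the base-$10$ case $x = \frac{1}{2} = \frac{5}{10^1}$, which is of the form $\frac{q}{p^n}$: the terminating digit sequence $5,0,0,\dots$ and the repeating sequence $4,9,9,\dots$ both give partial sums converging to $\frac{1}{2}$. Exact rationals are used so no floating-point error intrudes:

```python
from fractions import Fraction

def partial_sum(digits, p):
    """Compute sum of a_n / p^n over the given digits, exactly."""
    return sum(Fraction(a, p ** (i + 1)) for i, a in enumerate(digits))

p = 10
x = Fraction(1, 2)  # x = q/p^n with q = 5, n = 1

terminating = [5] + [0] * 29  # digits of 0.5000...
repeating = [4] + [9] * 29    # digits of 0.4999...

# The terminating expansion hits x exactly after one digit.
print(partial_sum(terminating, p) == x)       # True

# The repeating expansion falls short of x by exactly 1/10^30
# after 30 digits, so its partial sums converge to x as well.
print(x - partial_sum(repeating, p))          # 1/10^30
```

The same pattern works for any base $p$: replace the digit $9$ by $p-1$, and the "carry" identity $\frac{a}{p^n} = \frac{a-1}{p^n} + \sum_{k>n}\frac{p-1}{p^k}$ is what produces the second expansion.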