I know a set $B$ is dense in $\mathbb{R}$ if an element of $B$ can be found between any two real numbers $a, b$ such that $a < b$. I have an inkling that, as $n \rightarrow \infty$, $q = 2^n \rightarrow \infty$, so the spacing $\dfrac{1}{q}$ between consecutive fractions $\dfrac{p}{q}$ shrinks to $0$, and we should be able to choose $p$ so that $a < \dfrac{p}{q} < b$ once $\dfrac{1}{q} < b - a$.
What is a good way to show this rigorously? I can start with the interval $(0,1)$ and find $\dfrac{p}{q} = \dfrac{1}{2}$, which lies in $(0,1)$. For a smaller interval $(a,b)$, I can take $q = 2^n$ sufficiently large that $\dfrac{1}{q} < b - a$, and then there should be some $p$ with $\dfrac{p}{q} \in (a,b)$.
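Trying to make that last step precise, I think the argument I'm reaching for is the following (not sure it's airtight): given $a < b$, by the Archimedean property pick $n$ with $2^n > \dfrac{1}{b-a}$, i.e. $\dfrac{1}{2^n} < b - a$, and let $p = \lfloor 2^n a \rfloor + 1$. Since $p > 2^n a$, we get $\dfrac{p}{2^n} > a$; since $p \le 2^n a + 1$, we get $\dfrac{p}{2^n} \le a + \dfrac{1}{2^n} < a + (b - a) = b$. Hence $a < \dfrac{p}{2^n} < b$.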
I don't think this constitutes a proof, but perhaps I'm on the right track? Any suggestions? Thanks!
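For what it's worth, here is a quick numerical sanity check of the idea (the helper name `dyadic_between` is just my own, and floating-point rounding aside it follows the "pick $q = 2^n$ with $1/q < b-a$, then take the smallest $p$ with $p/q > a$" recipe):

```python
import math

def dyadic_between(a, b):
    """Find a dyadic rational p / 2**n strictly between a and b (assumes a < b)."""
    # Choose n large enough that the spacing 1/2**n is smaller than b - a
    # (the extra +1 makes the spacing at most half of b - a).
    n = max(0, math.ceil(math.log2(1.0 / (b - a)))) + 1
    q = 2 ** n
    # Smallest integer p with p/q > a.
    p = math.floor(a * q) + 1
    assert a < p / q < b
    return p, q

print(dyadic_between(0.3, 0.31))  # → (77, 256), and 77/256 ≈ 0.3008
```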