There are many answers on this site proving that if a function $f:\Bbb R\to \Bbb R$ satisfies the functional equation $f(x+y)=f(x)f(y)$ for all $x,y\in\Bbb R$, is continuous and is nowhere $0,$ then $f(x)=a^x$ for some $a>0.$ But how can one prove that a monotonic function $f$ is uniquely determined by $f(x+y)=f(x)f(y)$ for all $x,y\in\Bbb R$ and $f(1)=a,$ for a given $a>0$?
I have already seen several useful posts regarding the version of the problem in which continuity is assumed.
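To isolate where the difficulty lies: the functional equation together with $f(1)=a$ already determines $f$ on the rationals, with no continuity or monotonicity needed, so the question is really about the extension from $\Bbb Q$ to $\Bbb R.$ A sketch of that standard reduction:
$$f(0)=f(0)^2,\quad a=f(1)=f(1)f(0)\ \Rightarrow\ f(0)=1,\qquad f(x)f(-x)=f(0)=1,\qquad f(x)=f\!\left(\tfrac x2\right)^2\ge 0,$$
so $f>0$ everywhere; by induction $f(nx)=f(x)^n$ for $n\in\Bbb N,$ hence $a=f(1)=f\!\left(\tfrac1m\right)^m$ forces $f\!\left(\tfrac1m\right)=a^{1/m},$ and combining these gives $f(q)=a^q$ for every $q\in\Bbb Q.$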
One idea that occurred to me was to use the following theorem at some point:
Let $I\subseteq\Bbb R$ be an open interval, $f:I\to\Bbb R$ a monotonic function on $I$ and $I'=f(I)$ an open interval. Then $f$ is continuous on $I$.
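(For intuition on why this theorem holds, as I understand it: a monotonic function can only have jump discontinuities, and a jump punches a hole in the image. For instance, if $f$ is increasing and discontinuous at an interior point $x_0\in I,$ then
$$\lim_{x\to x_0^-}f(x)<\lim_{x\to x_0^+}f(x)\quad\text{and}\quad f(I)\cap\Bigl(\lim_{x\to x_0^-}f(x),\ \lim_{x\to x_0^+}f(x)\Bigr)\subseteq\{f(x_0)\},$$
so $f(I)$ cannot be an interval.)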
I thought that if I could prove that any monotonic $f$ satisfying $f(x+y)=f(x)f(y)$ for all $x,y\in\Bbb R$ and $f(1)=a>0$ necessarily has $f(\Bbb R)$ equal to an open interval, then the theorem would give continuity, and hence $f$ would be uniquely determined by the given conditions. However, this turned out to be more complicated than the initial problem.
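The only small observation I have in that direction: as computed above, $f>0$ everywhere, so $f(\Bbb R)\subseteq(0,\infty),$ and in fact $f(\Bbb R)$ is a multiplicative subgroup of $(0,\infty),$ since
$$f(x)\,f(y)=f(x+y)\in f(\Bbb R)\qquad\text{and}\qquad \frac1{f(x)}=f(-x)\in f(\Bbb R),$$
but I do not see how to get from this to $f(\Bbb R)$ being an open interval.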
How should I tackle this problem, given that $f$ is monotonic, without assuming $f$ is continuous?