You are right to suspect that some conditions on $f$ are required for convergence.
For ease of exposition, I first give a proof assuming rather strong conditions on $f$ and then subsequently relax those conditions.
Let $x$ be a function satisfying $x^{\prime}(t)=f(t,x(t))$ on the real line.
Assuming $x$ is twice continuously differentiable, Taylor's theorem yields
$$
x(t+h)=x(t)+hf(t,x(t))+O(h^{2})
$$
as $h\to0$; since $x''$ is continuous, the implied constant in the $O(h^{2})$ term may be taken uniform for $t\in[0,1]$ and $0<h\leq1$.
Next, consider the unit interval and subdivide it into $N$ pieces of length $h=1/N$, denoting the boundaries by $t_{n}=nh$.
The Forward Euler scheme is defined by $\mathbf{x}_{0}=x(0)$ and iterates
$$
\mathbf{x}_{n+1}\equiv\mathbf{x}_{n}+hf(t_{n},\mathbf{x}_{n}).
$$
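In code, one step of the scheme is a single update inside a loop. Here is a minimal Python sketch (the names `f` and `euler` are my own, not from any particular library):

```python
def euler(f, x0, N):
    """Forward Euler on [0, 1]: x_{n+1} = x_n + h f(t_n, x_n) with h = 1/N."""
    h = 1.0 / N
    x = x0
    for n in range(N):
        x = x + h * f(n * h, x)  # t_n = n h
    return x  # approximation of x(1)
```

For example, with $f(t,x)=x$ and $\mathbf{x}_{0}=1$ the iterates are $\mathbf{x}_{n}=(1+h)^{n}$, so `euler(lambda t, x: x, 1.0, N)` returns $(1+1/N)^{N}\to e$.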
Note, in particular, that $x$ and $\mathbf{x}$ refer to different quantities: the former satisfies an ODE while the latter is the solution of the Forward Euler scheme.
Ultimately, we would like to bound the difference between these two quantities.
Denote this error $\mathbf{e}_{n}\equiv\mathbf{x}_{n}-x(t_{n})$.
Then,
$$
\mathbf{e}_{n+1}=\mathbf{e}_{n}+hf(t_{n},\mathbf{x}_{n})-hf(t_{n},x(t_{n}))+O(h^{2}).
$$
Assuming $f$ is bounded independently of $t$ and $x$, the above implies $\mathbf{e}_{n+1}=\mathbf{e}_{n}+O(h)$ and hence, since $\mathbf{e}_{0}=0$, $\mathbf{e}_{n}=nO(h)$ by induction.
Since $n\leq N=1/h$, $\mathbf{e}_{n}=O(1)$.
That is, the error is bounded independently of $n$ (we will tighten this bound below).
Assuming the first spatial derivative $f_{x}$ of $f$ exists and is bounded independently of $t$ and $x$, another application of Taylor's theorem yields
$$
f(t_{n},\mathbf{x}_{n})=f(t_{n},x(t_{n})+\mathbf{e}_{n})=f(t_{n},x(t_{n}))+O(1)\,\mathbf{e}_{n}.
$$
Combining this with the recurrence for the error,
$$
\mathbf{e}_{n+1}=\mathbf{e}_{n}\left(1+O(h)\right)+O(h^{2}).
$$
Writing the recurrence as $|\mathbf{e}_{n+1}|\leq(1+Ch)|\mathbf{e}_{n}|+Ch^{2}$ and using $\mathbf{e}_{0}=0$, induction yields $|\mathbf{e}_{n}|\leq Ch^{2}\sum_{k=0}^{n-1}(1+Ch)^{k}\leq nCh^{2}e^{Cnh}\leq e^{C}Cnh^{2}$ for $nh\leq1$; that is, $\mathbf{e}_{n}=nO(h^{2})$.
Since $n\leq N=1/h$, we obtain the tighter bound $\mathbf{e}_{n}=O(h)$. That is, the error is linear in the step size $h$: the Forward Euler scheme is first-order accurate.
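This first-order rate is easy to observe numerically. The sketch below (my own illustration, not from the argument above) applies the scheme to $x'=x$, $x(0)=1$ on $[0,1]$, whose exact solution is $e^{t}$, and checks that halving $h$ roughly halves the error at $t=1$:

```python
import math

def euler_error(N):
    """Error |x_N - x(1)| of Forward Euler for x' = x, x(0) = 1 on [0, 1]."""
    h = 1.0 / N
    x = 1.0
    for _ in range(N):
        x = x + h * x  # f(t, x) = x
    return abs(x - math.e)

# Successive error ratios under step-size halving; each should be close to 2.
ratios = [euler_error(N) / euler_error(2 * N) for N in (50, 100, 200)]
```

The ratios come out near $2$, consistent with $\mathbf{e}_{n}=O(h)$.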
The boundedness of $f$ and $f_{x}$ is a stronger requirement than necessary.
Indeed, assume $f$ is locally Lipschitz in space, uniformly in time.
Let $A=[-a,a]$ be an interval with $a>0$.
By restricting the spatial argument $x$ of $f$ to the interval $A$, we obtain a function that is (globally) Lipschitz in space, uniformly in time.
In particular, for $x$ in $A$ (at any point where $f_{x}$ exists),
$$
\left|f_{x}(t,x)\right|=\lim_{h\to0}\left|\frac{f(t,x+h)-f(t,x)}{h}\right|\leq\lim_{h\to0}\frac{L\left|h\right|}{\left|h\right|}=L
$$
and
$$
\left|f(t,x)\right|\leq\left|f(t,0)\right|+L\left|x\right|.
$$
If, in addition, $\sup_{t \in [0,1]}|f(t,0)|<\infty$, the above inequalities imply that $f$ and $f_{x}$ are bounded on $[0,1] \times A$.
This is sufficient for our purposes so long as the scheme is stable (that is, $\max_{n}|\mathbf{x}_{n}|$ is bounded independently of $h$).
In this case, the proof given in the first part of this post still works since both $x$ restricted to $[0, 1]$ and $\mathbf{x}$ are bounded (recall that we assumed the continuity of $x$ at the beginning of this post).
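As a concrete instance (my own example, not from the argument above), take $f(t,x)=x^{2}$ with $x(0)=1/2$: this $f$ is locally but not globally Lipschitz, and the exact solution $x(t)=1/(2-t)$ stays in $[1/2,1]$ on $[0,1]$. Since the solution is convex, one can check that the Euler iterates stay below it, so the scheme is stable and the restricted analysis applies with, say, $A=[-1,1]$:

```python
def euler_max(N):
    """Max of |x_n| over the Forward Euler iterates for x' = x^2, x(0) = 0.5.

    f(t, x) = x^2 is only locally Lipschitz, but the iterates never leave
    [0.5, 1], so the Lipschitz bound on A = [-1, 1] is all that is needed.
    """
    h = 1.0 / N
    x = 0.5
    m = x
    for _ in range(N):
        x = x + h * x * x  # f(t, x) = x^2
        m = max(m, x)
    return m
```

The bound `euler_max(N) <= 1` holds for every `N`, i.e. independently of `h`, which is exactly the stability requirement.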