While reading Axler's Linear Algebra Done Right, Chapter 4, I tried to prove the following theorem:
Theorem. Suppose $a_{0},\dots,a_{m}\in\mathbb{F}$. If $a_{0}+a_{1}z+\cdots+a_{m}z^{m}=0$ for every $z\in\mathbb{F}$, then $a_{0}=\cdots=a_{m}=0$.
Note that, following the convention of the book, $\mathbb{F}$ is either $\mathbb{R}$ or $\mathbb{C}.$
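As a concrete instance of what the theorem claims (my own example, not from the book): for $m=2$ it says that if $a_{0}+a_{1}z+a_{2}z^{2}=0$ for every $z\in\mathbb{F}$, then $a_{0}=a_{1}=a_{2}=0$; equivalently, no polynomial with a nonzero coefficient can vanish at every point of $\mathbb{F}$.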
This is my attempt:
Proof. Suppose $a_{0}+a_{1}z+\cdots+a_{m}z^{m}=0$ for all $z\in\mathbb{F}$. Let $f\left(z\right)=a_{0}+a_{1}z+\cdots+a_{m}z^{m}$. Then $f\left(z\right)=0$ for all $z\in\mathbb{F}$, that is, $f=0$. Hence, by definition, $\deg f=-\infty$. Now suppose there is some $j\in\left[0..m\right]$ such that $a_{j}\neq0$, and let $m'\in\left[0..m\right]$ be the largest such $j$. Then $\deg f=m'$. Since $m'\in\left[0..m\right]$, we have $\deg f\geq0$, which contradicts $\deg f=-\infty$. Therefore no such $j$ exists, that is, $a_{0}=\cdots=a_{m}=0$.
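To make explicit what I mean by choosing $m'$ (this is just my own illustration, not part of the argument): if, say, $m=3$ and the coefficients were $a_{0}=1$, $a_{1}=0$, $a_{2}=4$, $a_{3}=0$, then the indices $j$ with $a_{j}\neq0$ are $\{0,2\}$, so $m'=2$ and
$$f\left(z\right)=1+4z^{2},\qquad\deg f=m'=2\geq0.$$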
However, my proof is quite different from the one in the book and seems simpler.
I suspect there might be a circular argument in my proof, but I could not pinpoint it. Question: Is this proof correct? If not, what is wrong with it?