
If $F$ is a field (one can do this in any ring, but let us restrict ourselves to fields), then we can define the formal derivative of a polynomial $$ f(x)=a_nx^n+a_{n-1}x^{n-1}+\cdots + a_1x+a_0 \in F[x]$$ as $$ D_xf(x)=na_nx^{n-1}+(n-1)a_{n-1}x^{n-2}+\cdots+2a_2x+a_1 \in F[x].$$ With this definition, one can prove the usual rules for the derivative of a sum and a product of polynomials, namely \begin{equation}\tag{1} D_x(f(x)+g(x))=D_x(f(x))+D_x(g(x)) \end{equation} and \begin{equation}\tag{2} D_x(f(x)g(x))=D_x(f(x))g(x)+D_x(g(x))f(x). \end{equation}
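As a quick sanity check (my addition, not part of the original question), the definition and rules (1) and (2) can be verified over $\mathbb{Q}$ by representing polynomials as coefficient lists; the helper names below are made up for illustration:

```python
# Polynomials as coefficient lists [a0, a1, ..., an],
# so [1, 0, 3] represents 1 + 3x^2.

def formal_derivative(f):
    """D_x: the term a_k x^k maps to k*a_k x^(k-1)."""
    return [k * a for k, a in enumerate(f)][1:] or [0]

def poly_add(f, g):
    n = max(len(f), len(g))
    f = f + [0] * (n - len(f))
    g = g + [0] * (n - len(g))
    return [a + b for a, b in zip(f, g)]

def poly_mul(f, g):
    out = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] += a * b
    return out

# Check the sum rule (1) and product rule (2) on an example:
f = [1, 2, 3]      # 1 + 2x + 3x^2
g = [0, 5, 0, 7]   # 5x + 7x^3

lhs_sum = formal_derivative(poly_add(f, g))
rhs_sum = poly_add(formal_derivative(f), formal_derivative(g))

lhs_prod = formal_derivative(poly_mul(f, g))
rhs_prod = poly_add(poly_mul(formal_derivative(f), g),
                    poly_mul(formal_derivative(g), f))
```

Of course this only tests the rules on one example; the general proofs are by comparing coefficients term by term.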

Since $D_x(\lambda)=0$ for every constant $\lambda \in F$, the product rule gives $D_x(\lambda f(x))=\lambda D_x(f(x))$ for any $\lambda \in F$ and $f(x) \in F[x]$; together with (1), this shows the formal derivative is a linear map.

Question

I was wondering whether the relations (1) and (2) do in fact characterize the formal derivative of a polynomial, at least over fields such as $\mathbb{R}$ and $\mathbb{C}$. That is, if I have a linear map $L:\mathbb{C}[x]\to\mathbb{C}[x]$ (or $L:\mathbb{R}[x]\to\mathbb{R}[x]$) that satisfies (1) and (2), must it be the formal derivative?

Also, I was wondering whether this can be generalized to linear maps between smooth functions from $\mathbb{R}^n$ to $\mathbb{R}^n$ (or from $\mathbb{R}$ to $\mathbb{R}$): if $L:C^{\infty}(\mathbb{R}^n) \to C^{\infty}(\mathbb{R}^n)$ is a linear map that satisfies (1) and (2), must $L$ be the derivative? If not, I was wondering whether there are other conditions that $L$ must satisfy to guarantee that it is the derivative.

I appreciate any comments on the problem.

Note: I had a little trouble choosing tags for this question, as I asked about the formal derivative over fields but also extended the question to linear maps on smooth functions on $\mathbb{R}^n$. I would appreciate help with the tags too.

Rob Arthan
positron0802
  • In https://en.wikipedia.org/wiki/Generalizations_of_the_derivative, it is said, in the section "Derivatives in algebra", that "generalizations of the derivative can be obtained by imposing the Leibniz rule of differentiation [i.e. (uv)'=u'v+uv'] in an algebraic structure, such as a ring or a Lie algebra". – Jean Marie Jan 28 '17 at 20:22
  • In the case of $\mathbb{R}$, https://math.stackexchange.com/questions/4587371/is-there-any-simple-set-of-properties-that-uniquely-characterizes-differentiatio and https://math.stackexchange.com/questions/4588139/is-there-a-simple-way-to-characterize-the-smooth-functions-without-using-the-der show that it is true if we can assume scalar multiplication. – mathlander Dec 20 '22 at 18:59

1 Answer


In general, a $\mathbb{C}$-linear function $L\colon\mathbb{C}[x]\to\mathbb{C}[x]$ satisfying $$L[f(x)g(x)] = f(x)\,L[g(x)] + L[f(x)]\,g(x)$$ is called a derivation. (This definition makes sense for any algebra over a field.) Note that such a map must satisfy $L[1] = 0$, since $$ L[1] \,=\, L[1\cdot1] \,=\, 1\,L[1] + 1\,L[1] \,=\, 2\,L[1]. $$ However, not every derivation satisfies $L[x] = 1$.

Indeed, if $k(x)$ is any (fixed) polynomial in $\mathbb{C}[x]$, we can define a derivation $L\colon \mathbb{C}[x]\to\mathbb{C}[x]$ by $$ L[f(x)] \,=\, k(x)\,f'(x). $$ It is easy to check that $L$ is a derivation, and that $L[x] = k(x)$. Moreover, every derivation is of this form: a derivation is determined by its value on $x$, since any two derivations $L_1$ and $L_2$ with $L_1[x]=L_2[x]$ agree on every power $x^n$ (by induction on $n$, using the Leibniz rule) and hence, by linearity, on all of $\mathbb{C}[x]$.
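As a quick illustration (my addition, using SymPy; the names `L` and `k` are just illustrative), one can confirm on an example that $L[f] = k(x)\,f'(x)$ satisfies the Leibniz rule, kills constants, and sends $x$ to $k(x)$:

```python
import sympy as sp

x = sp.symbols('x')
k = x**2 + 1            # an arbitrary fixed polynomial k(x)

def L(f):
    """The derivation f |-> k(x) * f'(x)."""
    return sp.expand(k * sp.diff(f, x))

f = 3*x**3 - x
g = x**2 + 5*x

assert sp.expand(L(f*g) - (f*L(g) + L(f)*g)) == 0   # Leibniz rule
assert L(sp.Integer(1)) == 0                        # L[1] = 0
assert L(x) == k                                    # L[x] = k(x)
```

Taking $k(x)=1$ recovers the usual formal derivative, while any other choice of $k$ gives a derivation that is not $D_x$.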

Jim Belk