
In any regular calculus or real analysis course, we learn the definition of the derivative of a function $f(x)$ as $$f^\prime (x)=\lim\limits_{h\rightarrow 0} \frac{f(x+h)-f(x)}{h}.$$ However, while studying abstract algebra we learn that differentiation is an operation, just like addition or multiplication, except that it acts on functions. So I want to know whether there is a way to define an algebraic structure whose underlying set is the set of all differentiable functions, equipped with the operation of differentiation.
Also, if it is possible to define differentiation in such a manner, how can it be connected with the analytical definition of differentiation?

  • In $\mathbb{F}[x]$, differentiation is done term-by-term using the power rule: $\frac{d}{dx} x^n = n x^{n-1}$. – Integrand Aug 30 '20 at 18:52
  • See here for one purely algebraic approach to derivatives of polynomials. – Bill Dubuque Aug 30 '20 at 19:05
  • It seems to me that you may have to expand your abstract algebra knowledge to the realm of topology first, metric spaces, etc., so you can talk about closeness. That's what you're doing when taking a limit, after all. – CogitoErgoCogitoSum Aug 30 '20 at 21:31
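The term-by-term rule from the first comment needs no limits at all. A minimal sketch (the coefficient-list representation is my own, not anything from the thread):

```python
def formal_derivative(coeffs):
    """Formal derivative of a_0 + a_1*x + ... + a_n*x^n over any ring,
    applying x^n -> n*x^(n-1) term by term -- no limits involved."""
    return [k * a for k, a in enumerate(coeffs)][1:] or [0]

# d/dx (3 + 2x + 5x^2) = 2 + 10x
print(formal_derivative([3, 2, 5]))  # [2, 10]
```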

2 Answers


Maybe what you are interested in is the notion of a derivation. Given an algebra (associative, say) $A$, a derivation is a map $\partial:A\to A$ which is additive: $\partial(a+b)=\partial(a)+\partial(b)$ and satisfies the Leibniz rule $$ \partial(ab)=\partial(a)b+a\partial(b).$$ If we give $A$ a ground field $k$, so that it becomes a $k$-algebra, we can also define the similar notion of a $k$-derivation of $A$, which satisfies $\partial(\lambda)=0$ for all $\lambda \in k$.
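As a quick sanity check (mine, not part of the answer), sympy's polynomial derivative satisfies exactly these axioms on $\Bbb{Q}[x]$:

```python
import sympy as sp

x = sp.symbols('x')
a, b = 3*x**2 + x, x**3 - 2
d = lambda p: sp.diff(p, x)

assert sp.expand(d(a + b) - (d(a) + d(b))) == 0    # additivity
assert sp.expand(d(a*b) - (d(a)*b + a*d(b))) == 0  # Leibniz rule
assert d(sp.Integer(5)) == 0                       # vanishes on the ground field
```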

This mimics the situation that occurs in calculus. Indeed, let $C^\infty(\Bbb{R})$ denote the algebra of smooth functions on $\Bbb{R}$ with addition and multiplication defined pointwise. This is an $\Bbb{R}$-algebra under scalar multiplication, and has a natural map $\frac{d}{dx}:C^\infty(\Bbb{R})\to C^\infty(\Bbb{R})$ given by $f(x)\mapsto f'(x)$ as defined in calculus. A few of the basic proofs given in calculus show that $\frac{d}{dx}$ is indeed an $\Bbb{R}$-derivation of $C^\infty(\Bbb{R})$.
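Those calculus proofs can be spot-checked symbolically. A small sketch (my addition) verifying additivity, $\Bbb{R}$-linearity, and the Leibniz rule for $\frac{d}{dx}$ on a couple of smooth functions:

```python
import sympy as sp

x, lam = sp.symbols('x lambda')
f, g = sp.sin(x), sp.exp(x)
D = lambda u: sp.diff(u, x)

assert sp.simplify(D(f + g) - (D(f) + D(g))) == 0    # additive
assert sp.simplify(D(lam*f) - lam*D(f)) == 0         # R-linear (lam is a scalar)
assert sp.simplify(D(f*g) - (D(f)*g + f*D(g))) == 0  # Leibniz rule
```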

In the case of algebraic geometry, one usually only has access to rational functions, i.e. those of the form $\frac{f(x_1,\ldots, x_n)}{g(x_1,\ldots, x_n)}$ for $f,g\in k[x_1,\ldots, x_n]$. Let's just consider the case $n=1$. There is a natural operator $\partial:k[x]\to k[x]$, defined as in calculus by specifying $\partial(x^\ell)=\ell x^{\ell-1}$ and extending linearly. This definition is formal, in that no limits are involved, but it agrees with the definition given in calculus so long as we are only interested in polynomial functions. One can then compute $\partial$ on rational functions $f(x)/g(x)$ by proving that they obey the usual quotient rule with respect to $\partial$.
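A hedged sketch of that extension (the names `formal_d` and `rational_d` are mine): define $\partial$ on $k[x]$ by the power rule alone, extend to $k(x)$ by the quotient rule, and compare with sympy's built-in derivative on an example:

```python
import sympy as sp

x = sp.symbols('x')

def formal_d(p):
    """Formal derivative on k[x]: x^l -> l*x^(l-1), extended linearly."""
    poly = sp.Poly(p, x)
    return sum(l * c * x**(l - 1) for (l,), c in poly.terms() if l > 0)

def rational_d(f, g):
    """Quotient rule: d(f/g) = (f'*g - f*g') / g^2."""
    return (formal_d(f)*g - f*formal_d(g)) / g**2

f, g = x**3 + 1, x**2 - 2
assert sp.simplify(rational_d(f, g) - sp.diff(f/g, x)) == 0
```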

If you are familiar with multivariable calculus then we can use the partial derivatives $\partial_1,\ldots, \partial_n$ to define $n$ (linearly independent) $\Bbb{R}$-derivations of $C^\infty(\Bbb{R}^n)$ by the usual formulas. If you know about manifolds, then you can generalize these ideas quite naturally to that context also.
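A one-loop sympy check (my own) that each partial derivative is indeed a derivation of $C^\infty(\Bbb{R}^2)$:

```python
import sympy as sp

x, y = sp.symbols('x y')
f, g = sp.sin(x*y), x**2 + y

# Each partial derivative obeys additivity and the Leibniz rule.
for v in (x, y):
    D = lambda u, v=v: sp.diff(u, v)
    assert sp.simplify(D(f + g) - (D(f) + D(g))) == 0
    assert sp.simplify(D(f*g) - (D(f)*g + f*D(g))) == 0
```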


I once went to a seminar on non-commutative geometry, where the following was explained:

Consider a commutative ring $R$ with an invertible element $h\in R$ (I gathered that in physical applications $h$ is often Planck's constant). We may take the following quotient of the (non-commutative) polynomial ring in two variables: $$A=R[x,y]/(xy-yx+hy).$$ Let $d\colon A\to A$ be an $R$-linear map satisfying: \begin{eqnarray*}dx&=&y,\\ d(ab) &=& (da)b + a(db), \qquad{\rm for\,\,all\,\,}a,b\in A,\\ d\lambda&=&0,\qquad{\rm for\,\,all\,\,}\lambda\in R. \end{eqnarray*}

One can prove: $$ d(x^n) = \frac{(x+h)^n - x^n}h dx \qquad[1]. $$

Extending linearly, for an arbitrary polynomial $f(x)$ over $R$ we get: $$ df(x) = \frac{f(x+h) - f(x)}hdx. $$ I thought this was a nice result. As far as I understand, the moral was that replacing commutativity (e.g. $[x,dx]=0$) with the identity $[x,dx]=-hdx$, replaces the usual "continuous" differential (where $h\to0$), with this "discrete" one.
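One can check that the coefficient $\frac{f(x+h)-f(x)}{h}$ really is an honest polynomial in $x$ and $h$, no limits needed, and that setting $h=0$ recovers the usual derivative. A small sympy sketch of my own for $f(x)=x^3$:

```python
import sympy as sp

x, h = sp.symbols('x h')
f = x**3
coeff = sp.expand((f.subs(x, x + h) - f) / h)  # the coefficient of dx in df

assert coeff == sp.expand(3*x**2 + 3*h*x + h**2)
assert coeff.subs(h, 0) == 3*x**2              # h -> 0 recovers the usual derivative
```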

Proof of $[1]$:

From the defining relation of $A$ we have $(x+h)y=yx$, and thus $(x+h)^iy=yx^i$ for all natural numbers $i$. The $n=1$ case of $[1]$ is clear. We proceed by induction, assuming the $n=k-1$ case of $[1]$ and deducing: \begin{eqnarray*} d(x^k)&=&d(xx^{k-1})\\&=& yx^{k-1}+x\frac{(x+h)^{k-1}-x^{k-1}}hy\\&=& yx^{k-1}+\frac{(x+h)^{k}-x^{k}}hy-(x+h)^{k-1}y\\&=& \frac{(x+h)^{k}-x^{k}}hy, \end{eqnarray*} where the third line uses $x(x+h)^{k-1}y=(x+h)^ky-h(x+h)^{k-1}y$, and the last line cancels $yx^{k-1}=(x+h)^{k-1}y$.
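For what it's worth, $[1]$ can also be verified by computer. The sketch below (entirely my own construction, not part of the seminar) stores elements of $A$ in the normal form $\sum_i y^i p_i(x)$ using the relation $xy=y(x-h)$, defines $d$ on powers of $x$ recursively via the Leibniz rule, and checks $[1]$ for small $n$:

```python
import sympy as sp

x, h = sp.symbols('x h')

def mul(a, b):
    """Multiply normal forms sum_i y^i p_i(x) in A = R[x,y]/(xy - yx + hy):
    the relation xy = y(x - h) gives p(x) * y^j = y^j * p(x - j*h)."""
    out = {}
    for i, p in a.items():
        for j, q in b.items():
            out[i + j] = sp.expand(out.get(i + j, 0) + p.subs(x, x - j*h) * q)
    return {k: v for k, v in out.items() if v != 0}

def add(a, b):
    out = dict(a)
    for k, v in b.items():
        out[k] = sp.expand(out.get(k, 0) + v)
    return {k: v for k, v in out.items() if v != 0}

X = {0: x}               # the element x of A
dX = {1: sp.Integer(1)}  # dx = y

def d_power(n):
    """d(x^n) via the Leibniz rule: d(x^n) = (dx)*x^(n-1) + x*d(x^(n-1))."""
    if n == 0:
        return {}
    if n == 1:
        return dict(dX)
    return add(mul(dX, {0: x**(n - 1)}), mul(X, d_power(n - 1)))

for n in range(1, 6):
    # Right-hand side of [1], pushed into the same normal form y * p(x).
    rhs = mul({0: sp.expand(((x + h)**n - x**n) / h)}, dX)
    assert d_power(n) == rhs, n
```

Note that the right-hand side must be normalized too: $[1]$ has the coefficient to the left of $dx$, and commuting it past $y$ substitutes $x\mapsto x-h$.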

tkf