
Simple question: does a differential $dx$ have any meaning when composed into a function $f$ with $f(x)\neq x$, such as $f(dx)=\sqrt{dx}$?

Bonnaduck

4 Answers

3

Sort of. There is a dual concept to differentials, that of a "tangent vector", which is not unreasonable to think of as a kind of infinitesimal.

While $\mathrm{d}x$ is supposed to denote a differential, many unfortunately use the notation when they wish to speak of an infinitesimal. :(

Anyway, if $f$ is differentiable at a point $a$ and $\epsilon$ is an infinitesimal, then we have

$$ f(a+\epsilon) = f(a) + f'(a) \epsilon $$

Note this is a literal equality and not merely an approximation, as this variety of infinitesimal satisfies $\epsilon^2 = 0$.
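
This behavior can be modeled concretely with dual numbers: pairs $a+b\epsilon$ multiplied with the rule $\epsilon^2=0$. A minimal Python sketch (the `Dual` class and its names are illustrative, not from any particular library):

```python
class Dual:
    """Dual number a + b*eps, where eps is nilpotent: eps**2 == 0."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a1 + b1*eps)(a2 + b2*eps) = a1*a2 + (a1*b2 + a2*b1)*eps;
        # the b1*b2*eps**2 term vanishes because eps**2 == 0
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

def f(x):
    # any polynomial is evaluated exactly on dual numbers
    return 3 * x * x + 2 * x + 1

eps = Dual(0.0, 1.0)        # the infinitesimal; eps * eps is exactly zero
val = f(Dual(2.0) + eps)    # f(a + eps) = f(a) + f'(a) * eps
print(val.a, val.b)         # 17.0 14.0
```

For $f(x)=3x^2+2x+1$ the coefficient of $\epsilon$ comes out as exactly $f'(2)=14$: a literal equality, just as the answer states.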

  • ...In smooth infinitesimal analysis. In Robinson-type nonstandard analysis this is not how it works. – Ian Jul 16 '14 at 16:34
0

When $f(x)$ is the power function $x^t$ for a certain $t\in\mathbb{C}$, the expression $f(dx)$ makes perfect sense and is used quite often. Here $x$ is either 1-dimensional (=scalar; moreover, we should know which $dx$'s are “positive”, and which aren't), or, in the multi-dimensional case, $dx$ means the volume form.

The key point is that in these cases, dx stands for the linearization of something else. In particular, dx is an element of a 1-dimensional vector space with orientation — and for elements of such vector spaces, their powers make perfect sense as linear objects.

These are called t-forms, or t-volume forms. (Google for "half-form" "volume form" to see the most frequent examples.)

Remark: A similar thing can be done in the complex case, but then one needs to explicitly carry the ambiguity of the sign of the square root of the derivative of the coordinate change. (As in the metaplectic group.)


For example, when one writes $F(x)\,dx$, one means something which behaves “suitably” under changes of variable: when $x=x(y)$, one can rewrite this as $\frac{Dx}{Dy}\,F(x(y))\,dy$ (with $\frac{Dx}{Dy}$ being the determinant of the differential $\partial x/\partial y$). Likewise, when one writes $F(x)(dx)^t$, the determinant is replaced by the $t$-th power of the determinant.

(Here either the determinant should be assumed positive, or one should “carry the burden of ambiguity of powers of arbitrary numbers”.)
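
This transformation rule can be sanity-checked symbolically. A sketch assuming sympy, with the change of variable $x=y^2$ chosen purely for illustration:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)
sub = y**2                  # illustrative change of variable x = y**2, mapping (0, 2) to (0, 4)
jac = sp.diff(sub, y)       # dx/dy = 2*y, positive on (0, 2)

# t = 1: an ordinary 1-form F(x) dx pulls back to F(x(y)) * (dx/dy) dy
F = x**2 + 1
I_x = sp.integrate(F, (x, 0, 4))
I_y = sp.integrate(F.subs(x, sub) * jac, (y, 0, 2))
print(I_x, I_y)             # 76/3 76/3

# t = 1/2: two half-forms F(x)(dx)^(1/2) and G(x)(dx)^(1/2) pair to the 1-form F*G dx;
# each factor transforms with (dx/dy)^(1/2), so the pairing is coordinate-independent
G = x
half = sp.Rational(1, 2)
P_x = sp.integrate(F * G, (x, 0, 4))
P_y = sp.integrate((F.subs(x, sub) * jac**half) * (G.subs(x, sub) * jac**half), (y, 0, 2))
print(P_x, P_y)             # 72 72
```

The $t=\tfrac12$ check is exactly the reason half-forms give a coordinate-free $L_2$-pairing, as discussed in the examples below.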


Examples where this is useful:

  • If you want to consider the $L_2$-space on a manifold $X$, it is natural to require that elements of this space be half-forms: to have the form $F(x)(dx)^{1/2}$. Then $\int FG$ makes perfect sense even when you change variables.

  • When one considers the Sturm–Liouville operator $-\partial^2+V(x)$ on a 1-dimensional manifold, it may be considered as a self-adjoint operator acting from the space of $-\frac12$-forms to the (dual) space of $\frac32$-forms; but this needs the rewriting as $-\partial^2+V(x)(dx)^2$. This does not survive all coordinate changes: the degree-0 term $V(x)(dx)^2$ can change to $V(x)(dx)^2 + S(x)(dx)^2$. (What we achieved is that the degree-2 and degree-1 terms do not change under coordinate transforms.)

    However, one can recognize the formula for S(x) as the Schwarzian derivative of the coordinate change; this simultaneously clarifies

    • the fact that Schwarzian derivatives should actually be written as $S(x)(dx)^2$,
    • the geometric meaning of Schwarzian derivatives,
    • the fact that the operator above is preserved by the transformations for which the Schwarzian derivative vanishes (=projective transformations).
  • The expression $S(x)(dx)^2$ above is an example of a quadratic differential. Such tensors are dual to vector fields; moreover, expressions $c\partial^2+V(x)(dx)^2$ with a scalar $c$ can be recognized as elements of the dual vector space to the Virasoro algebra (which is a 1-dimensional central extension of the Lie algebra of vector fields), so they “govern” the representation theory of this Lie algebra.

  • Another topic governed by quadratic differentials is the Teichmüller space.

  • Modular forms of weight $2k$ should be written as $F(z)(dz)^k$. For example, $\eta(z)$ has weight $\frac12$, so in some contexts it is easier to work with $\eta(z)(dz)^{1/4}$ instead.
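
The claim above that the Schwarzian derivative vanishes exactly on projective (Möbius) transformations can be checked symbolically. A sketch assuming sympy and the standard formula $S(f)=\frac{f'''}{f'}-\frac32\left(\frac{f''}{f'}\right)^2$ (the symbols $a,b,c,d$ stand for arbitrary constants):

```python
import sympy as sp

y = sp.symbols('y')

def schwarzian(f, v):
    """Schwarzian derivative S(f) = f'''/f' - (3/2)*(f''/f')**2."""
    f1, f2, f3 = (sp.diff(f, v, k) for k in (1, 2, 3))
    return f3 / f1 - sp.Rational(3, 2) * (f2 / f1) ** 2

# a Mobius (projective) change of coordinates: Schwarzian vanishes identically
a, b, c, d = sp.symbols('a b c d')
mobius = (a * y + b) / (c * y + d)
print(sp.simplify(schwarzian(mobius, y)))     # 0

# a generic change of coordinates has a nonzero Schwarzian
print(schwarzian(sp.exp(y), y))               # -1/2
```

This matches the statement that the rewritten Sturm–Liouville operator is preserved precisely by the coordinate changes whose Schwarzian derivative vanishes.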

0

$$ f'(0)=\frac{f(0+dx) -f(0)}{dx}, \text{ so if }f(0)=0 \text{ then } f(dx)=f'(0)\,dx. $$ However, I'm not sure I've ever seen this usage.

-2

Since $dx=\Delta x$, we may view $f(dx)$ as $f(\Delta x)$.

    The notion that $dx=\Delta x$, if it makes any sense, holds only in very limited contexts. Yes, lots of calculus textbooks say that. Maybe that's evidence against it. – Michael Hardy Jul 16 '14 at 06:03
  • . . . or, rather, let me put it this way: I hope I have never condoned the practice of considering $dx$ to be the same thing as $\Delta x$. Think of what that implies about $\displaystyle\int_a^b f(x)\,dx$. It would mean that $f(x)\,\Delta x$ is within the scope of the integral sign. That is absurd. – Michael Hardy Jul 16 '14 at 16:34
  • @MichaelHardy Hola_Mundo's question AND answer here seems to make sense to me and (s)he just defines $dx = \Delta x$. Why does this not work? – got it--thanks Oct 17 '14 at 23:35
  • Because $\Delta x$ is supposed to be a finite change in $x$ and $dx$ is supposed to be an infinitesimal. – Michael Hardy Oct 18 '14 at 03:15
  • It does not do what we want when we say $\lim\limits_{\Delta x\to 0}\dfrac{\Delta y}{\Delta x} = \dfrac{dy}{dx}$, nor when we say $\displaystyle\int_0^1 f(x)\,dx$, nor when, as a means of proving that $\sin'=\cos$, we say that the infinitely small arc of a circle of length $dx$ moves the point $(\cos x,\sin x)$ to the point $(\cos(x+dx),\sin(x+dx))$ and consider that arc to be a straight line at right angles to the radius from $(0,0)$ to $(\cos x,\sin x)$. – Michael Hardy Oct 18 '14 at 03:29
  • $\lim\limits_{\Delta x\to 0}\dfrac{\Delta y}{\Delta x}$ would converge exactly to $\dfrac{dy}{dx}$ by definition. What else do we want it to do? In $\displaystyle\int_0^1 f(x)\,dx$, it's defined the same way as a Riemann integral. And I would never try to prove $\sin'(x) = \cos(x)$ by evaluating differentials or arclength by looking at $(\cos(x+dx),\ \sin(x+dx))$. At least not in standard analysis. Are you considering this from the viewpoint of non-standard analysis? – got it--thanks Oct 18 '14 at 03:36