
I've been working for a while on a proof that integration and antidifferentiation are equivalent except for a constant difference; below is an outline of that proof so far. There are a couple of parts that I feel are on some shaky ground and not so rigorous, such as whether I actually proved my claim, using tangents as approximators for continuous functions, arguing about the meaning of $c$ in terms of constants of integration, and finally how to argue for changing the bounds of the sum in the second-to-last paragraph. Sorry if the spacing changes make this hard to read; I can remove them if it's too bad.

Given a continuous real function $f$, we can define another continuous real function $g$ that goes through the point $(0,c)$ and satisfies $g'(d)=f(d)$ for all $d\in\Bbb{D}$, where $\Bbb{D}$ is the domain of $f$.

From here on out I'm going to take a little bit of a strange path, but in the end the purpose should be clear. Given our definition of $g$, the tangent line at any point $(a,g(a))$ along $g$ will have the equation $y=f(a)(x-a)+g(a)$, but we are limited in using this formula to find tangents because the value of $g(a)$ is unknown, except in one special case: when $a=0$.

When $a=0$, we know $g(a)$ is equal to $c$ by the definition of $g$. That gives us the line tangent to $g$ at $(0,g(0))$ as $y=f(0)x+c$. Since both $g$ and $f$ are continuous, this tangent is a reasonable approximation of $g$ around $0$ for some small distance $h$. That allows us to approximate $g(h)$ as $f(0)h+c$ using the equation of our tangent line. Now, with this approximation for the value of $g(h)$, we can approximate the tangent line through $(h,g(h))$ as $y=f(h)(x-h)+c+f(0)h$. If we repeat this process, then in general the tangent line for $g$ at some point $a$ can be approximated as $\displaystyle y=f(a)(x-a)+c+\sum_{n=0}^{a/h-1} f(nh)h$.

In general, the tangent at $(a,g(a))$ can be used as an approximation of $g(a)$ if we plug in $a$ for $x$ in the equation of the tangent line. Doing some quick algebra gives us $\displaystyle c+\sum_{n=0}^{a/h-1} f(nh)h$. Now you may begin to recognize this sum, but there are a couple more steps before the proof is finished.
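
To make this concrete, here is a minimal numerical sketch of the construction so far, assuming the hypothetical choices $f=\cos$ and $c=0.5$ (so that one valid $g$ is $g(x)=\sin(x)+0.5$); these choices are mine, purely for illustration:

```python
import math

def approx_g(f, c, a, h):
    """Approximate g(a) as c + sum_{n=0}^{a/h - 1} f(nh)h, i.e. by sliding
    along the tangent line of slope f(nh) across each step of width h."""
    steps = round(a / h)  # number of tangent-line steps, a/h
    return c + sum(f(n * h) * h for n in range(steps))

f, c, a = math.cos, 0.5, 1.0
for h in (0.1, 0.01, 0.001):
    print(h, approx_g(f, c, a, h), math.sin(a) + c)
# the approximation approaches sin(1) + 0.5 as h shrinks
```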

The part of this sum that restricts how accurately it approximates $g(a)$ is the size of $h$, so in order to get the best approximation we should consider the limit as $h \to 0$, so that our new expression is $\displaystyle c+ \lim_{h \to 0} \sum_{n=0}^{a/h-1} f(nh)h$. We can actually simplify this sum by changing the upper bound from $\frac{a}{h}-1$ to $\frac{a}{h}$, since the only term this adjoins is the $n=\frac{a}{h}$ term, namely $f(a)h$. Because the limit of a product can be rewritten as the product of the limits, $\displaystyle\lim_{h \to 0}f(a)h = f(a)\times\lim_{h \to 0}h = f(a) \times 0 = 0$, so adjoining this term causes no change to the value of the sum in the limiting case. That gives us our new sum-limit as $\displaystyle c+\lim_{h \to 0} \sum_{n=0}^{a/h} f(nh)h$.
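
As a quick sanity check on that step, here is a small computation (with the same hypothetical $f=\cos$, $a=1$) showing that the two sums differ by exactly $f(a)h$, which vanishes with $h$:

```python
import math

f, a = math.cos, 1.0
for h in (0.1, 0.01, 0.001):
    N = round(a / h)  # number of steps, a/h
    short_sum = sum(f(n * h) * h for n in range(N))      # upper bound a/h - 1
    long_sum  = sum(f(n * h) * h for n in range(N + 1))  # upper bound a/h
    print(h, long_sum - short_sum, f(a) * h)  # both columns shrink like h
```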

The last bit to consider is that as $h$ becomes smaller, more and more values of $f$ are being added over the interval $[0,a]$. Substituting $ah$ for $h$ (which is valid in the limit, since $ah \to 0$ as $h \to 0$; the number of terms becomes $\frac{1}{h}$ and the step width becomes $ah$), we can rewrite the sum as $\displaystyle\lim_{h \to 0} \sum_{n=0}^{1/h} f(anh)\,ah + c$, since the sum covers the same interval in the limit.

If you recall the definition of the integral of any function $f$ from $0$ to $x$, specifically $\displaystyle\int_0^x f(t)\,dt = \lim_{h \to 0} \sum_{n=0}^{1/h} f(xnh)\,xh$, you'll notice that the two expressions are equal except for the added constant $c$, which is easily recognized as the constant of integration that appears whenever one is doing any antidifferentiation.
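
Here is a sketch of that final comparison, again under the hypothetical choices $f=\cos$, $c=2$, $a=1$, for which the antiderivative value should be $\sin(1)+2$:

```python
import math

f, c, a = math.cos, 2.0, 1.0
for h in (0.1, 0.01, 0.001):
    N = round(1 / h)  # the rescaled sum has 1/h terms, each of width a*h
    s = c + sum(f(a * n * h) * (a * h) for n in range(N + 1))
    print(h, s, math.sin(a) + c)  # the sum approaches sin(1) + 2 = g(1)
```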

  • Given a continuous real function $f$ we can define another continuous real function $g$ which goes through the point $(0,c)$ and $\forall d\in\mathbb D\quad g'(d)=f(d)$ where $\mathbb D$ is the domain of $f$. In general this is not true; for example, take the Gaussian function $e^{-x^2/2}$, which does not have an antiderivative. If anyone has an idea how to do quotes in comments, please let me know! – Pink Panther Jul 29 '18 at 14:27
  • How would I avoid pitfalls like that? What condition would be relevant/necessary, or would I just say "given a function $f$ that has an antiderivative"? Edit: Wait, isn't it just some version of the error function? – Aaron Quitta Jul 29 '18 at 14:35
  • Well, it seems like I cannot edit my post anymore. I was mistaken to say that it does not exist; I meant to say that there does not exist an antiderivative composed of elementary functions. An antiderivative may still be given by $g(x)=\int_a^x f(t)\,dt$ if $f$ is a function $f:(a,b)\rightarrow\mathbb R$, so this is surely still continuous. So I am sorry; I guess I made a mistake there. – Pink Panther Jul 29 '18 at 14:43
  • It's totally okay; the error function is kind of a cop-out anyway. – Aaron Quitta Jul 29 '18 at 15:04
  • @PinkPanther The function $e^{-x^2/2}$ certainly does have an antiderivative! There's no closed-form formula for the antiderivative. – David C. Ullrich Jul 29 '18 at 15:51

1 Answer


I believe that when we do the first linear approximation, we have

$$g(h)=f(0)h+c+\epsilon_1(f,h)h$$

where $\epsilon_1(f,h)$ is the error term, for which we know $\lim_{h \to 0} \epsilon_1(f,h) = 0$.

Similarly, as we do the second approximation, we have

$$g(2h)=g(h)+f(h)h+\epsilon_2(f,h)h$$ where $\lim_{h \to 0}\epsilon_2(f,h)=0$, and so on.

I think you still have to show that $$\lim_{h \to 0}\sum_{i=1}^{a/h }h\epsilon_i(f,h)=0.$$

Indeed, since there are $\frac{a}{h}$ terms, $$\lim_{h \to 0}\sum_{i=1}^{a/h }|h\epsilon_i(f,h)| \le \lim_{h \to 0} \frac{a}{h}\cdot h \sup_i |\epsilon_i(f,h)|=\lim_{h \to 0} a\sup_i |\epsilon_i(f,h)|=0,$$ where the last step holds because, by the mean value theorem, $\epsilon_i(f,h)=f(\xi_i)-f((i-1)h)$ for some $\xi_i\in((i-1)h,ih)$, and $f$ is uniformly continuous on the compact interval $[0,a]$, so $\sup_i|\epsilon_i(f,h)|\to 0$.
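
For what it's worth, here is a numerical illustration of that bound, assuming the hypothetical choices $f=\cos$ and $g=\sin$ (so $g'=f$ and $g(0)=0$): the step errors $\epsilon_i$ are computed exactly, and $a\sup_i|\epsilon_i(f,h)|$ is seen to shrink with $h$:

```python
import math

f, g, a = math.cos, math.sin, 1.0
for h in (0.1, 0.01, 0.001):
    N = round(a / h)
    # eps_i is defined by g(ih) = g((i-1)h) + f((i-1)h)h + eps_i * h
    eps = [(g(i * h) - g((i - 1) * h)) / h - f((i - 1) * h)
           for i in range(1, N + 1)]
    print(h, a * max(abs(e) for e in eps))  # bound on sum_i |h*eps_i|; -> 0
```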

Siong Thye Goh