
Is there any way to prove that $$\large f(x)=\lim_{h\to0} {\frac{x^h-1}h}=\int_1^x\frac1t\,dt$$ without already knowing that both sides are the logarithm?

It is clear that $f(1)=0$, and we would be done if we could prove that $f'(x)=\frac1x$. But that requires interchanging the limit and the differentiation operator, which is not always valid, and I could not find any justification for doing so here. So can we do so in this case, and if so, what is the justification? I know this can be proved in a much longer way, but if we can justify the interchange, the proof will be much shorter.

Vivaan Daga

3 Answers


Paraphrasing this comment

Note that $$ \int_1^xt^{h-1}\,\mathrm{d}t=\frac{x^h-1}h $$ For $x\gt1$, the integrand converges uniformly to $\frac1t$ on $[1,x]$. Therefore, the integral of the limit is the limit of the integral.
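As a quick numerical sanity check of the identity $\int_1^xt^{h-1}\,\mathrm{d}t=\frac{x^h-1}h$ and of the limiting behavior (a Python sketch, not part of the proof; `math.log` appears only as an external reference value, and the quadrature size and tolerances are illustrative choices):

```python
import math

def midpoint_integral(f, a, b, n=100_000):
    # Midpoint-rule approximation of the integral of f over [a, b].
    dt = (b - a) / n
    return sum(f(a + (k + 0.5) * dt) for k in range(n)) * dt

x = 3.0
for h in (0.5, 0.1, 0.01):
    lhs = midpoint_integral(lambda t: t**(h - 1), 1.0, x)
    rhs = (x**h - 1) / h          # closed form of the integral
    assert abs(lhs - rhs) < 1e-6
# As h -> 0 the integrand tends (uniformly on [1, x]) to 1/t,
# so both sides approach the integral of 1/t, i.e. log(x).
assert abs((x**1e-8 - 1) / 1e-8 - math.log(x)) < 1e-6
```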


Uniform Convergence of $\boldsymbol{\lim\limits_{h\to0}t^{h-1}=t^{-1}}$

Assume that $t\ge1$ and $|h|\le1$. $$ \begin{align} 0 &\le\frac{t^{h}-1}{ht}\tag1\\ &=\frac{(1+(t-1))^h-1}{ht}\tag2\\ &\le\frac{(1+h(t-1))-1}{ht}\tag3\\ &=\frac{t-1}t\tag4\\[6pt] &\le1\tag5 \end{align} $$ Explanation:
$(1)$: $t\ge1$; if $h\ge0$, $t^h\ge1$; if $h\le0$, $t^h\le1$
$(2)$: algebra
$(3)$: Bernoulli's Inequality; the sense of the inequality
$\phantom{\text{(3):}}$ in the numerator is reversed between $h\in[0,1]$
$\phantom{\text{(3):}}$ and $h\le0$, but the $h$ in the denominator reverses it back
$(4)$: algebra
$(5)$: $t\ge1$

Thus, we have shown that for $t\ge1$ and $|h|\le1$, $$ \left|\,t^{h-1}-t^{-1}\,\right|\le|h|\tag6 $$ which gives uniform convergence as $h\to0$.
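Inequality $(6)$ can be spot-checked numerically (a hedged Python sketch; the sampling range and tolerance are illustrative choices):

```python
import random

random.seed(0)
# Spot-check (6): |t^(h-1) - t^(-1)| <= |h| for t >= 1 and |h| <= 1.
for _ in range(10_000):
    t = random.uniform(1.0, 100.0)
    h = random.uniform(-1.0, 1.0)
    assert abs(t**(h - 1) - 1/t) <= abs(h) + 1e-12
```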


Compact Subsets of $\boldsymbol{(0,1]}$

Given $t\in[\epsilon,1]$ for $\epsilon\gt0$, $$ \begin{align} |h| &\ge\left|\,t^{1-h}-t\,\right|\tag7\\ &=\left|\,t^{h-1}-t^{-1}\,\right|\ t^{2-h}\tag8\\ &\ge\left|\,t^{h-1}-t^{-1}\,\right|\epsilon^3\tag9 \end{align} $$ Explanation:
$(7)$: apply $(6)$ to $1/t$
$(8)$: pull $t^{2-h}$ out of the difference
$(9)$: since $t\ge\epsilon$ and $|h|\le1$, $t^{2-h}\ge\epsilon^3$

Thus, we have $$ \left|\,t^{h-1}-t^{-1}\,\right|\le|h|\epsilon^{-3}\tag{10} $$ which gives uniform convergence as $h\to0$.
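Bound $(10)$ can likewise be spot-checked (a Python sketch; $\epsilon=0.1$ is an arbitrary illustrative value of the compactness parameter):

```python
import random

random.seed(1)
# Spot-check (10): for t in [eps, 1] and |h| <= 1,
# |t^(h-1) - t^(-1)| <= |h| * eps**(-3).
eps = 0.1          # illustrative choice
for _ in range(10_000):
    t = random.uniform(eps, 1.0)
    h = random.uniform(-1.0, 1.0)
    assert abs(t**(h - 1) - 1/t) <= abs(h) / eps**3 + 1e-9
```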

robjohn
  • We can also note that $t^{h-1}$ is a continuous function of $t, h$ in $[1,x]\times[0,\delta]$ so that the limit as $h\to 0$ and integral can be interchanged. – Paramanand Singh Jun 26 '20 at 14:19
  • @VivaanDaga: continuity of integrand in both variables is sufficient to interchange limit and integral. That's the usual way of switching limit with integral. – Paramanand Singh Jun 26 '20 at 14:35
  • @VivaanDaga: Uniform convergence does not require a sequence, just as limits do not require a sequence. Here is the definition of uniform convergence in the case of $t^{h-1}\to\frac1t$: $$\forall\epsilon\gt0,\exists\delta\gt0:(|h|\le\delta)\land(t\in[1,x])\implies\left|\,t^{h-1}-\frac1t\,\right|\le\epsilon$$ – robjohn Jun 26 '20 at 15:36
  • @VivaanDaga: that is the definition of uniform convergence. Uniform convergence implies that the integral and limit can be interchanged. – robjohn Jun 26 '20 at 16:14
  • One needs to be careful. $f_h(t)=\frac{h\,t^2}{h^4+t^4}$ is continuous for $(t,h)\in(0,1)\times(-1,1)$ and $\lim\limits_{h\to0}f_h(t)=0$ for all $t\in(0,1)$. However, $$\lim_{h\to0}\int_0^1f_h(t)\,\mathrm{d}t=\frac\pi{2\sqrt2}$$ One needs uniform convergence. In this example, the convergence in $h$ is not uniform in $t$. – robjohn Jun 26 '20 at 16:53
  • Just wanted to point out that continuity on a compact domain is sufficient here. Your example on $(0,1)\times(-1,1)$ involves a non-compact domain. By the way +1 was already there. – Paramanand Singh Jun 26 '20 at 17:06
  • @ParamanandSingh: Yes. I wanted to point out that we need continuity and a compact domain in two dimensions, not just continuity in each variable. I think that the two dimensional continuity is harder to show than the one dimensional uniform continuity (in the comment after yours, the OP was also looking at the continuity in $t$ and $h$, not the two dimensional continuity; note that $f_h(t)$ is continuous in each variable, even on the compact rectangle). Since your previous comment did not mention compactness, that needed to be brought out, also. – robjohn Jun 26 '20 at 17:16
  • In this case the 2D continuity is not difficult (but I make use of the logarithm). Since $t^{h-1}=\exp((h-1)\log t)$ we can see that there is separation of variables as $(h-1)\log t$, and finally $\exp$ is the most well behaved function. But yes, 2D calculus is a pain. – Paramanand Singh Jun 27 '20 at 17:20
  • As a general rule, if we avoid any singularities (in the functions and their derivatives), the convergence is uniform. $t^{h-1}\to\frac1t$ will be uniformly convergent as long as we avoid $t=0$ and $t=\infty$. In this case $t=\infty$, is not a problem for the uniform convergence, but the interval is not compact, so even uniform convergence is not good enough. – robjohn Jun 26 '20 at 17:33
  • Comments are not for extended discussion; this conversation has been moved to chat. – robjohn Jun 27 '20 at 10:45
  • @robjohn What is the name of this? Uniform convergence is usually stated in terms of sequences; I could not find this version anywhere – Vivaan Daga Jul 27 '20 at 15:27
  • It is uniform convergence. If you read the discussion we had before, the continuous version, which is proven, implies the sequential version. – robjohn Jul 27 '20 at 15:41
  • As I have tried to say twice before, since we have proved the continuous version, simply stick in $h=1/n$ and get your sequential version. You put $\lim\limits_{h\to0}$ in the question, so I was using that. – robjohn Jul 27 '20 at 19:17
  • @robjohn that is not my question; it is: why is the following true: the limit and the integral can be switched if the convergence is uniform? All the proofs of this use the sequence version of uniform convergence for the switch. Does the proof follow from that? Can it be proved from the fact that $\lim_{x\to a} f(x)=f(a)$ iff $\lim_{n\to\infty} f(p_n)=f(a)$ for all sequences such that $p_n\to a$? – Vivaan Daga Jul 28 '20 at 08:46
  • We have proven that $\lim\limits_{x\to0}$ converges uniformly. Just let $x=\frac1n$ and now you have that $\lim\limits_{n\to\infty}$ converges uniformly. Apply the theorem for sequences. – robjohn Jul 28 '20 at 08:52
  • @robjohn I do not think using substitution you can turn a continuous limit into a discrete one; I think the limit will still not be discrete if we use that substitution. – Vivaan Daga Jul 28 '20 at 09:22
  • @robjohn otherwise you must know the limit exists, which can be easily shown. But why won't the theorem I mentioned be enough? We can just prove that we can exchange the limit for every possible sequence, due to uniform convergence, and since they all converge to the same thing the continuous limit must also converge to it. But this may require the axiom of choice – Vivaan Daga Jul 28 '20 at 09:37
  • I have shown that $\lim\limits_{h\to0}t^{h-1}=\frac1t$ uniformly on compact subsets of $(0,\infty)$. That means that given a compact subset $I\subset(0,\infty)$ and $\epsilon\gt0$, we can find a $\delta\gt0$ so that if $|h|\le\delta$, then $\left|t^{h-1}-\frac1t\right|\le\epsilon$ for all $t\in I$. This shows that for any sequence of $h_n\to0$, we have that $\lim\limits_{n\to\infty}t^{h_n-1}=\frac1t$ uniformly on $I$. – robjohn Jul 28 '20 at 09:54
  • @robjohn yes, but to show that we can interchange the limit and the integral the proof only works for a sequence, so to show it for a continuous limit we must use the theorem that I used in the above proof. Is that correct? This is similar to how we can prove limit laws for continuous limits using the laws established for sequences. I do not understand how the inequalities shown in the answer imply uniform convergence. – Vivaan Daga Jul 28 '20 at 10:08
  • Look. I showed that $\left|t^{h-1}-\frac1t\right|\le\frac{|h|}{\min\left(1,x^3\right)}$ for all $t\in[1,x]$, and this shows uniform convergence, whether for a sequence or not. Thus, $$\begin{align}\left|\,\frac{x^h-1}h-\log(x)\,\right|&=\left|\,\int_1^x\left(t^{h-1}-\frac1t\right)\mathrm{d}t\,\right|\\&\le\int_1^x\left|\,t^{h-1}-\frac1t\,\right|\mathrm{d}t\\&\le|h|\,\frac{|x-1|}{\min\left(1,x^3\right)}\end{align}$$ Study this, and you'll see that the uniform convergence admits the exchange of limit and integral. – robjohn Jul 28 '20 at 10:53
  • @robjohn what will the delta be? I do not think epsilon can depend on delta. What is wrong with my proof? – Vivaan Daga Jul 28 '20 at 11:16
  • In a proof of convergence, one chooses an $\epsilon\gt0$ and then finds a $\delta\gt0$. What proof are you asking about? In my last comment, I showed how the uniform bounds I got show that the exchange of limit and integral are okay. – robjohn Jul 28 '20 at 12:10
  • @robjohn yes, we have a delta for each epsilon; what is the delta here? How do the inequalities give you the delta? My proof is given $9$, $7$, $6$, $4$ comments above this comment – Vivaan Daga Jul 28 '20 at 12:25
  • Given $\epsilon\gt0$, $\delta=\frac{\min\left(1,x^3\right)}{|x-1|}\,\epsilon$ suffices to make $\left|\,\frac{x^h-1}h-\log(x)\,\right|\le\epsilon$ whenever $|h|\le\delta$. – robjohn Jul 28 '20 at 12:49
  • Please read my previous comment. – robjohn Jul 29 '20 at 16:01
  • Can it be proven that this definition of uniform convergence implies uniform convergence of all possible sequences and vice-versa ? – Vivaan Daga Jul 30 '20 at 15:26
  • Yes. If $\lim\limits_{a\to0}f_a=g$ uniformly, then $\lim\limits_{n\to\infty}f_{a_n}=g$ uniformly as long as $\lim\limits_{n\to\infty}a_n=0$. I don't think vice-versa applies here. – robjohn Jul 30 '20 at 15:47
  • @robjohn So uniform convergence for all possible sequences such that $\lim a_n=0$ does not imply this uniform convergence? – Vivaan Daga Jul 30 '20 at 16:53
  • It does, but why would one try to prove that direction. Trying to prove uniform convergence for all possible sequences is most likely just window dressing on top of proving uniform convergence for a continuous index. – robjohn Jul 30 '20 at 17:31
  • @robjohn How will you prove that? (backwards) – Vivaan Daga Jul 30 '20 at 18:21
  • If it is false for the continuous case, there will be a sequence that fails to converge uniformly. – robjohn Jul 30 '20 at 18:54
  • @robjohn But that does not imply that uniform convergence of all possible sequences implies uniform convergence for continuous case – Vivaan Daga Jul 31 '20 at 17:34
  • Please read about contrapositives. – robjohn Jul 31 '20 at 18:06
  • @robjohn how will you construct such a sequence ? – Vivaan Daga Jul 31 '20 at 18:48
  • Uniform Convergence (continuous) $$ \forall\epsilon\gt0,\exists\delta\gt0:\forall|t|\le\delta,\|f_t-f\|_\infty\le\epsilon $$ The negation of this is $$ \exists\epsilon\gt0:\forall\delta\gt0,\exists|t|\le\delta:\|f_t-f\|_\infty\gt\epsilon $$ Start with $\delta_1=1$. For each $n$ we are guaranteed a $t_n$ so that $|t_n|\le\delta_n$ and $\|f_{t_n}-f\|_\infty\gt\epsilon$. Then set $\delta_{n+1}=|t_n/2|$, and continue. This gives a sequence of $t_n\to0$ with $\|f_{t_n}-f\|_\infty\gt\epsilon$. – robjohn Jul 31 '20 at 19:45
  • @robjohn to choose the $t_{n}$ are you using some form of Axiom of Choice? – Vivaan Daga Aug 02 '20 at 15:56
  • ... and we're done. – robjohn Aug 02 '20 at 16:36
  • @robjohn in the definition of uniform convergence, why is there no second variable? I think uniform convergence is on a set $X$ such that for each $x\in X$ the above condition is satisfied – Vivaan Daga Aug 03 '20 at 13:28
  • You need to learn how to figure some of this out for yourself: $\|f_t-f\|_\infty=\|f_t(x)-f(x)\|_{L^\infty(X)}$. The former notation is not uncommon, so expect to see it elsewhere. $\|\cdot\|_\infty$ usually means the sup norm or essential sup norm. – robjohn Aug 03 '20 at 16:02
  • @robjohn why can’t you use $\delta_n=1/n$ for $n\in\mathbb{N}$ – Vivaan Daga Aug 04 '20 at 09:42
  • The way this is done is that you choose an $\epsilon\gt0$ and then compute a $\delta\gt0$ so that the estimates obey the proper inequalities. We don't choose the $\delta$ first. – robjohn Aug 06 '20 at 16:40
  • @robjohn but why will it not work? Let $\delta_n=1/n$, $n\in\mathbb{N}$; since $1/n\gt0$, for each delta we are guaranteed a $t_n\lt\delta_n$ such that $\|f_{t_n}-f\|_\infty\gt\epsilon$, and thus the sequence $t_n$ will not converge uniformly. But choosing the $t_n$ still requires some axiom of choice. – Vivaan Daga Aug 09 '20 at 09:40
  • @robjohn is the statement of the interchange decidable? Because this proof needs axiom of choice . If it is decidable then it does not need it. – Vivaan Daga Aug 09 '20 at 09:42
  • For each $n$, this is a choice of one element from one non-empty set, hardly a problem for decidability. You can use $\delta_n=1/n$, but you might get a lot of duplicate terms or terms in reverse order. I specified $\delta_{n+1}=|t_n/2|$ to avoid this by constructing a decreasing sequence. – robjohn Aug 09 '20 at 20:48
  • @robjohn if one proves that there is a function $K$ with $K\to 0$ such that $|f_t(x)-f(x)|\lt K$ for all $x$ in $[a,b]$ and $t$ in the interval around the limit being taken, then one can interchange the limit and integral; but here uniform convergence may not be true, so uniform convergence was not needed, I think – Vivaan Daga Aug 20 '20 at 06:26
  • @robjohn Is the above property along with Riemann integrability enough for the interchange? Because I think it is weaker than uniform convergence. If there was uniform convergence then there would be such a function, $K(t)=\sup_x|f_t(x)-f(x)|$, the sup over all $x$ in the integration region. – Vivaan Daga Aug 20 '20 at 14:05
  • @VivaanDaga: please be more precise: "proving that there is a function K and $K\to 0$ such that ..." does not specify of what $K$ is a function. To apply it to my answer, $K$ would be a function of $t$. If that is the case, then what you have written is the definition of uniform convergence, so I don't see how this negates the applicability of uniform convergence. Besides, I have shown that uniform convergence is sufficient, not necessary. – robjohn Aug 20 '20 at 15:40
  • @robjohn how does that imply uniform convergence ? – Vivaan Daga Aug 20 '20 at 15:58
  • @VivaanDaga: $|f_t(x)-f(x)|\le K(t)$ for all $x$ where $\lim\limits_{t\to0}K(t)=0$ is the definition of uniform convergence. Please read and understand what uniform convergence means. It is not about the fact that the sequence of functions is indexed by the positive integers $n$ tending to $\infty$ or reals $t$ tending to $0$. It is that $|f_n(x)-f(x)|$ or $|f_t(x)-f(x)|$ is uniformly small for all $x$ in whatever domain is applicable for $n$ big enough or $t$ small enough. – robjohn Aug 20 '20 at 18:32
  • @robjohn yes proving such a function exists will imply uniform convergence but uniform convergence will also imply such a function exists by taking supremum . – Vivaan Daga Aug 20 '20 at 19:32
  • @VivaanDaga: Yes. That is why I said that that is the definition of uniform convergence (or equivalent to it). Your question of how that implies uniform convergence indicates you didn't think it did. – robjohn Aug 20 '20 at 19:50

Here is a much simpler approach which avoids differentiation altogether.

Let's use different symbols for the two forms $$f(x) =\lim_{h\to 0}\frac{x^h-1}{h},\qquad g(x)=\int_{1}^{x}\frac{dt}{t}\tag{1}$$ The definition of $g$ is easier to handle analytically because the integrand $1/t$ is continuous on $(0,\infty)$ and hence $g$ is well defined on $(0,\infty)$.

It can be proved with some effort that the limit used in the definition of $f$ exists for all $x>0$. Using this fact we substitute $h=1/n$ where $n$ is a positive integer. This gives us $$f(x) =\lim_{n\to \infty} n(x^{1/n}-1)\tag{2}$$ It is now easy to prove that $f(x)=g(x)$ for all $x>0$. It should be obvious that $f(1)=g(1)=0$, and further it is easily proved that $$f(1/x)=-f(x),\quad g(1/x)=-g(x)\tag{3}$$ so it is sufficient to show that $f(x)=g(x)$ for $x>1$.

Let us choose a partition $$P=\{x_0,x_1,x_2,\dots,x_n\} $$ of $[1,x]$ such that $x_k=q^{k} $ where $q^n=x$ and we choose tag points $t_k=x_{k-1}$. The corresponding Riemann sum for the integral defining $g(x) $ is $$\sum_{k=1}^{n}\frac{x_{k}-x_{k-1}}{x_{k-1}}=\sum_{k=1}^{n} \frac{q^k-q^{k-1}}{q^{k-1}}=n(q-1)=n(x^{1/n}-1)$$ and thus the integral equals the limit of this Riemann sum and we get $$g(x) =\lim_{n\to \infty} n(x^{1/n}-1)=f(x)$$
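The telescoping of the geometric-partition Riemann sum and its convergence can be illustrated numerically (a Python sketch; `math.log` is used only as an external cross-check, since the point of the answer is to avoid assuming the logarithm):

```python
import math

def riemann_sum_geometric(x, n):
    # Riemann sum for the integral of 1/t over [1, x] with the
    # geometric partition x_k = q^k, q = x^(1/n), tags t_k = x_{k-1}.
    q = x ** (1.0 / n)
    return sum((q**k - q**(k - 1)) / q**(k - 1) for k in range(1, n + 1))

x = 5.0
for n in (10, 100, 1000):
    s = riemann_sum_geometric(x, n)
    # Each term equals q - 1, so the sum telescopes to n*(x^(1/n) - 1).
    assert abs(s - n * (x**(1.0 / n) - 1)) < 1e-9
# The limit agrees with log(x) (numeric cross-check only).
assert abs(riemann_sum_geometric(x, 100_000) - math.log(x)) < 1e-4
```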


Here is a proof based on discussion in comments that the limit in question exists for all $x>0$. For this we let $x>0$ be fixed and consider $F(h)=x^h$. If $x>1$ then $F(h) >1$ if $h>0$ and $F(h) <1$ if $h<0$. These inequalities get reversed if $0<x<1$. Since $$F(t+s) =F(t) F(s) $$ it follows that for $x>1$ the function $F$ is strictly increasing and for $0<x<1$ it is strictly decreasing (for $x=1$ it remains a constant).

Thus $F(h) $ is a monotone function of $h$. It follows via a standard theorem on monotone functions that $F$ is continuous everywhere except at most a countable number of points. Thus $F$ is continuous at some point $a$. And we have $$F(h) =F(h-b+a+b-a) =F(h-b+a) F(b-a) $$ If $h\to b$ then $h-b+a\to a$ and thus by continuity at $a$ we have $F(h-b+a) \to F(a) $ and so $F(h) \to F(a) F(b-a) =F(b) $ as $h\to b$. This proves that $F$ is continuous at any point $b$ and so it is continuous everywhere.

It follows that $$G(t) =\int_{0}^{t}F(h)\,dh$$ exists and $G'(h) =F(h) $ for all $h$. Integrating the functional equation $$F(t+h) =F(t) F(h) $$ with respect to $h$ we get $$G(t+h) - G(t) =F(t) G(h) $$ Note that $G(0)=0$ and if $G$ is a constant then $F=G'$ is also a constant. Otherwise there is an $h$ such that $G(h) \neq 0$. And then we have $$F(t) =\frac{G(t+h) - G(t)} {G(h)} $$ and the right hand side is clearly a differentiable function of $t$ so that $F$ is differentiable everywhere with derivative $$F'(t) =\frac{F(t+h) - F(t)} {G(h)} $$ In particular $F'(0)$ exists and this means that the limit in question exists.
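The integrated functional equation $G(t+h)-G(t)=F(t)\,G(h)$ can be checked numerically for a sample $x$ (a Python sketch; the midpoint quadrature, sample points, and tolerances are illustrative choices):

```python
x = 2.0
F = lambda s: x ** s              # satisfies F(t + s) = F(t) * F(s)

def G(t, n=20_000):
    # Midpoint-rule approximation of G(t) = integral of F over [0, t].
    dt = t / n
    return sum(F((k + 0.5) * dt) for k in range(n)) * dt

h = 0.5
for t in (0.0, 0.3, 1.0):
    # Integrating the functional equation in h gives
    # G(t + h) - G(t) = F(t) * G(h).
    assert abs(G(t + h) - G(t) - F(t) * G(h)) < 1e-6
    # Hence F(t) = (G(t+h) - G(t)) / G(h), manifestly differentiable in t.
    assert abs((G(t + h) - G(t)) / G(h) - F(t)) < 1e-5
```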

  • @VivaanDaga: You have yourself presented a limit definition for $f$ as limit of $(x^h-1)/h$ as $h\to 0$. It means you should know that the limit exists. Anyway this is not a trivial problem. In order to show that the limit exists you first need to define $x^h$ for all real $h$ and then show that $(x^h-1)/h$ is increasing as $h$ increases for $x>1$. Since the expression is also bounded below it follows the limit as $h\to 0^{+}$ exists. The case $h\to 0^{-}$ is handled by putting $h=-k$. For $x<1$ prove that $f(x) =-f(1/x)$ so that limit exists in all cases. – Paramanand Singh Jun 27 '20 at 12:17
  • @VivaanDaga: as $h$ decreases from positive values to $0$ (in $h\to 0^{+}$) the function decreases and remains bounded below (greater than $0$), it follows that the limit exists. This is similar (proof being similar) to the fact that an increasing sequence which is bounded above is convergent. – Paramanand Singh Jun 27 '20 at 13:31
  • @VivaanDaga: for negative values of $h$ it is much simpler to put $k=-h$ so that $k\to 0^{+}$ and proceed further. Otherwise one needs to figure out how the function behaves as $h$ increases from negative values to $0$. – Paramanand Singh Jun 27 '20 at 14:03
  • @VivaanDaga: Well what I say is that we don't need to know what happens when $h<0$ directly, but rather by putting $k=-h$ we can infer this information from the already known information for positive values of $h$. Why work more when one can achieve the goal with less effort (by building on what has been achieved earlier)? – Paramanand Singh Jun 27 '20 at 15:07
  • @VivaanDaga: you can prove this directly via inequalities. See this answer (see equations $(4),(5)$). Taking derivatives won't be a good idea, as the derivative involves the logarithm and this would make the argument circular. – Paramanand Singh Jul 07 '20 at 07:30
  • @VivaanDaga: I don't really get the point of your last comment. What are you trying to do and using what means? Clarify. If you wish to find derivative $(d/dh) x^h$ you first need to deal with the limit in question. – Paramanand Singh Jul 07 '20 at 08:07
  • @VivaanDaga: also a function being monotonic does not necessarily imply that it is differentiable. – Paramanand Singh Jul 07 '20 at 08:17
  • @VivaanDaga: if you assume $f$ to be continuous at some point and the equation $f(x+y) =f(x) f(y) $ then $f$ is continuous everywhere and thus integral/anti-derivative of $f$, say $F$, exists. Integrating the functional equation with respect to $y$ we get $F(x+y) - F(x) =f(x) [F(y)-F(0)]$. If $F$ is not a constant then we have some specific $y$ for which $F(y) \neq F(0)$ and then $$f(x) =\frac{F(x+y) - F(x)} {F(y) - F(0)}$$ and the RHS is differentiable as $F$ is differentiable so that $f$ is differentiable. No need to know that $f$ is monotone. – Paramanand Singh Jul 07 '20 at 11:01
  • @VivaanDaga: assuming monotone $f$ one can prove easily that $f$ is discontinuous at a countable number of points and hence continuous somewhere and then we apply the technique of last comment. You can see that the functional equation constrains the function in a rather strong manner and the function does not have much chance to behave badly. In other words the functional equation makes it fall in line. – Paramanand Singh Jul 07 '20 at 11:04
  • @VivaanDaga: I updated my answer to include the discussion in comments. – Paramanand Singh Jul 07 '20 at 23:51
  • @VivaanDaga: you should read the full details and not just individual sentences. Before the functional equation I have mentioned that $F(h)>1$ for $h>0$ and the reverse inequality if $0<x<1$. This fact combined with the functional equation makes the function $F$ monotone. Check for yourself. – Paramanand Singh Jul 08 '20 at 04:29
  • @VivaanDaga: write $x=r(\cos t+i\sin t) $ and try evaluating $n(x^{1/n}-1)$. You should get the limit as $\log r+it$. – Paramanand Singh Jul 20 '20 at 08:04
  • @VivaanDaga: It would require the fact that the limit exists for positive $x$ (which is already discussed here). In short the theory for complex $x$ can be building using the already built theory for real positive $x$. – Paramanand Singh Jul 20 '20 at 09:59
  • @VivaanDaga: when you write $x=r(\cos t+i\sin t) $ then $r>0$ and thus the logarithm of a complex number can be handled by using existing knowledge of logarithm of a positive real number. You should just put $x$ in this form and evaluate limit of $n(x^{1/n}-1)$. A little bit of algebra should give you $\log r+it$. – Paramanand Singh Jul 20 '20 at 11:12
  • @VivaanDaga: it appears that you keep asking questions but never read the replies given in comments. Read my last 3-4 comments and there is no mention of $a^x$ or $e^x$ in them. Instead deal with expression $n(x^{1/n}-1)$. – Paramanand Singh Jul 20 '20 at 12:14
  • @VivaanDaga : I have no better advice than to ask you to read all my comments again. It appears you haven't understood any of them otherwise you wouldn't ask these questions. – Paramanand Singh Jul 20 '20 at 12:21
  • @VivaanDaga: Perhaps you are aware of DeMoivre's theorem. It's a standard result when you study complex numbers for the first time. Use it and you are done. – Paramanand Singh Jul 20 '20 at 12:34
  • @VivaanDaga: DeMoivre's works for rational exponents and thus $$n(x^{1/n}-1)=n(r^{1/n}(\cos(t/n)+i\sin(t/n)) - 1)=n(r^{1/n}-1)+ir^{1/n}n\sin(t/n) +r^{1/n} n(\cos(t/n) - 1)$$ which tends to $$\log r+i\cdot 1\cdot t+1\cdot 0$$ – Paramanand Singh Jul 20 '20 at 13:58
  • @VivaanDaga: you seem to have some confusion regarding DeMoivre's. It has nothing to do with Euler stuff. It's a plain algebra stuff proved in almost any textbook which introduces complex numbers. Using this theorem one proves that there are $n$ nth roots of a complex number (i hope you are familiar with this). Don't make or assume things to be more complicated than they actually are. – Paramanand Singh Jul 20 '20 at 14:18
  • For complex numbers $x^h$ will be multi-valued; how can it be made into a function? – Vivaan Daga Jul 26 '20 at 15:46
  • I do not understand the meaning of ‘in particular $F'(0)$ exists’; does this not prove that $F'(x)$ exists for all $x$? Or is it that the limit of $\frac{x^h-1}{h}$ as $h$ goes to $0$ will only exist if $f(x)$ is not equal to $0$? – Vivaan Daga Jul 27 '20 at 13:39
  • @VivaanDaga: can you notice that $F'(0)=\lim_{h\to 0}\dfrac{x^h-1}{h}$ ? – Paramanand Singh Jul 27 '20 at 14:25
  • it is the same for F’(1) also – Vivaan Daga Jul 27 '20 at 14:27
  • Because $F(1)$ is not $0$, and the existence of $\lim c\,g$ implies the existence of $\lim g$ if $c$ does not equal $0$; is that correct? – Vivaan Daga Jul 27 '20 at 14:28
  • Since the existence of the limit implies the existence of the derivative, and conversely. – Vivaan Daga Jul 27 '20 at 14:32

Note that $\int_1^x t^{h-1}\,\mathrm{d}t = \frac{t^h}{h}\Big\vert_1^x = \frac{x^h - 1}{h}$, so the difference between this and $\int_1^x \frac{\mathrm{d}t}{t}$ is $\int_1^x \frac{t^h-1}{t}\,\mathrm{d}t$. It suffices to prove that $$\lim_{h\to 0} \int_1^x \frac{t^h - 1}{t} \mathrm{d} t = 0.$$

We split into three cases (assume $h>0$; for $h<0$ the inequalities below reverse sign, and the same bounds hold after taking absolute values):

  1. $x > 1$: Note that, for $1 \le t \le x$, $$0 \le \frac{t^h - 1}{t} \le x^h - 1$$ and $$0 \le \int_1^x \frac{t^h - 1}{t}\mathrm{d} t \le (x^h - 1)(x-1).$$ Note that $\lim_{h\to 0} (x^h - 1)(x-1) = 0$. Thus, by the squeeze theorem, we have $\lim_{h\to 0} \int_1^x \frac{t^h - 1}{t} \mathrm{d} t = 0$.

  2. $0 < x < 1$: We have $\int_1^x \frac{t^h - 1}{t} \mathrm{d} t = \int_x^1 \frac{1 - t^h}{t} \mathrm{d} t$. Note that, for $x\le t \le 1$, $$0 \le \frac{1 - t^h}{t} \le \frac{1 - x^h}{x}$$ and $$0 \le \int_x^1 \frac{1 - t^h}{t} \mathrm{d} t \le \frac{1 - x^h}{x}(1-x).$$ Note that $\lim_{h\to 0} \frac{1 - x^h}{x}(1-x) = 0$. Thus, by the squeeze theorem, we have $\lim_{h\to 0} \int_x^1 \frac{1 - t^h}{t} \mathrm{d} t = 0$.

  3. $x=1$: The integral is over the degenerate interval $[1,1]$, so it vanishes.

We are done.
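The squeeze bound in case 1 can be illustrated numerically (a Python sketch; the midpoint quadrature is an illustrative stand-in for the exact integral, and `math.log` appears only as a reference value):

```python
import math

def midpoint_integral(f, a, b, n=100_000):
    # Midpoint-rule approximation of the integral of f over [a, b].
    dt = (b - a) / n
    return sum(f(a + (k + 0.5) * dt) for k in range(n)) * dt

x = 4.0
for h in (0.2, 0.05, 0.01):
    # Case 1 (x > 1, h > 0): the squeeze 0 <= I(h) <= (x^h - 1)(x - 1).
    I = midpoint_integral(lambda t: (t**h - 1) / t, 1.0, x)
    assert 0.0 <= I <= (x**h - 1) * (x - 1)
# Consequently (x^h - 1)/h tends to the integral of 1/t, i.e. log(x).
assert abs((x**1e-6 - 1) / 1e-6 - math.log(x)) < 1e-5
```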

River Li