
Let $0<x<1$ and $f(x)=x^{x^{x^{x}}}$. Then we have:

Claim :

$$f''(x)\geq 0$$
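Before any proof attempt, here is a quick numerical sanity check (not a proof; the sample grid and step size are my own choices) that $f''$ is indeed positive on $(0,1)$:

```python
# Numerical sanity check that f(x) = x^(x^(x^x)) has f''(x) > 0 on (0, 1).
# The second derivative is approximated by a central finite difference.
def f(x):
    return x ** (x ** (x ** x))

def second_derivative(func, x, h=1e-5):
    return (func(x + h) - 2.0 * func(x) + func(x - h)) / (h * h)

# Sample the open interval (0, 1); near 0 the second derivative blows up,
# and its minimum turns out to be about 0.839.
samples = [i / 100 for i in range(1, 100)]
assert all(second_derivative(f, x) > 0 for x in samples)
```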

My attempt, as a sketch of a partial proof:

We introduce the function (for a fixed $0<a<1$):

$$g(x)=x^{x^{a^{a}}}$$

Second claim: $$g''(x)\geq 0$$

We have :

$g''(x)=x^{x^{a^{a}}+a^a-2}\left(a^{2a}\ln(x)+x^{a^{a}}+2a^{a}x^{a^{a}}\ln(x)-a^{a}\ln(x)+2a^{a}+a^{2a}x^{a^{a}}\ln^{2}(x)-1\right)$

We are interested in the inequality:

$$a^{2a}\ln(x)+x^{a^{a}}+2a^{a}x^{a^{a}}\ln(x)-a^{a}\ln(x)+2a^{a}+a^{2a}x^{a^{a}}\ln^{2}(x)-1\geq 0$$

I'm stuck here.
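Still, the displayed expression for $g''$ can at least be checked numerically (a spot check, not a proof; the sample values of $a$ and $x$ are arbitrary):

```python
# Check that the closed form for g''(x), with g(x) = x^(x^(a^a)), matches a
# central finite-difference second derivative at a few sample points.
import math

def g(x, a):
    return x ** (x ** (a ** a))

def g2_closed(x, a):
    c = a ** a  # abbreviation: c = a^a, so a^(2a) = c^2
    L = math.log(x)
    bracket = (c * c * L + x ** c + 2 * c * (x ** c) * L
               - c * L + 2 * c + c * c * (x ** c) * L * L - 1)
    return x ** (x ** c + c - 2) * bracket

def g2_numeric(x, a, h=1e-5):
    return (g(x + h, a) - 2 * g(x, a) + g(x - h, a)) / (h * h)

for a in (0.2, 0.5, 0.8):
    for x in (0.3, 0.5, 0.7):
        assert abs(g2_closed(x, a) - g2_numeric(x, a)) < 1e-4
        assert g2_closed(x, a) >= 0
```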


As noticed by Hans Engler, we introduce the function:

$$r(x)=x^{a^a}\ln(x)$$ We have :

$$r''(x)=x^{a^a - 2} ((a^a - 1) a^a \ln(x) + 2 a^a - 1)$$

The conclusion is straightforward: for $a\in(0,1)$ we have $a^a\geq e^{-1/e}>\frac12$, so both terms in the bracket are nonnegative on $(0,1)$ and $r''(x)\geq 0$. Hence $\ln(g(x))=r(x)$ is convex, which implies that $g(x)=e^{r(x)}$ is also convex on $(0,1)$.
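A numerical spot check of the expression for $r''$ (again not a proof; the sample points are arbitrary) supports both the formula and its nonnegativity; recall $a^a\geq e^{-1/e}>\tfrac12$ for $a\in(0,1)$:

```python
# Check the closed form of r''(x) for r(x) = x^(a^a) * ln(x) against finite
# differences, and that r''(x) >= 0 at the sampled points of (0, 1).
import math

def r(x, a):
    return x ** (a ** a) * math.log(x)

def r2_closed(x, a):
    c = a ** a
    return x ** (c - 2) * ((c - 1) * c * math.log(x) + 2 * c - 1)

def r2_numeric(x, a, h=1e-5):
    return (r(x + h, a) - 2 * r(x, a) + r(x - h, a)) / (h * h)

for a in (0.1, 0.5, 0.9):
    for x in (0.2, 0.5, 0.9):
        assert abs(r2_closed(x, a) - r2_numeric(x, a)) < 1e-4
        assert r2_closed(x, a) >= 0
```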

Now, starting from the second claim and using Jensen's inequality (midpoint convexity of $g$), for $x,y,a\in(0,1)$ we have:

$$x^{x^{a^{a}}}+y^{y^{a^{a}}}\geq 2\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)^{a^{a}}}$$

Substituting $a=\frac{x+y}{2}$, we obtain:

$$x^{x^{\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)}}}+y^{y^{\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)}}}\geq 2\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)}}}$$
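Numerically, this midpoint inequality does hold at sample points (a spot check only, not a proof):

```python
# Spot check of the midpoint inequality obtained by substituting a = (x+y)/2
# into Jensen's inequality for g.
def check(x, y):
    m = (x + y) / 2.0
    c = m ** m  # c = ((x+y)/2)^((x+y)/2)
    lhs = x ** (x ** c) + y ** (y ** c)
    rhs = 2.0 * m ** (m ** c)
    return lhs >= rhs

assert all(check(x, y) for x in (0.1, 0.4, 0.7) for y in (0.2, 0.5, 0.9))
```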

Now the idea is to compare the two quantities :

$$x^{x^{x^{x}}}+y^{y^{y^{y}}}\geq x^{x^{\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)}}}+y^{y^{\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)}}}$$

We split the problem in two:

$$x^{x^{\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)}}}\leq x^{x^{x^{x}}}$$

And :

$$y^{y^{y^{y}}}\geq y^{y^{\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)}}}$$

Unfortunately this is not sufficient to establish convexity, because the two inequalities hold only for $x,y$ in disjoint intervals.


A related result:

It seems that the function

$$s(x)=x^x\ln(x)=v(x)u(x)$$

is increasing on $I=(0.1,e^{-1})$, where $v(x)=x^x$ and $u(x)=\ln(x)$. To see this, differentiate twice and use the general form $s''=v''u+2v'u'+vu''$; on $I$ we have: $$v''(x)u(x)\leq 0$$ $$v'(x)u'(x)\leq 0$$ $$v(x)u''(x)\leq 0$$

So the derivative $s'$ is decreasing on the interval $I$, and since $s'(e^{-1})>0$, it follows that $s$ is increasing on $I$.

We deduce that $R(x)=e^{s(x)}=x^{x^x}$ is increasing. Furthermore, $R(x)$ appears to be concave on $I$, but I do not have a proof of this yet.

We deduce that the function $R(x)^{R(x)}$ is convex on $I$. To show it, differentiate twice using the general form: with $n(t)=t^t$ and $m(x)=R(x)$, so that $n(m(x))=R(x)^{R(x)}$, we have on $I$:

$$n''(m(x))(m'(x))^2\geq 0$$

And :

$$m''(x)n'(m(x))\geq 0$$

This is because $t^t$ is convex and decreasing for $t\in I$ (so $n''\geq 0$ and $n'\leq 0$ there), while $m''\leq 0$ by the conjectured concavity of $R$.

Conclusion:

$$x^{x^{\left(x^{x}+x\right)}}=R(x)^{R(x)}$$ is convex on $I$.

The same reasoning works with $x\ln(x)$, which is convex and decreasing on $I$.

Have a look at the second derivative divided by $x^x$.

In the last link, everything is positive on $J=(0.25,e^{-1})$, taking the function $G(x)=\ln\left(R(x)^{R(x)}\right)$.


Question :

How can one show the first claim? Is there a trick here?

PS: feel free to use my ideas.

  • Hint: If $h$ is convex, then $\tilde h = e^h$ is also convex. So it's enough to prove that $\log f(x)$ is convex. Now iterate this argument. – Hans Engler Jun 08 '21 at 13:08
  • @HansEngler It works only once, unfortunately... – Miss and Mister cassoulet char Jun 08 '21 at 13:14
  • When you will get an answer, could you ping me, please ? Very interesting problem. Cheers :-) – Claude Leibovici Jun 08 '21 at 14:01
  • @Claude: If you click on "Follow" then you will be notified automatically :) – Martin R Jun 08 '21 at 15:08
  • We have for $y\in(0.5,1)$ and $x\in(0,0.15)$ the two inequalities above, i.e. $x^{x^{\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)}}}\leq x^{x^{x^{x}}}$ and $y^{y^{y^{y}}}\geq y^{y^{\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)}}}$. Any suggestion is welcome! – Miss and Mister cassoulet char Jun 09 '21 at 11:28
  • Hint: Let $F(x) = x^{x^x}\ln x$. Prove that $F''(x) \ge 0$ directly using bounds for $x^x$ and $\ln x$ etc. $F''(x)$ looks complicated but it is not very difficult. – River Li Jun 16 '21 at 04:13
  • @ErikSatie have you had the opportunity to view my solution? Given that you have a bounty out for this problem, I'd appreciate it if you could give some feedback if you are unsatisfied with what I posted. – Squirtle Jun 21 '21 at 14:08
  • Sorry, but look: https://hsm.stackexchange.com/q/13257 – LаngLаngС Jun 21 '21 at 16:13
  • @ErikSatie Suggestion for editing: In the 1st line, you let $f(x)=x^{x^{x^{x}}}$, but in 'Edit' after Equ. (I), you let $f(x) = \ln(x^{x^{x^a}})$. Also, you let $r(x) = x^{a^a}\ln x$ and later $r(x) = x^x\ln x$. You may tag{label} the equations for easy discussion (you have done it for Equ. (I)). You may use Edit1, Edit2, etc. rather than Edit, Edit, Edit, etc. Also, in each Edit you may use a separator between them. – River Li Jun 27 '21 at 01:14
  • @ErikSatie "These two last inequality are not hard and it conducts to a partial solution in using the continuity + Jensen definition" is not clear. – River Li Jun 27 '21 at 01:15
  • @RiverLi Is it clearer now ? Thanks ! – Miss and Mister cassoulet char Jun 27 '21 at 10:11
  • @ErikSatie So, though $g(x)=x^{x^{a^{a}}}$ is convex, it is not helpful? After "A related result" : You should not define $r(x)$ again since $r(x)=x^{a^a}\ln(x)$ (see "As noticed by Hans Engler we introduce the function :"). – River Li Jun 27 '21 at 11:29
  • @ErikSatie The part " A related result" is not clear for me. I don't know what you want to do in this part. I think that you should first give the conclusion or conjecture then give your thoughts. – River Li Jun 27 '21 at 11:32

3 Answers


Proof:

First, we have that

(1) Midpoint convexity implies rational convexity.

(2) Rational convexity together with continuity implies convexity.

See Midpoint-Convex and Continuous Implies Convex for details on this.

So it suffices to prove that the function is midpoint convex. Let $x\in (0,1)$ and $y\in (x,1)$; then we want to show that

$$f\left(\frac{x+y}{2}\right) \le \frac{f(x)+f(y)}{2}$$

That is, we want to show that

$$\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)^{\left(\frac{x+y}{2}\right)}}} \le \frac{x^{x^{x^{x}}}+y^{y^{y^{y}}}}{2}$$

but this follows from the argument in your original post.

Squirtle
  • The first claim follows from the fact that this function is convex (though I'm sure you intended to use that claim together with the second to prove the function is convex). This should work, though. – Squirtle Jun 20 '21 at 18:54
  • Hello, no problem for feedback. I think there is no validated argument concerning your last inequality; I mean, it remains to be shown. – Miss and Mister cassoulet char Jun 21 '21 at 14:13
  • @ErikSatie it's just the combination of two inequalities you already showed. Aren't those two proven ? – Squirtle Jun 21 '21 at 14:19
  • The problem is there are some constraints on $x,y$...See the comments – Miss and Mister cassoulet char Jun 21 '21 at 14:23
  • Oh I see..... I'm at a loss on how to improve the problem. Sorry and good luck. I'm going to leave this up though, because I believe once a few issues get addressed this will become applicable. – Squirtle Jun 21 '21 at 14:45

Mathematica 12.3 does it in a moment:

NMinimize[{D[x^x^x^x, {x, 2}], x > 0 && x < 1}, x]

$\{0.839082,\{x\to 0.669764\}\}$

Since the minimum value of the second derivative on $(0,1)$ is positive, the function under consideration is convex on $(0,1)$.
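As an independent cross-check (numerical as well, so again not a proof), a plain grid search in Python over a finite-difference second derivative reproduces these values:

```python
# Grid search for the minimum of the finite-difference second derivative of
# x^(x^(x^x)) on [0.01, 0.9999]; expected: minimum ~0.839082 near x ~0.669764.
def f(x):
    return x ** (x ** (x ** x))

def f2(x, h=1e-5):
    return (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)

xs = [i / 10000 for i in range(100, 10000)]
best = min(xs, key=f2)
assert abs(best - 0.669764) < 5e-3
assert abs(f2(best) - 0.839082) < 1e-3
```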

Addition. User @RiverLi has raised doubts concerning the NMinimize result. Here are additional arguments. First, as

D[x^x^x^x, {x, 2}] // Simplify

$ x^{x^x+x^{x^x}-2} \left(x^{2 x} \log (x) \left(x \log ^2(x)+x \log (x)+1\right)^2+x^{x+1} \log ^2(x)+x^{x+2} \log ^2(x) (\log (x)+1)^2+3 x^{x+1} \log (x) (\log (x)+1)+2 x^x+x^x \log (x) (x+x \log (x)-1)+x^{x^x} \left(x^x \log (x) \left(x \log ^2(x)+x \log (x)+1\right)+1\right)^2-1\right)$ and

D[x^x^x^x, {x, 3}] // Simplify

$x^{x^x+x^{x^x}} \left(x^{3 x} \log (x) \left(\frac{1}{x}+\log ^2(x)+\log (x)\right)^3+x^{2 x-3} \left(x \log ^2(x)+x \log (x)+1\right)^2+x^{x-2} \left(\frac{1}{x}+\log ^2(x)+\log (x)\right) \left(x^{x+1} \log (x) (\log (x)+1)+x^x-1\right)+x^{2 x^x} \left(x^x \log (x) \left(\frac{1}{x}+\log ^2(x)+\log (x)\right)+\frac{1}{x}\right)^3+3 x^{x^x-3} \left(x^{x+1} \log ^3(x)+x^{x+1} \log ^2(x)+x^x \log (x)+1\right) \left(x^{2 x+2} \log ^5(x)+2 x^x+\left(x^x+4 x-1\right) x^x \log (x)+\left(2 x^x+1\right) x^{x+2} \log ^4(x)+\left(x^{x+1}+2 x^x+2 x\right) x^{x+1} \log ^3(x)+\left(2 x^x+x+5\right) x^{x+1} \log ^2(x)-1\right)+2 x^{x-3} \left(x^2 \log ^3(x)+2 x^2 \log ^2(x)+2 x+x (x+3) \log (x)-1\right)+3 x^{2 x-3} \log (x) \left(x \log ^2(x)+x \log (x)+1\right) \left(x^2 \log ^3(x)+2 x^2 \log ^2(x)+2 x+x (x+3) \log (x)-1\right)+\frac{\left(x^{x+1} \log (x) (\log (x)+1)+x^x-1\right)^2}{x^3}+\frac{x^{x+1} \log (x)+2 x^{x+1} (\log (x)+1)+x^{x+2} \log (x) (\log (x)+1)^2-x^x+1}{x^3}+x^{x-3} \log (x) \left(x^3 \log ^4(x)+3 x^3 \log ^3(x)+3 x^2+3 x^2 (x+2) \log ^2(x)+x \left(x^2+9 x-4\right) \log (x)+2\right)\right)$ show, the second derivative is continuously differentiable on $(0,1]$. Therefore, we can draw a conclusion that the second derivative is continuously differentiable on $[0.01,1]$.

Second,

Limit[D[x^x^x^x, {x, 2}], x -> 0, Direction -> "FromAbove"]

$\infty$

and

D[x^x^x^x, {x, 2}] /. x -> 0.01

$77.923$

and

D[x^x^x^x, {x, 2}] /. x -> 1

$2$

Third, the command of Maple (here Maple is stronger than Mathematica)

DirectSearch:-SolveEquations(diff(x^(x^(x^x)), x $ 3) = 0, {0 <= x, x <= 1}, AllSolutions);

$$\left[\begin{array}{cccc} 2.58795803978585\times10^{-24} & \left[\begin{array}{c} - 1.60871316268185183\times10^{-12} \end{array}\right] & \left[x= 0.669764056702161\right] & 23 \end{array}\right] $$ shows there is only one critical point of the second derivative on $[0.01,1]$. Combining the above with the value of the second derivative at $x=0.669764056702161$, i.e. with $0.83908$, and with the result of

NMaximize[{D[x^x^x^x, {x,3}] // Simplify, x >= 0 && x <= 0.01}, x]

$\{-4779.93,\{x\to 0.01\}\},$

we conclude that the second derivative takes its global minimum on $(0,1]$ at $x=0.669764056702161$ .

user64494
  • Can the global optimality be ensured? According to https://reference.wolfram.com/language/ref/NMinimize.html: If f and cons are linear or convex, the result given by NMinimize will be the global minimum, over both real and integer values; otherwise, the result may sometimes only be a local minimum. – River Li Jun 17 '21 at 02:00
  • By the way, it is not me who downvoted you. – River Li Jun 17 '21 at 04:21
  • @RiverLi: See the appendix to my answer where additional arguments are given . – user64494 Jun 17 '21 at 15:51
  • Thanks for updating your answer. You use the package DirectSearch etc. I was wondering if there is any place in the documentation of the packages saying that it ensures (i.e. 100%) the global optimality or finding all solutions of equations ( except for polynomial equations)? Furthermore, are there research papers behind the packages to tell us how to know that the point is global optimal, or all the stationary points are found. – River Li Jun 17 '21 at 16:22
  • @RiverLi: The command of Maple DirectSearch:-GlobalOptima(diff(x^(x^(x^x)), x $ 2), {0 <= x, x <= 1}) results in $$ [ 0.839082340902234,\left[x= 0.669764056669931\right],151],$$ confirming NMinimize. See that article concerning methods used be the DirectSerch. Your question about other CASes is too wide. Ask it as a separate question here and/or other StackExchange forums. – user64494 Jun 18 '21 at 04:44
  • @RiverLi: The uniqueness of the root of the third derivative of $x^{x^{x^x}}$ is confirmed by the command of Mathematica Table[FindRoot[Evaluate[D[x^x^x^x, {x, 3}]] == 0, {x, k/10}], {k, 1, 10}] which returns {x -> 0.669764056702257} ten times, and by another code of Maple: fsolve(diff(x^(x^(x^x)), x $ 3) = 0, x = 0 .. 1) returns $0.6697640567$, and then fsolve(diff(x^(x^(x^x)), x $ 3) = 0, x = 0 .. 1, avoid = {x = %}) returns the input (this means there is no other root). Don't hesitate to ask for further explanation if needed. – user64494 Jun 18 '21 at 04:49
  • Thanks. I read the article by Sergey N. Moiseev (I also found it in arXiv https://arxiv.org/ftp/arxiv/papers/1102/1102.1347.pdf). Yes, the articles give some examples to describe its effectiveness, however I do not see (perhaps I am missing something) mathematical proof that it will converge to a global optimal solution, or any criterion for the convergent point to be global optimal (in contrast, for local optimality, $f'(x) = 0$ and $f''(x) > 0$ are sufficient conditions). – River Li Jun 18 '21 at 05:40
  • @RiverLi: What do you mean by $f$? Sorry, I dislike empty talks. – user64494 Jun 18 '21 at 14:53
  • Sorry for not defining $f$. Let me give more details. We have sufficient conditions for local optimality: For a differentiable function $f(x)$, if $f'(x_0) = 0$ and $f''(x_0) > 0$, then $x_0$ is a local minimizer of $f(x)$. But for global optimality, I do not see criterion for global optimality in the article by Sergey N. Moiseev. In other words, I do not see mathematical proof that the algorithm of the article will converge to a global optimal solution. By criterion for global optimality, I mean something like: If blablalba, then $x_0$ is a global minimizer. – River Li Jun 18 '21 at 15:13
  • Let's get back to the subject. The function under consideration, i.e. the second derivative of $x^{x^{x^x}}$, is analytic on $(0,1]$. As shown in my answer and my comments, that function decreases on $(0,0.1)$ and takes its minimum on $[0.1,1]$. This minimum is reached at the end points or at inner critical points. Both Maple and Mathematica show the only critical point $x=0.669764$, and the value of the second derivative at this point is positive and less than its values at the end points. Therefore, this is a point of global minimum of the second derivative on $(0,1]$. – user64494 Jun 18 '21 at 17:30
  • The DirectSearch and NMinimize confirm it. If you can suggest another point as an alternative, please, present it. If you see concrete errors in my answer, please, indicate the ones. I don't know any math proof in mathematical analysis exposed absolutely rigorously. – user64494 Jun 18 '21 at 17:30
  • There are some theory behind the packages such as the article by Sergey N. Moiseev, right? In other words, the package is written based on the article, right? However, I did not see mathematical proof that the algorithm of the article will converge to a global optimal solution. Just as I told you NMinimize sometimes only converges to local minimizer (according to the documentation of the command NMinimize), I do not know if DirectSearch ensures global optimality. If so, I was wondering if there is any place in the documentation or research papers related to DirectSearch to show it. – River Li Jun 18 '21 at 23:46
  • Actually, in the beginning, you just say NMinimize tell us the global optimality. However, it doesn't. You just think NMinimize will do it. You are wrong. Then you found it and updated your answer. Now, you say DirectSearch tell us the global optimality. You just think DirectSearch will do it. How do you know you are right? Any documentation or articles about it? – River Li Jun 18 '21 at 23:49
  • In other words, is there any evidence that DirectSearch ensures global optimality? For NMinimize, I found the evidence (the documentation) that NMinimize sometimes only gives local optimality (but in the beginning, you think it will give global optimality). I hope to see the evidence. Is it clear what I mean, now? – River Li Jun 19 '21 at 00:04
  • @RiverLi: I repeat, in my answer and in my comments the convexity is shown with the help of two CASes, by using optimizers and by mimicking the standard way of calculus. If you see concrete errors in my answer, please indicate them. I prefer arguments over emotional words. Hope I am clear. – user64494 Jun 19 '21 at 04:16
  • Two day ago, you ran NMinimize and said "the function is convex". I told you No, wait a minute, from the documentation of NMinimize, it does not always converge to a global minimizer, sometimes a local minimizer. So it is not a proof. Then you ran DirectSearch. I said perhaps the algorithm of DirectSearch does not always converge to a global minimizer. You need to provide evidence . The evidence includes the documentation of the package DirectSearch such as saying "the algorithm of DirectSearch always converges to global minimizer". CAS does not always work as you have seen NMinimize fails. – River Li Jun 19 '21 at 04:33
  • For example, you ran DirectSearch:-SolveEquations(diff(x^(x^(x^x)), x $ 3) = 0, {0 <= x, x <= 1}, AllSolutions); and said "shows there is only one critical point of the second derivative". Can you provide evidence e.g. the documentation of the package DirectSearch that DirectSearch:-SolveEquations always (100%, never fails) gives all the solutions? If so, I agree you provide a proof. – River Li Jun 19 '21 at 04:49
  • @RiverLi: You wrote "Can you provide evidence e.g. the documentation of the package DirectSearch that DirectSearch:-SolveEquations always (100%, never fails) gives all the solutions? ". No, of course. Do you disagree with the result of DirectSearch:-SolveEquations(diff(x^(x^(x^x)), x $ 3) = 0, {0 <= x, x <= 1}, AllSolutions);? If so, please give arguments. In other case the further discussion makes no sense. – user64494 Jun 19 '21 at 06:00
  • If so, I think it is not a proof. (I do not disagree with the result, but I do not agree with it either. I keep silent). We may stop here. Thanks for discussing. (By the way, I will not downvote this answer though I criticized your answer. So if someone downvotes you in the future, it is not me). – River Li Jun 19 '21 at 06:36

Promising method:

As in another topic, we can split the problem in two as follows:

We take the logarithm and stop at the first-order derivative. On $\left(\frac{11}{40},1\right)$ the derivative seems to be the product of two positive increasing functions, which I call $a(x)$ and $b(x)$; we have:

$$b(x)=x^{-\frac{385}{1000}}\left[x^{x}(x\ln^3(x)+x\ln^2(x)+\ln(x))+1\right]$$

And :

$$a(x)=x^{\left(x^{x}-\frac{615}{1000}\right)}$$

And :

$$\frac{d}{dx}\ln\left(x^{x^{x^{x}}}\right)=a\left(x\right)\cdot b\left(x\right)$$
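This factorization can at least be verified numerically (sample points are arbitrary; this checks the algebra, not the monotonicity claims):

```python
# Check that a(x) * b(x) equals the derivative of ln(x^(x^(x^x))) = x^(x^x) * ln(x),
# approximating the derivative by a central finite difference.
import math

def lnf(x):
    return x ** (x ** x) * math.log(x)

def a(x):
    return x ** (x ** x - 0.615)

def b(x):
    L = math.log(x)
    return x ** (-0.385) * (x ** x * (x * L ** 3 + x * L ** 2 + L) + 1.0)

def dlnf(x, h=1e-6):
    return (lnf(x + h) - lnf(x - h)) / (2.0 * h)

for x in (0.3, 0.5, 0.7, 0.9):
    assert abs(a(x) * b(x) - dlnf(x)) < 1e-6
```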

So we can take the log again and again in the general case (though beware: the calculus quickly becomes prohibitive).

Edit :

A good substitution for $b(x)$ is $x=e^y$. Have a look at the first derivative.

It seems that on $y\in(\ln(0.275),0)$ we have:

$\frac{d}{dy}b(e^y)\geq g(y)$

Where :

$g(y)=\frac{\left(e^{-\frac{385y}{1000}}\left(1000e^{3y}+e^{3y}(1000(y+1)^{2}+615)y^{3}+e^{3y}(1000(y+1)^{2}+3615)y^{2}+e^{3y}(1000\left(y+1\right)(y+3)-385)y-385e^{3y}\right)\right)}{1000}$

Equivalently, $g(y)$ is equal to:

$$\frac{\left(e^{-\frac{385y}{1000}}e^{3y}\left(1000y^{5}+3000y^{4}+4615y^{3}+8615y^{2}+2615y+615\right)\right)}{1000}$$
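The equivalence of these two expressions for $g(y)$ follows by expanding the products; a quick numerical comparison agrees:

```python
# Check that the long form of g(y) and its simplified polynomial form agree.
import math

def g_long(y):
    e3 = math.exp(3 * y)
    s = (1000 * e3
         + e3 * (1000 * (y + 1) ** 2 + 615) * y ** 3
         + e3 * (1000 * (y + 1) ** 2 + 3615) * y ** 2
         + e3 * (1000 * (y + 1) * (y + 3) - 385) * y
         - 385 * e3)
    return math.exp(-385 * y / 1000) * s / 1000

def g_poly(y):
    p = (1000 * y ** 5 + 3000 * y ** 4 + 4615 * y ** 3
         + 8615 * y ** 2 + 2615 * y + 615)
    return math.exp(-385 * y / 1000) * math.exp(3 * y) * p / 1000

for y in (-1.2, -0.8, -0.4, -0.1, 0.0):
    assert abs(g_long(y) - g_poly(y)) < 1e-9
```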

Edit : Some ideas to show that $a(x)$ is increasing on $(0,1)$

The third derivative of $\ln a(x)$ seems to be positive, so the first derivative of $\ln a(x)$ is convex and admits a minimum; indeed it is the sum of two functions that appear convex, since:

$$\left(\frac{\left(x^{x}-0.615\right)}{x}\right)''>0$$

And

$$\left(x^{x}\left(\ln^{2}\left(x\right)+\ln\left(x\right)\right)\right)''>0$$

An easier way is to substitute $x=e^y$ and then differentiate. In the derivative, the troublesome part is:

$$j(y)=e^{\left(\left(e^{y}+1\right)\cdot y\right)}\left(y\left(y+1\right)\right)+e^{y\cdot e^{y}}-0.615$$

It seems that on $y\in(-1,0)$ we have:

$$j(y)\geq e^{\left(\left(y+2\right)\cdot y\right)}\left(y\left(y+1\right)\right)+e^{\left(ye^{y}\right)}-0.615\geq 0$$

where we use the well-known inequality, valid for all real $x$:

$$e^x\geq x+1$$
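At sample points the claimed chain of inequalities does hold (a numerical spot check only):

```python
# Spot check: j(y) >= lower(y) >= 0 on (-1, 0), where lower(y) replaces the
# exponent (e^y + 1)*y by (y + 2)*y using e^x >= x + 1 (note y(y+1) < 0 here).
import math

def j(y):
    return (math.exp((math.exp(y) + 1) * y) * (y * (y + 1))
            + math.exp(y * math.exp(y)) - 0.615)

def lower(y):
    return (math.exp((y + 2) * y) * (y * (y + 1))
            + math.exp(y * math.exp(y)) - 0.615)

for y in (-0.9, -0.6, -0.3, -0.1):
    assert j(y) >= lower(y) >= 0
```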

Again it seems that the functions :

$t(y)=e^{\left(\left(y+2\right)\cdot y\right)}\left(y\left(y+1\right)\right)$

And

$k(y)=e^{\left(ye^{y}\right)}$

are convex for $y\in(-1,0)$. So we can use the tangent line trick.

  • For easy reading, you may simplify $b(x)$ to reduce the numbers of parentheses, e.g. $$b(x) = x^{\frac{623-1000}{1000}}\left(x^x(x\ln^3 x+ x\ln^2 x + \ln x) +1\right).$$ – River Li Jun 24 '21 at 00:32
  • @RiverLi No problem. Do you see anything else? Thanks in advance for your precious advice. – Miss and Mister cassoulet char Jun 24 '21 at 12:51
  • Why do you keep the parentheses outside? Too many parentheses look not nice. You may see Vasc's paper ("Proofs of three open inequalities with power-exponential functions"). For example, $\ln x$ is nice, not necessarily $\ln(x)$. Sometimes use brackets such as $x\ln x - \ln[1 + (x - 1)\mathrm{e}^{x-1}]$. – River Li Jun 24 '21 at 13:49
  • Your original: $$b(x)=\left(x^{\frac{623-1000}{1000}}(x^{x}\ln(x)(x\ln(x)(\ln(x)+1)+1)+1)\right).$$ You use 7 parentheses/brackets. My suggestion: $$b(x) = x^{\frac{623-1000}{1000}}\left[x^x(x\ln^3 x+ x\ln^2 x + \ln x) +1\right].$$ Only 2 parentheses/brackets. – River Li Jun 24 '21 at 13:53
  • @RiverLi have you some ideas to show it or advices ? Thanks ! – Miss and Mister cassoulet char Jun 25 '21 at 12:29
  • I gave comment on the OP several days ago. – River Li Jun 25 '21 at 13:41
  • @RiverLi Oh well, but it seems easier to do what I do, no? – Miss and Mister cassoulet char Jun 25 '21 at 14:49
  • @ErikSatie would you please edit the statement "It seems we have on ∈ (ln[0.275],0)". I'd say "For y in (x1,x2) it seems we have ..." but my real question is what does "seems" here mean? Is this a conjecture or are you making a declaration or what? – Squirtle Jun 25 '21 at 15:14
  • 1
    @Squirtle I work with Desmos free software so all of this are conjectures with a very very high probability to be true .Other questions ? – Miss and Mister cassoulet char Jun 25 '21 at 15:20
  • 1
    Very good. Cool, thank you – Squirtle Jun 25 '21 at 16:36
  • @ErikSatie But you may improve your writing. It is hard to read your proof. For example, we may write $\frac{\mathrm{d}}{\mathrm{d} x}\ln\left(x^{x^{x^{x}}}\right)=a\left(x\right)\cdot b\left(x\right)$ where $b(x) = x^{\frac{623-1000}{1000}}\left[x^x(x\ln^3 x+ x\ln^2 x + \ln x) +1\right]$ and $a(x) = \cdots $. But you wrote: $b(x) = \cdots$ and $a(x) = \cdots$ and $\frac{d}{dx}\ln\left(x^{x^{x^{x}}}\right)=a\left(x\right)\cdot b\left(x\right)$. – River Li Jun 26 '21 at 00:36
  • @ErikSatie Do you want to prove that $\frac{\mathrm{d^2}}{\mathrm{d} x^2}\ln\left(x^{x^{x^{x}}}\right) \ge 0$? It is not clear. You just said "So we can take the log again and again in the general case.Warning to the prohibition of calculus." – River Li Jun 26 '21 at 00:39