
I was wondering whether there is another way to determine the sign of the derivative of a function $f(x)$. Sometimes it is quite difficult (and lengthy) to derive an expression for $\partial f/ \partial x$.

I just need a qualitative answer, i.e. I only want to know the sign of the derivative, not a quantitative value.

I edited my post (Nov 14) and tried to give a new proof:

We have a function of the following form:

$$f(x)= {{(1+g(x))^T-(1+g(x))^{t-t_0}} \over {(1+g(x))^T-1} } $$

and we know the following things:

${\partial g(x) \over \partial x} <0$,
$T>t>t_0>0$, where $T$, $t$, $t_0$ are positive integers.

Proposition: $f(x)$ is decreasing in $x$.

(This of course holds only under the additional assumptions above on $g(x)$ and $T$, $t$, $t_0$.)
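
Before attempting a proof, here is a quick numerical sanity check (not a proof; the decreasing function $g(x)=2/(1+x)$ and the integers $T=10$, $t=5$, $t_0=2$ are arbitrary sample choices):

```python
# Sanity check of the proposition on a sample instance (not a proof).
# g(x) = 2/(1+x) is an arbitrary positive, strictly decreasing choice;
# T > t > t0 > 0 are arbitrary positive integers.
T, t, t0 = 10, 5, 2

def g(x):
    return 2.0 / (1.0 + x)

def f(x):
    base = 1.0 + g(x)
    return (base**T - base**(t - t0)) / (base**T - 1.0)

xs = [0.1 * k for k in range(1, 200)]
vals = [f(x) for x in xs]
assert all(a > b for a, b in zip(vals, vals[1:])), "f failed to decrease"
print("f was strictly decreasing on the whole sample grid")
```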

Proof:

For the proof we use the following logic: the decrease in the numerator caused by a small increase in $x$ must be smaller than the decrease in the denominator. Otherwise the fraction would not decrease as $x$ increases and the proposition would not hold; if, for example, the decrease in the numerator were larger than the decrease in the denominator, the fraction would increase and the proposition could not be true.

For that statement to be true, the following inequality must always hold for a small increase in $x$, denoted by $\Delta$:

$$((1+g(x))^T-(1+g(x))^{t-t_0})-(1+g(x+\Delta))^T+(1+g(x+\Delta))^{t-t_0}<((1+g(x))^T-1)- ((1+g(x+\Delta))^T-1)$$

Reducing the inequality, we can see that

$$-(1+g(x))^{t-t_0}+(1+g(x+\Delta))^{t-t_0}<0,$$

which must always be true because $g(x)>g(x+\Delta)$ (see the assumption ${\partial g(x) \over \partial x} <0$ above).
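
A numerical spot check of this reduced inequality, with the same arbitrary sample choices as above (note that this only checks the inequality itself, not the disputed inference from it to monotonicity of the quotient; see the comments below):

```python
# Spot check of the reduced inequality
#   -(1+g(x))^(t-t0) + (1+g(x+Delta))^(t-t0) < 0
# for a sample decreasing g. It verifies only the inequality, not the
# step from it to monotonicity of the fraction (see comments below).
t, t0, delta = 5, 2, 1e-3  # arbitrary sample values with t > t0 > 0

def g(x):
    return 2.0 / (1.0 + x)  # arbitrary positive, strictly decreasing

for k in range(1, 200):
    x = 0.1 * k
    lhs = -(1.0 + g(x))**(t - t0) + (1.0 + g(x + delta))**(t - t0)
    assert lhs < 0.0
print("reduced inequality held on the whole sample grid")
```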

PAS
  • Do you have access to a plotted graph of the function? Then you can see if it is increasing (therefore $f'>0$) or decreasing ($f'<0$). Sometimes increasing/decreasing follows from the context. – M. Winter Oct 17 '17 at 14:56
  • Yes I have a graph for the function but this is not really a "proof" which I need. – PAS Oct 17 '17 at 15:30
  • Can you show us a specific function? Because I think there might not be many tools available for the general case. You can check $f(x+\epsilon)-f(x)$ to get an idea of what the derivative's sign might be. Sometimes it is possible to "factor out" parts which cannot influence the function's monotonicity. But the success of all these methods depends on the specific function. – M. Winter Oct 17 '17 at 15:33
  • I edited my original post – PAS Oct 17 '17 at 15:53
  • Can I assume that $g(x)>-1$ (or even $g(x)>0$) to carelessly exponentiate? Or do you assume $T,t,t_0\in\Bbb N$? I ask because otherwise $f(x)$ might not be defined everywhere. – M. Winter Oct 18 '17 at 08:44
  • You can assume that $g(x)>0$ and that $T,t,t_0$ are positive natural numbers. – PAS Oct 18 '17 at 13:21
  • And you have no information on $g$ other than $g(x)>0$ and $g'(x)<0$? I think the result might highly depend on the exact function. – M. Winter Oct 19 '17 at 15:34
  • I was able to find some additional information on $g(x)$ and I tried to formulate a proof that can be found in the post above. – PAS Oct 26 '17 at 14:35
  • Sorry it took quite some time to edit the post :) – PAS Oct 26 '17 at 16:02
  • @M.Winter I don't want to disturb you but is this proof possible? – PAS Nov 13 '17 at 15:36
  • I worked on a proof myself some time ago but it failed. I am actually not sure if the following step is valid: if you have $f(x)=h_1(x)/h_2(x)$ and $h_1(x)-h_1(x+\Delta)<h_2(x)-h_2(x+\Delta)$ for decreasing $h_i$, then $f(x)$ is decreasing. This is where I see the problem in your proof. – M. Winter Nov 13 '17 at 15:53
  • Thank you for your answer. I will look over the proof again this evening. – PAS Nov 13 '17 at 16:01
  • I edited my post and formulated a new and hopefully clearer proof. – PAS Nov 14 '17 at 08:38
  • I do not think your proof was unclear, just that you used (and still use) a false claim! Check out my pretty ugly and complicated long answer. You are right, but the reasoning might be complicated. Maybe I am just blind and there is an easy way, but at least now we know that you are right. And I will think about an easier way when I find time. – M. Winter Nov 14 '17 at 10:34

1 Answer


I am afraid there is no easy way to show this for general $t,t_0,T$ and $g$. But here is a (long) way to show that $f$ is always decreasing for the assumptions you have given.


Let's define $h(x):=1+g(x)$, $\alpha:=-t+t_0+T$ and $\beta:=T$, so we can work with this easier-looking expression:

$$ f(x)= \frac{[h(x)]^T-[h(x)]^{t-t_0}}{[h(x)]^T-1}= \frac{1-[h(x)]^{t-t_0-T}}{1-[h(x)]^{-T}}= \frac{1-[h(x)]^{-\alpha}}{1-[h(x)]^{-\beta}}=:\frac{u(x)}{v(x)}. $$
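
A quick numerical consistency check of this rewriting (my addition; the values of $h$ and $T>t>t_0>0$ below are arbitrary samples):

```python
# Confirm numerically that dividing numerator and denominator by h^T
# gives the same value: compare the original quotient with the
# rewritten one for sample h > 1 and T > t > t0 > 0.
T, t, t0 = 10, 5, 2           # arbitrary sample integers
alpha, beta = -t + t0 + T, T  # as defined above

for h in (1.01, 1.5, 2.0, 5.0):
    original = (h**T - h**(t - t0)) / (h**T - 1.0)
    rewritten = (1.0 - h**(-alpha)) / (1.0 - h**(-beta))
    assert abs(original - rewritten) <= 1e-9 * abs(original)
print("rewrite matched the original on all samples")
```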

Note that we have $\beta>\alpha>0$, $h(x)>1$ and $h'(x)<0$. We can simply apply the quotient rule to find

$$f'(x)=\frac{u'v-v'u}{v^2}<0\qquad\text{if and only if}\qquad u'v<v'u,$$

and because $u,v>0$ and $u',v'<0$, dividing by the negative quantity $v'v$ flips the inequality, so this is equivalent to asking for $f(x)=u/v<u'/v'$. In the case of the above function this results in the claim

$$f(x)<\frac{\alpha [h(x)]^{-\alpha-1}\cdot h'(x)}{\beta [h(x)]^{-\beta-1}\cdot h'(x)}=\frac\alpha\beta [h(x)]^{\beta-\alpha}.$$
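
Before proving this claim in general, one can spot-check it numerically (my addition; the grids of $h$ values and $(\alpha,\beta)$ pairs are arbitrary):

```python
# Spot check of the criterion
#   (1 - h^{-alpha}) / (1 - h^{-beta}) < (alpha/beta) * h^(beta-alpha)
# for sample h > 1 and beta > alpha > 0 (real exponents allowed here).
for alpha, beta in ((1.0, 2.0), (7.0, 10.0), (0.5, 3.0)):
    for k in range(1, 500):
        h = 1.0 + 0.01 * k
        lhs = (1.0 - h**(-alpha)) / (1.0 - h**(-beta))
        rhs = (alpha / beta) * h**(beta - alpha)
        assert lhs < rhs
print("criterion held on all samples")
```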

The claim means that $f$ is decreasing at some $x$ if and only if the displayed inequality holds at that specific point. So in order to prove that $f$ is always decreasing, we need to show that for all real numbers $h>1$ and $\beta>\alpha>0$ we have

$$ \frac{1-h^{-\alpha}}{1-h^{-\beta}}<\frac\alpha\beta h^{\beta-\alpha}\quad\implies\quad\frac1\alpha(h^\alpha-1)<\frac1\beta(h^\beta-1). $$
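
To spell out this step (my addition): since $1-h^{-\beta}>0$ and $h^\alpha>0$ for $h>1$, we can rearrange without flipping any inequality signs,

$$\frac{1-h^{-\alpha}}{1-h^{-\beta}}<\frac\alpha\beta h^{\beta-\alpha}
\iff 1-h^{-\alpha}<\frac\alpha\beta\left(h^{\beta-\alpha}-h^{-\alpha}\right)
\iff h^\alpha-1<\frac\alpha\beta\left(h^\beta-1\right)
\iff \frac1\alpha(h^\alpha-1)<\frac1\beta(h^\beta-1).$$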

This means we are asked to show that $\gamma(x)=(h^x-1)/x$ is always increasing on $(0,\infty)$ for $h>1$. Here again we need some derivative magic:

$$\gamma'(x)=\frac{h^x\log h\cdot x-h^x+1}{x^2}>0\qquad\text{if and only if}\quad h^{-x}>1-x\log(h).$$

But this condition is always true, since $h^{-x}$ is a convex function (strictly convex for $h>1$) and $1-x\log(h)$ is its tangent at $x=0$: convex functions always dominate their tangents, strictly away from the point of tangency in the strictly convex case.
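
For readers who (as in the comments below) would rather not cite the general convexity fact, here is a short self-contained argument (my addition): set $\varphi(x):=h^{-x}-(1-x\log h)$. Then $\varphi(0)=0$ and

$$\varphi'(x)=-\log(h)\,h^{-x}+\log(h)=\log(h)\left(1-h^{-x}\right),$$

which is negative for $x<0$ and positive for $x>0$ (because $h>1$). Hence $\varphi$ attains its global minimum $0$ only at $x=0$, so $h^{-x}>1-x\log(h)$ strictly for all $x>0$, which is exactly what is needed.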

M. Winter
  • Are you sure that your transformed expression is correct? Somehow I can't quite get the correct answer. The rest looks fine. – PAS Nov 17 '17 at 14:07
  • @PAS Which one exactly? – M. Winter Nov 17 '17 at 14:08
  • I'm very sorry. You are totally correct. I should take a break. – PAS Nov 17 '17 at 14:10
  • @PAS I edited for clarification. I divided by $[h(x)]^T$ on the top and on the bottom. Does this look okay? – M. Winter Nov 17 '17 at 14:13
  • Yes, thank you. – PAS Nov 17 '17 at 14:16
  • I still don't quite get it. Why do you switch the greater-than sign after the sentence: "So in order to prove that $f$ is always decreasing, we need to show that for all real numbers $h>1$ and $\beta>\alpha>0$ we have"? I thought we need to have $u/v>u'/v'$? – PAS Nov 17 '17 at 15:42
  • @PAS The sign was the wrong way around, but the next equation in the same line is still correct. I just flipped it too early. – M. Winter Nov 17 '17 at 15:50
  • I think I found the mistake. There is a mistake in the assumption with $u'v<v'u$: $v'=Th'(x)h(x)^{T-1}$, which is negative and not positive, because $h'(x)$ is negative (since $h(x)=1+g(x)$ and $g'<0$). Therefore we have to switch the sign when we divide by $v'$, which will make the proof correct again. I will hopefully complete the proof tomorrow. – PAS Nov 18 '17 at 15:09
  • @PAS You are absolutely right! In the end you answered your own question ;). I edited my post. – M. Winter Nov 23 '17 at 08:56
  • I was looking for another proof and came across our conversation, and I reproduced the proof from scratch. Everything looks really great, but I have some problems deriving the last inequality with $h^x>1/(x\log h)+1$. Is there anything I'm missing? – PAS Feb 27 '18 at 08:55
  • The last inequality is not correct. $h^x\log h\cdot x-h^x+1>0$ is $h^x>1/(1-x\log h)$, which is true but pretty hard to prove. I was thinking about a logarithmic inequality, but I can't quite transform the inequality to construct the last part of the proof, which is annoying because it really completes the proof. – PAS Feb 27 '18 at 14:01
  • @PAS The last inequality is indeed wrong and should be $$h^x>\frac1{1-x\log h}.$$ This only holds for $x>1/\log(h)$, but I will need some time to adjust the post. I am also not sure yet, what are the consequences for your original question. – M. Winter Feb 27 '18 at 14:03
  • The consequence for my original post would be that $1/\alpha...<1/\beta...$ would not be true, which is the last statement in line 4, which needs a proof. – PAS Feb 27 '18 at 14:07
  • @PAS Not in general true, but certainly still true for values $\alpha,\beta>1/\log(h)$. This would probably show where the function is decreasing instead of stating that it is decreasing everywhere. – M. Winter Feb 27 '18 at 14:11
  • The function $(h^x-1)/x$ is increasing for $h>1$ and $x\in(0,\infty)$. But somehow this is quite difficult to prove using the derivative of the function. Maybe an easier way would be to simplify the restrictions a little bit. Recall that the original post states "$T, t, t_0$ are positive integers". Therefore we can say that $x$ is a positive integer, which makes the proof (maybe) easier using mathematical induction? – PAS Feb 27 '18 at 15:07
  • @PAS $h^x>1/(1-x\log(h))$ was wrong because I forgot to flip the inequality. However, I think I fixed it by applying some logic about convex functions. – M. Winter Feb 27 '18 at 15:32
  • Wow, that's a very elegant solution to the problem, thank you very much. You are right with your statement about the dominance of the convex function over its tangent (see for example: https://math.stackexchange.com/questions/1761801/proving-that-the-tangent-to-a-convex-function-is-always-below-the-function). But I was wondering if I have to prove this statement itself. Where can I find a general proof, maybe in a book? – PAS Feb 27 '18 at 15:54
  • @PAS Nice to hear :). It is a very well-known fact about convex functions. You can read about it in the Wikipedia article on convex functions. It is property 4, and it even has a link to a source. However, the source is a book. But this result might be contained in any book on convex analysis/optimization anyway. As I said, it is pretty classic. By the way, what is wrong with the proof in your linked question? Are you looking for a particularly simple one? – M. Winter Feb 27 '18 at 17:04
  • I was looking through some books on convex and concave optimization problems and couldn't find an adequate proof (for example, "A First Course in Optimization Theory" by Sundaram). But I'm glad to hear that this statement is common knowledge. I don't have a problem with the proof in the link, even though it is pretty long and a simpler one would be better. I was just wondering whether it is such a classic statement that, if it is stated or mentioned somewhere, it requires no proof in the first place, because I didn't know about this statement even after working through some books on optimization. – PAS Feb 28 '18 at 07:55