Assume that $f\in C^1([0,\infty))$, $f'$ is monotonic on $[0,\infty)$, and $\lim\limits_{x\to\infty} f(x)=l$ is finite. If $a,b>0$, prove that $$\displaystyle\int_0^\infty\frac{f(bx)-f(ax)}{x}\,dx=(l-f(0))\ln\frac{b}{a}.$$
The trick is clear to me: I have to write the function $\frac{f(bx)-f(ax)}{x}$ as an integral, obviously with limits $a$ and $b$, so that the whole thing becomes a double integral. For example
$\displaystyle\int_a^bf'(xy)\,dy$ $\quad$ (here $f'$ is the derivative of $f$, and the integration is with respect to $y$)
works.
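Spelling out why it works: since $\frac{\partial}{\partial y}f(xy)=x\,f'(xy)$ for $x>0$, the inner integral evaluates to exactly the desired integrand:

```latex
\int_a^b f'(xy)\,dy
  = \frac{1}{x}\int_a^b \frac{\partial}{\partial y}f(xy)\,dy
  = \frac{1}{x}\Big[f(xy)\Big]_{y=a}^{y=b}
  = \frac{f(bx)-f(ax)}{x}.
```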
Now if I assume that, $\displaystyle\int_0^\infty\Big(\displaystyle\int_a^bf'(xy)dy\Big)dx=\displaystyle\int_a^b\Big(\displaystyle\int_0^\infty f'(xy)dx\Big)dy$
and $\displaystyle\int_0^\infty f'(xy)dx=\frac{l-f(0)}{y}$ $(\bigstar)$
then $\displaystyle\int_a^b\frac{l-f(0)}{y}\,dy=(l-f(0))\ln\frac{b}{a}$ gives the correct result.
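As a sanity check (not part of the proof), the claimed closed form can be verified numerically for a concrete $f$ satisfying the hypotheses, e.g. $f(x)=\arctan x$, for which $f'(x)=\frac{1}{1+x^2}$ is monotone decreasing, $f(0)=0$ and $l=\frac{\pi}{2}$. The truncation point and grid sizes below are arbitrary choices; the tail beyond the cutoff contributes $O(1/X)$ since the integrand decays like $C/x^2$.

```python
import math

def g(x, a, b):
    """Integrand (f(bx) - f(ax)) / x for f = arctan, with its limit b - a at x = 0."""
    if x == 0.0:
        return b - a
    return (math.atan(b * x) - math.atan(a * x)) / x

def simpson(func, lo, hi, n):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (hi - lo) / n
    s = func(lo) + func(hi)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * func(lo + i * h)
    return s * h / 3

a, b = 1.0, 2.0
# Fine grid near 0 where the integrand varies fastest,
# coarser grid on the slowly decaying tail (integrand ~ C/x^2 for large x).
approx = (simpson(lambda x: g(x, a, b), 0.0, 1.0, 2000)
          + simpson(lambda x: g(x, a, b), 1.0, 1000.0, 200000))
exact = (math.pi / 2 - 0.0) * math.log(b / a)   # (l - f(0)) * ln(b/a)
print(approx, exact)
```

The two printed values agree to about three decimal places, the residual error coming mostly from truncating the integral at $x=1000$.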
MY QUESTIONS
1) Why is the line $(\bigstar)$ correct? The prime in $f'(xy)$ came from differentiating with respect to $y$, so if we now integrate with respect to $x$, why do we recover $f$ again?
2) Why can we change the order of integration? We have to make use of the monotonicity of $f'$; I guess $\lim\limits_{x\to\infty}f'(x)=0$, but this alone is not sufficient.