Assume that $f$ is continuous and differentiable. Further, let $g(r,t)\geq 0$ be such that for $t>x$ $$ \lim_{r \to \infty} \frac{g(r,t)}{g(r,x)} = 0. $$
Show that $$ \lim_{r \to \infty} \frac{\int_x^{\infty} (f(x)-f(t)) g(r,t)dt}{\int_x^{\infty} (x-t) g(r,t)dt} = f'(x), $$ where all the functions are also sufficiently integrable for the claim to make sense.
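As a sanity check (not part of the statement), here is a small numerical experiment with one hypothetical kernel satisfying the hypothesis, namely $g(r,t) = e^{-r(t-x)}$, and $f = \sin$; the ratio of the two integrals should then approach $f'(x) = \cos(x)$ as $r$ grows.

```python
# Numerical sanity check of the claimed limit, assuming the concrete
# (hypothetical) kernel g(r, t) = exp(-r*(t - x)).  This g is nonnegative and
# satisfies g(r, t)/g(r, x) = exp(-r*(t - x)) -> 0 for every t > x as r -> oo.
import numpy as np
from scipy.integrate import quad

def ratio(f, x, r):
    """Ratio of the two integrals in the claim for a single value of r."""
    g = lambda t: np.exp(-r * (t - x))                        # assumed kernel
    num, _ = quad(lambda t: (f(x) - f(t)) * g(t), x, np.inf)  # numerator
    den, _ = quad(lambda t: (x - t) * g(t), x, np.inf)        # denominator
    return num / den

f, x = np.sin, 0.7            # test function; f'(x) = cos(0.7) ~ 0.7648
for r in (1, 10, 100, 1000):
    print(f"r = {r:5d}   ratio = {ratio(f, x, r):.6f}   f'(x) = {np.cos(x):.6f}")
```

With this particular kernel the denominator can also be evaluated in closed form, $\int_x^{\infty} (x-t)\,e^{-r(t-x)}\,dt = -1/r^2$, and the printed ratios approach $\cos(0.7)$ as $r$ increases.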
Some notes:
From the assumptions it follows that $g(r,t)/g(r,x) \to 0$ for every $t > x$, while the ratio is identically $1$ at $t = x$, so informally the normalized kernel $g(r,\cdot)$ behaves like a Dirac delta $\delta_x$ concentrated at $x$ as $r \to \infty$. Then we can divide the numerator and the denominator by $g(r,x)$. But I can't immediately see why that would reduce things to $f'(x)$, since the integrals in the numerator and the denominator are taken separately. Are some additional assumptions needed?
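For intuition (only a heuristic sketch, assuming enough smoothness to write $f(t) = f(x) + f'(x)(t-x) + o(|t-x|)$ near $x$): substituting this expansion into the numerator gives $$ \int_x^{\infty} (f(x)-f(t))\, g(r,t)\, dt = f'(x)\int_x^{\infty} (x-t)\, g(r,t)\, dt + \int_x^{\infty} o(|t-x|)\, g(r,t)\, dt, $$ so the ratio equals $f'(x)$ plus the remainder term divided by $\int_x^{\infty} (x-t)\, g(r,t)\, dt$. The question seems to come down to why that last quotient vanishes as $r \to \infty$, and whether that needs more than the stated hypothesis on $g$.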
Just my first ideas
– mick Nov 17 '23 at 00:19