Prove that if $f(x)$ and $g(x)$ are defined on $[a, \infty)$, where $f(x)$ is decreasing with $$\lim_{x \to \infty}f(x)=0,$$ $g(x)$ is continuous on its domain, and the integral satisfies $$\left|\int_a^x g(t)\,dt\right| \le M$$ for every $x$ in the given interval, then $$\int_a^\infty f(t)g(t)\,dt$$ converges.

I tried the generalized mean value theorem but it didn't work.
- Please show your efforts on the question to avoid it being closed or heavily downvoted. See: [ask] a good question – sai-kartik May 29 '20 at 18:30
- Proved here along with Abel's test. You need the Second Mean Value Theorem for integrals. – RRL May 29 '20 at 18:48
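Following RRL's comment, here is a minimal sketch of the key estimate via the Second Mean Value Theorem (this is Dirichlet's test for improper integrals). Since $f$ is decreasing and $\lim_{x\to\infty}f(x)=0$, we have $f \ge 0$ on $[a,\infty)$. For $a \le x < y$, Bonnet's form of the Second Mean Value Theorem gives some $\xi \in [x,y]$ with $$\int_x^y f(t)g(t)\,dt = f(x)\int_x^\xi g(t)\,dt,$$ and hence $$\left|\int_x^y f(t)g(t)\,dt\right| = f(x)\left|\int_a^\xi g(t)\,dt - \int_a^x g(t)\,dt\right| \le 2M\,f(x) \to 0 \quad \text{as } x \to \infty.$$ The Cauchy criterion for improper integrals then yields the convergence of $\int_a^\infty f(t)g(t)\,dt$.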