I am having trouble with the derivative of a Riemann integral while trying to obtain the distribution of a random variable that is a function of another random variable.
Let $ X $ be a random variable with density function $ f_X $ and $ Y = g(X) = aX+b $.
The calculations up to the point of confusion:
$ P( Y \leq y ) = P( aX+b \leq y ) $, which gives:
$ P( X \leq (y-b)/a ) $ for $ a > 0 $,
$ P( X \geq (y-b)/a ) = 1 - P( X \leq (y-b)/a ) $ for $ a < 0 $ (since $ X $ is continuous).
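As a quick numerical sanity check of this case split (the standard normal $X$ here is my own illustrative choice, not part of the problem):

```python
# Numerical check of the case split for P(Y <= y) with Y = aX + b,
# using a standard normal X purely as an illustrative choice.
from statistics import NormalDist

X = NormalDist(0, 1)

def cdf_Y(y, a, b):
    if a > 0:
        return X.cdf((y - b) / a)
    return 1.0 - X.cdf((y - b) / a)   # a < 0 flips the inequality

# Y = 2X + 1 and Y = -2X + 1 are both N(1, 2^2) by symmetry of X.
exact = NormalDist(1, 2)
for y in (-3.0, 0.0, 1.0, 4.0):
    assert abs(cdf_Y(y, 2, 1) - exact.cdf(y)) < 1e-12
    assert abs(cdf_Y(y, -2, 1) - exact.cdf(y)) < 1e-12
```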
We have a general rule (the Leibniz integral rule); I write the limits as $ u(y) $ and $ v(y) $ to avoid a clash with the constants $ a $, $ b $:
$ \displaystyle\frac{\mathrm{d}}{\mathrm{d}y}\int_{u(y)}^{v(y)} f(y,t)\,\mathrm{d}t = v^\prime(y)\, f( y, v(y) ) - u^\prime(y)\, f( y, u(y) ) + \displaystyle\int_{u(y)}^{v(y)} \frac{\partial f}{\partial y}(y,t)\,\mathrm{d}t $
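The rule can be verified numerically on a concrete example (my own choice: integrand $ e^{-yt} $, limits $0$ and $y$, with Simpson quadrature and a central difference standing in for the exact integral and derivative):

```python
import math

def integral(f, lo, hi, n=2000):
    """Composite Simpson's rule (n must be even)."""
    h = (hi - lo) / n
    s = f(lo) + f(hi)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(lo + i * h)
    return s * h / 3

# Illustrative choice: f(y, t) = exp(-y*t), lower limit 0, upper limit y.
f     = lambda y, t: math.exp(-y * t)
df_dy = lambda y, t: -t * math.exp(-y * t)   # partial derivative in y
u, du = (lambda y: 0.0), (lambda y: 0.0)
v, dv = (lambda y: y),   (lambda y: 1.0)

def leibniz(y):
    """Right-hand side of the Leibniz rule: two boundary terms plus
    the integral of the partial derivative of the integrand."""
    return (dv(y) * f(y, v(y)) - du(y) * f(y, u(y))
            + integral(lambda t: df_dy(y, t), u(y), v(y)))

def numeric_derivative(y, h=1e-5):
    """Central-difference derivative of the integral itself."""
    F = lambda z: integral(lambda t: f(z, t), u(z), v(z))
    return (F(y + h) - F(y - h)) / (2 * h)

assert abs(leibniz(1.3) - numeric_derivative(1.3)) < 1e-6
```

Here the third term does not vanish, because the integrand $ e^{-yt} $ depends on $ y $; that is the generic situation the rule covers.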
Which in our case (lower limit $ -\infty $, upper limit $ (y-b)/a $, integrand $ f_X(t) $) resolves to:
the lower-limit term vanishes, since the lower limit is the constant $ -\infty $;
the upper-limit term gives $ \frac{\mathrm{d}}{\mathrm{d}y}\!\left(\frac{y-b}{a}\right) f_X\!\left(\frac{y-b}{a}\right) = \frac{1}{a}\, f_X\!\left(\frac{y-b}{a}\right) $, and for $ a < 0 $ the minus sign from the $ 1 - P(\cdot) $ case turns $ \frac{1}{a} $ into $ |a|^{-1} $, so both cases give $ |a|^{-1} f_X\!\left(\frac{y-b}{a}\right) $.
But what does the last term, the integral of the partial derivative of the integrand with respect to $ y $, account for? My answer sheet says the whole expression equals $ |a|^{-1} f_X( (y-b)/a ) $, but I do not understand why.
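For what it is worth, a numerical check (again with a standard normal $ X $ as my own illustrative choice) confirms that $ |a|^{-1} f_X( (y-b)/a ) $ by itself already differentiates the case-split CDF, i.e. the remaining integral term cannot be contributing anything here:

```python
# Check that |a|^{-1} f_X((y-b)/a) matches the numerical derivative of
# the case-split CDF of Y = aX + b. X ~ N(0, 1) is an illustrative choice.
from statistics import NormalDist

X = NormalDist(0, 1)

def cdf_Y(y, a, b):
    # P(Y <= y) via the two cases derived above.
    return X.cdf((y - b) / a) if a > 0 else 1.0 - X.cdf((y - b) / a)

def pdf_Y(y, a, b):
    # Candidate density from the answer sheet's expression.
    return X.pdf((y - b) / a) / abs(a)

h = 1e-5
for a in (2.0, -2.0):
    for y in (-1.0, 0.5, 3.0):
        numeric = (cdf_Y(y + h, a, 1.0) - cdf_Y(y - h, a, 1.0)) / (2 * h)
        assert abs(numeric - pdf_Y(y, a, 1.0)) < 1e-7
```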