I recently learned about computing expected values via the Darth Vader rule, which states that $$E(X) = \int_{0}^{\infty}\left(1-F(x)\right)\, dx$$
I decided to try this on a uniform random variable $X$ defined over $[3,8]$, for which, intuitively, $E(X)$ should be $5.5$. However, when I applied the Darth Vader rule, I got a different result:
Assuming $X$ is uniform over $[3,8]$, its PDF is $f(x) = \dfrac{1}{5}$ and its CDF is $F(x)=\dfrac{x-3}{5}$ for $3 \le x \le 8$.
$$E(X)=\int_{3}^{8}1-\left(\dfrac{x-3}{5}\right) dx = 2.5$$
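To double-check that the arithmetic itself is right, here is a quick numerical sanity check of that integral (a sketch of my own; the function names `cdf` and `integrate_mid` are mine, not from any particular library):

```python
def cdf(x):
    # CDF of Uniform(3, 8) on its support [3, 8]
    return (x - 3) / 5

def integrate_mid(f, a, b, n=100_000):
    # midpoint-rule approximation of the integral of f over [a, b]
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# integrate 1 - F(x) over [3, 8]
print(integrate_mid(lambda x: 1 - cdf(x), 3, 8))  # ≈ 2.5
```

So the integration over $[3,8]$ really does evaluate to $2.5$; the arithmetic is not the issue.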
Which is a bit unexpected. I have tried this with other uniform distributions $U(a,b)$, and it seems that instead of $E(X)=\dfrac{a+b}{2}$, the Darth Vader rule always produces $E(X)= \dfrac{a+b}{2}-a$.
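Here is a small script confirming that pattern numerically for a few $(a,b)$ pairs (again a sketch of my own, using a midpoint rule; `expected_via_rule` is a name I made up):

```python
def expected_via_rule(a, b, n=100_000):
    # Integrate 1 - F(x) over [a, b] only, where F is the CDF of
    # Uniform(a, b): F(x) = (x - a) / (b - a). Midpoint rule.
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * h
        total += 1 - (x - a) / (b - a)
    return total * h

for a, b in [(3, 8), (0, 10), (2, 4)]:
    print(a, b, expected_via_rule(a, b), (a + b) / 2 - a)
```

In every case the integral over $[a,b]$ matches $\dfrac{a+b}{2}-a$, i.e. it is always short by exactly $a$.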
What did I do wrong here? Any comments or insights would be much appreciated.