
I am trying to find what $E(X^{0})$ is, if $X$ is a random variable. My approach is to first find $X^{0}$. I know that for any value $x$, if $x \neq 0$, then $x^0 = 1$. However, if we have a random variable that can take the value zero, wouldn't the expectation be undefined at the point $X = 0$? But I know that for any random variable $X$, $E(X^{0}) = 1$. I am confused about what I am missing here. Could anyone offer some tips? Thanks!

user123276
  • 3,445

2 Answers

0

Keep in mind the definition of expected value: $E[X^k] = \int\limits_{-\infty}^\infty x^k f(x)\,dx$ for a continuous random variable with probability density function $f(x)$. That is, the expected value is the area under a curve, and the area under a single point of a curve is always 0. So the fact that the integrand is undefined at a single point (or even at any finite number of points) does not reduce or interfere with the area under the curve. To be really precise, we could rewrite the integral as the sum of two one-sided limits that exclude the point $x = 0$: $E[X^0] =\lim\limits_{a \to 0^-}{\int\limits_{-\infty}^a{x^0f(x)\,dx}} + \lim\limits_{b \to 0^+}{\int\limits_{b}^\infty{x^0f(x)\,dx}} = \int\limits_{-\infty}^\infty f(x)\,dx = 1$. The value of the integral is not changed by the single undefined point.
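For concreteness, here is a quick numerical sanity check, taking a standard normal $X$ as the continuous example: integrating $x^0 f(x)$ over the two half-lines $(-\infty, 0)$ and $(0, \infty)$, so the point $x = 0$ never enters the computation, still gives 1.

```python
# Numerical sketch: X ~ N(0, 1); integrate x^0 * f(x) over (-inf, 0) and
# (0, inf) separately, so x = 0 is excluded. The two pieces still sum to 1.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

left, _ = quad(lambda x: x**0 * norm.pdf(x), -np.inf, 0)
right, _ = quad(lambda x: x**0 * norm.pdf(x), 0, np.inf)
print(left + right)  # approximately 1.0
```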

0

By definition, $$\mathbb E[X^0] = \int_\Omega X^0\ \mathsf d\mathbb P = \int_\Omega \ \mathsf d\mathbb P = \mathbb P(\Omega)=1. $$ Alternatively, using the "law of the unconscious statistician," that is: $$\mathbb E[g(X)] = \int_{\mathbb R} g(x)\ \mathsf dF(x), $$ where $F$ is the cumulative distribution function of $X$, setting $g(x)=x^0$ we find that $$\mathbb E[X^0] = \int_{\mathbb R} x^0\ \mathsf dF(x)=\int_{\mathbb R} \ \mathsf dF(x) = 1. $$
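As a quick empirical sanity check, a Monte Carlo estimate of $\mathbb E[X^0]$ comes out to exactly 1, since every sampled term $x_i^0$ equals 1 (NumPy follows the convention $0^0 = 1$, so samples equal to zero pose no problem).

```python
# Monte Carlo sketch: the sample mean of X**0 is exactly 1, because each
# sampled value raised to the 0th power is 1. NumPy uses the convention
# 0**0 == 1, so samples equal to zero cause no trouble.
import numpy as np

rng = np.random.default_rng(0)
x_cont = rng.normal(size=100_000)            # continuous example: N(0, 1)
x_disc = rng.integers(0, 5, size=100_000)    # discrete example that can equal 0

print(np.mean(x_cont**0))   # 1.0
print(np.mean(x_disc**0))   # 1.0
```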

Math1000
  • 36,983