
(a) Let $X$ be an exponential random variable with parameter $\lambda$. Find the moment generating function of $X$.
(b) Suppose a continuous random variable $Y$ has moment generating function $M_Y(s)= \frac{\lambda^2}{(\lambda-s)^2}$ for $s<\lambda$ and $M_Y(s)=+\infty$ for $s\ge \lambda$.
Find the probability density function of $Y$.

So (a) is simple: $M_X(s)=E(e^{sX})=\int_0^\infty e^{sx}\lambda e^{-\lambda x }\,dx = \frac{\lambda}{\lambda-s }$ for $s<\lambda$, and $M_X(s)=+\infty$ for $s\ge \lambda$.
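As a quick numerical sanity check of (a), one can approximate the integral $E[e^{sX}]$ directly and compare it with $\frac{\lambda}{\lambda-s}$. The specific values $\lambda=2$, $s=1$ below are just illustrative choices, not part of the problem:

```python
# Sketch: numerically approximate E[e^{sX}] for X ~ Exponential(lam)
# and compare with the closed form lam / (lam - s). Values lam = 2,
# s = 1 are illustrative assumptions.
import math

lam, s = 2.0, 1.0

def integrand(x):
    # e^{sx} * lam * e^{-lam x}, the integrand of E[e^{sX}]
    return math.exp(s * x) * lam * math.exp(-lam * x)

# Trapezoidal rule on [0, 40]; the tail beyond 40 is negligible
# since the integrand decays like e^{-(lam - s) x}.
n, upper = 200000, 40.0
h = upper / n
total = 0.5 * (integrand(0.0) + integrand(upper))
total += sum(integrand(i * h) for i in range(1, n))
approx = total * h

print(round(approx, 4))  # ≈ 2.0 = lam / (lam - s)
```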

For (b) I do see a relation between $X$ and $Y$, namely $M_Y(s)=(M_X(s))^2$.

Some of my ideas: if I differentiate $M_Y(s)$ I can obtain the expected value, but I am not sure how that would be helpful. I know that this requires manipulating $M_X(s)$, but I also know that I cannot simply square the pdf of $X$. I would appreciate some help.

thisisme

2 Answers


Recall that if $X_1, \ldots, X_n$ are IID with the same distribution as $X$, and $Y = \sum_{i=1}^n X_i$, then $$M_Y(s) = (M_X(s))^n.$$ The proof is simple: $$M_Y(s) = \operatorname{E}[e^{sY}] = \operatorname{E}[e^{s(X_1 + \cdots + X_n)}] = \operatorname{E}[e^{sX_1} e^{sX_2} \cdots e^{sX_n}] \overset{\text{ind}}{=} \prod_{i=1}^n \operatorname{E}[e^{sX_i}] = \prod_{i=1}^n M_{X_i}(s) = (M_X(s))^n.$$ Therefore, if $M_Y(s) = (M_X(s))^2$, then $Y = X_1 + X_2$ where $X_1, X_2 \sim \operatorname{Exponential}(\lambda)$ are independent; that is to say, $Y$ is gamma distributed with shape $2$ and rate $\lambda$ (if $X$ was parametrized by rate as well), so its density is $f_Y(y) = \lambda^2 y e^{-\lambda y}$ for $y > 0$.

For additional details, please refer to the following post: Gamma distribution out of sum of exponential random variables.
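One can sanity-check numerically that this gamma density $f_Y(y) = \lambda^2 y e^{-\lambda y}$ does reproduce the given MGF $\frac{\lambda^2}{(\lambda-s)^2}$; the values $\lambda = 2$, $s = 1$ below are illustrative assumptions:

```python
# Sketch: numerically compute E[e^{sY}] for the Gamma(shape=2, rate=lam)
# density f_Y(y) = lam^2 * y * e^{-lam y} and compare with
# lam^2 / (lam - s)^2. Values lam = 2, s = 1 are illustrative.
import math

lam, s = 2.0, 1.0

def integrand(y):
    # e^{sy} * f_Y(y), the integrand of E[e^{sY}]
    return math.exp(s * y) * lam**2 * y * math.exp(-lam * y)

# Trapezoidal rule on [0, 40]; the tail beyond 40 is negligible
# since the integrand decays like y * e^{-(lam - s) y}.
n, upper = 200000, 40.0
h = upper / n
total = 0.5 * (integrand(0.0) + integrand(upper))
total += sum(integrand(i * h) for i in range(1, n))
mgf_numeric = total * h

mgf_closed = lam**2 / (lam - s)**2  # = 4.0 for these values
print(round(mgf_numeric, 4), mgf_closed)
```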

heropup
  • Ahh, I had forgotten this proof. It now makes sense as to why Y is the sum of $X_1$ and $X_2$. Thank you for that. I am still, however, confused as to how you figured out that the pdf of $Y$ is gamma distributed. – thisisme May 23 '17 at 01:25
  • The sum of independent exponentials with the same parameter is Erlang, which is a special case of gamma. It's a well-known fact. – Batman May 23 '17 at 01:48

If you add two independent random variables, you convolve their PDFs (i.e. if the densities are $f, g$, their sum has density $h(t) = \int f(s)g(t-s)\,ds$), or you multiply their moment generating functions.
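The convolution can be checked numerically for two Exponential($\lambda$) densities: the result should match the Gamma$(2,\lambda)$ density $\lambda^2 t e^{-\lambda t}$. A minimal sketch, with $\lambda = 2$ and the evaluation point $t = 1$ chosen purely for illustration:

```python
# Sketch: evaluate h(t) = ∫_0^t f(s) f(t-s) ds for f the Exponential(lam)
# density, and compare with the closed form lam^2 * t * e^{-lam t}.
# Values lam = 2, t = 1 are illustrative assumptions.
import math

lam, t = 2.0, 1.0

def f(x):
    # Exponential(lam) density, zero on the negative half-line
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

# Trapezoidal rule on [0, t] (outside this range one factor is zero)
n = 100000
h_step = t / n
vals = [f(i * h_step) * f(t - i * h_step) for i in range(n + 1)]
conv = (0.5 * vals[0] + sum(vals[1:-1]) + 0.5 * vals[-1]) * h_step

closed = lam**2 * t * math.exp(-lam * t)
print(round(conv, 6), round(closed, 6))
```

(Here the integrand $\lambda e^{-\lambda s} \cdot \lambda e^{-\lambda(t-s)} = \lambda^2 e^{-\lambda t}$ is constant in $s$, so the quadrature is essentially exact.)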

Batman
  • So it's $M_X(s)M_Y(s)=\frac{\lambda^3}{(\lambda-s)^3}$? If this is correct, can you please explain why multiplying their moment generating functions works? I am not really understanding it. – thisisme May 23 '17 at 01:14
  • No, its $M_Y(s)=M_{X_1}(s)M_{X_2}(s)$ where $X_1,X_2$ are independent random variables with an identical distribution to $X$ (ie exponential). Then $Y$ is the sum of two independent exponential random variables; $Y:=X_1+X_2$. What distribution does this have? – Graham Kemp May 23 '17 at 01:20
  • Assume $X,Y$ independent. Then $M_{X+Y}(s) = E[e^{s(X+Y)}] = E[e^{sX}e^{sY}]$, which by independence is $E[e^{sX}] E[e^{sY}]= M_X(s) M_Y(s)$. As for convolving them, just write down $P(X+Y \leq s)$ and differentiate with respect to $s$ to see that. – Batman May 23 '17 at 01:24
  • I see, thank you for your help! – thisisme May 23 '17 at 01:31