
Let $X_1$ and $X_2$ be independent $N(0, \sigma^2)$ (that is, mean $0$ and variance $\sigma^2$) random variables. What is the distribution of $X_1^2 + X_2^2$?

My approach is that $X_1\sim N(0, \sigma^2)$ and $X_2\sim N(0, \sigma^2)$.

Transforming $X_1$ and $X_2$ into standard normal, $X_1/\sigma\sim N(0, 1)$ and $X_2/\sigma\sim N(0, 1)$.

Then $X_1^2/\sigma$ and $X_2^2/\sigma$ have chi-squared distribution with 1 degree of freedom.

Then I found the moment-generating functions for $X_1^2$ and $X_2^2$: $$m_{X_1^2}(t) = (1-2t)^{-1/2}$$ and $$m_{X_2^2}(t) = (1-2t)^{-1/2}$$

So the moment generating function for $X_1^2 + X_2^2$ is $$m_{X_1^2}(t) m_{X_2^2}(t) = (1-2t)^{-2/2}$$

So $X_1^2 + X_2^2$ has a chi-squared distribution with 2 degrees of freedom. My question is: can I treat $X_1^2/\sigma + X_2^2/\sigma$ as $X_1^2 + X_2^2$ like I did above?
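One way to probe this numerically is to simulate the scaled sum $(X_1^2 + X_2^2)/\sigma^2$ and compare it against $\chi_2^2$. This is only a Monte Carlo sketch using the Python standard library; the values of `sigma` and the sample size are arbitrary choices:

```python
import math
import random

random.seed(0)

sigma = 1.7   # arbitrary choice; any sigma > 0 behaves the same
n = 200_000

# S = (X1^2 + X2^2) / sigma^2 for independent X1, X2 ~ N(0, sigma^2)
s = [(random.gauss(0, sigma) ** 2 + random.gauss(0, sigma) ** 2) / sigma ** 2
     for _ in range(n)]

mean_s = sum(s) / n                  # chi^2 with 2 df has mean 2
p_le_2 = sum(v <= 2 for v in s) / n  # its CDF at 2 is 1 - e^{-1}

print(mean_s, p_le_2)
```

If the $1/\sigma^2$ scaling is dropped, both checks fail unless $\sigma = 1$, which is exactly the issue raised in the comments.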

  • You mean $\frac{X_1^2}{\sigma^2}$ and $\frac{X_2^2}{\sigma^2}$ have chi-squared distributions with 1 degree of freedom. – afedder May 14 '14 at 21:11
  • yes, it is what I mean – user111548 May 14 '14 at 21:12
  • This question is a duplicate of this one by a different user or possibly the same user posting under a pseudonym because the work shown is exactly the same. – Dilip Sarwate May 14 '14 at 21:13
  • So try deriving the moment-generating function with these scaled random variables (by the factor $\frac{1}{\sigma^2}$)...shouldn't be too hard if you already know how to derive the moment-generating function normally. – afedder May 14 '14 at 21:14
  • @afedder: Do you think it should be $X_1/\sigma \sim N(0, 1)$ and $X_2/\sigma \sim N(0, 1)$, or $X_1/\sigma^2 \sim N(0, 1)$ and $X_2/\sigma^2 \sim N(0, 1)$? – user111548 May 14 '14 at 21:16
  • $X_1^2/\sigma^2$ This should be a chi-squared distribution, right – user111548 May 14 '14 at 21:19
  • It should be $$\frac{X_1}{\sigma} \sim N(0,1), \quad \frac{X_2}{\sigma} \sim N(0,1)\,,$$ so to obtain chi-squared distributions with 1 degree of freedom, square each of these. – afedder May 14 '14 at 21:22
  • Recall that the sum of chi-squared random variables with 1 degree of freedom is chi-squared with degrees of freedom equal to the number of summands – afedder May 14 '14 at 21:23
  • what are summands? – user111548 May 14 '14 at 21:24
  • look to my answer – afedder May 14 '14 at 21:27
  • @user111548 : Please: Write $X\sim N$, not $X$~$N$. Remember that TeX was invented precisely for the purpose of handling all sorts of things like this. It's strange to think it's incapable of what it was invented for. I edited your question accordingly. – Michael Hardy Jun 02 '14 at 02:51

5 Answers


Hint: Recall that if $X_1,X_2,\ldots,X_n$ are independent and identically distributed as $\chi_{1}^{2}$, then $$\sum_{i=1}^{n} X_{i} \sim \chi_{n}^{2}\,,$$ and that if $Z \sim N(0,1)$, then $$Z^2 \sim \chi_{1}^{2}\,.$$

Additional hint/spoiler: By the above, for independent $X,Y \sim \chi_{1}^{2}$, it follows that $X+Y \sim \chi_2^2$. It can be shown that $X+Y$ is equal in distribution to $W$ for $W \sim \text{Exp}(\frac{1}{2})$. You should verify this, and then you are basically finished. To do this, prove that $X \sim \chi_n^2$ has density given by $$f(x \mid n) = \frac{1}{2^{n/2}\Gamma(n/2)}x^{n/2-1}e^{-x/2} \quad\text{for $x>0$}\,.$$ Then, see that the density of $X \sim \chi_2^2$ is $$f(x \mid 2) = \frac{1}{2}e^{-\frac{1}{2}x} \quad\text{for $x>0$}\,,$$ and this is the density of a random variable from an exponential distribution with parameter $\frac{1}{2}$.

Another relevant derivation: Suppose $X$ is a random variable and $Z = aX$ for some $a > 0$ (the only case that applies here, since $\sigma^2>0$). Then the cumulative distribution function of $Z$ is given by $$F_Z(z) =\mathbb{P}(Z \leq z)=\mathbb{P}(aX \leq z)=\mathbb{P}\left(X\leq \frac{z}{a}\right)=F_X\left(\frac{z}{a}\right)\,,$$ where $F_X$ is the cumulative distribution function of $X$. Differentiating, the density function of $Z$, which we will denote $f_Z$, is $$f_Z(z)=\frac{\text{d}}{\text{d}z}F_Z(z)=\frac{\text{d}}{\text{d}z}F_X\left(\frac{z}{a}\right)=\frac{1}{a}f_X\left(\frac{z}{a}\right) \,,$$ where $f_X$ is the density function of $X$.

Final hint: The above hints are in the order of usage.

The following is a solution based on the above hints, for future readers' benefit.
Notice that we can standardize $X_1$ and $X_2$, so that $$\frac{X_i - 0}{\sigma} = \frac{X_i}{\sigma} \sim N(0,1) \,\,\,\,\,\text{for} \,\,i=1,2\,\,.$$ It follows that $$\left(\frac{X_i}{\sigma}\right)^2 \sim \chi_1^2 \,\,\,\,\,\text{for}\,\,i=1,2\,\,,$$ so that $$\left(\frac{X_1}{\sigma}\right)^2 + \left(\frac{X_2}{\sigma}\right)^2=\frac{1}{\sigma^2}(X_1^2 + X_2^2) \sim \chi_2^2\,\,.$$ Also, it is not difficult to verify that a $\chi_2^2$ random variable is equivalent in distribution to an $\text{Exp}(\frac{1}{2})$ random variable, so $$\frac{1}{\sigma^2}(X_1^2 + X_2^2) \sim \text{Exp}\left(\frac{1}{2}\right)\,.$$ Now, let $X = \frac{1}{\sigma^2}(X_1^2 + X_2^2)$ and $a=\sigma^2>0$, so that $Z=aX=X_1^2 + X_2^2$, and apply the last hint. We know that the density function of $X$ is given by $$f_X(x)=\frac{1}{2}e^{-\frac{1}{2}x},$$ and it follows that the density function of $Z$ is $$f_Z(z)= \frac{1}{\sigma^2}f_X\left(\frac{z}{\sigma^2}\right)=\frac{1}{2\sigma^2}e^{-\frac{1}{2\sigma^2}z} \,\,.$$ We recognize this as the density for a random variable from an exponential distribution with parameter $\frac{1}{2\sigma^2}$. In other words, $$X_1^2 + X_2^2 \sim \text{Exp}\left(\frac{1}{2\sigma^2}\right)\,\,.$$ @DilipSarwate @FelixMarin @user3001408, you might be interested in this derivation.
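A quick Monte Carlo check of the final conclusion: the empirical survival function of $X_1^2 + X_2^2$ should match $e^{-z/(2\sigma^2)}$ at every $z > 0$. This is a standard-library Python sketch; `sigma` and the test points are arbitrary choices:

```python
import math
import random

random.seed(1)

sigma = 1.3  # arbitrary choice; the conclusion holds for any sigma > 0
rate = 1 / (2 * sigma ** 2)
n = 200_000

# simulate X1^2 + X2^2 for independent X1, X2 ~ N(0, sigma^2)
samples = [random.gauss(0, sigma) ** 2 + random.gauss(0, sigma) ** 2
           for _ in range(n)]

# for Z ~ Exp(rate), the survival function is P(Z > z) = exp(-rate * z)
checks = []
for z in (1.0, 3.0, 6.0):
    empirical = sum(v > z for v in samples) / n
    checks.append((empirical, math.exp(-rate * z)))

print(checks)
```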

afedder
  • @DilipSarwate: That's why I said to the OP to check what will happen to the distribution if you scale by $1/\sigma^{2}$ using the MGF derivation – afedder May 14 '14 at 21:44
  • From the above, we know that $$\frac{1}{\sigma^{2}}(X_1^2 + X_2^2) \sim \chi_{2}^{2}.$$ This is certainly useful information, although it is not the entire solution. – afedder May 14 '14 at 21:51
  • So $cU \sim \chi_{2}^{2}$ for a constant $c>0$ and a random variable $U$. Can we figure out the distribution of $U$? The answer is, of course, at least in this case. – afedder May 14 '14 at 21:54
  • The key is to figure out how multiplying by a constant changes a chi-squared distribution in general. – afedder May 14 '14 at 22:01
  • @afedder: I am still stuck on how to use $X_1^2/ \sigma^2$ to derive the moment generating function. Could you help with this? – user111548 May 17 '14 at 03:18
  • I will write it out and get back to you tonight or tomorrow...@Dilip Sarwate also had a good answer on your previous duplicate post using the double integration. This is how you should approach this type of question in general, but when I wrote the hint, I thought there might be a shortcut using the approach of transformations of a known distribution. If I find a better solution than what @Dilip Sarwate posted, I will let you know (maybe add another answer)...the other approach posted below with the delta and step functions is a little too complicated for my taste, but that works as well. – afedder May 17 '14 at 07:43
  • I am kind of lost at the end of his answer when he says $X_1^2 + X_2^2$...what is that going to be? – user111548 May 17 '14 at 08:30
  • He pretty much spoon fed you the answer...did the entire calculation, all you have to do is conclude the distribution. He tells you the cumulative distribution for a general exponential random variable with parameter $\lambda$...all you have to do is compare the constants in his answer to $\lambda$ in the general function and you will figure out the parameter of this exponential distribution (he has already confirmed for you that it is exponential, I wouldn't have even done this much after going through the calculation). Do some work on your own homework, man. @user111548 – afedder May 17 '14 at 09:05
  • Like I said, I might post another solution that might clarify things further for you. – afedder May 17 '14 at 09:06
  • How did he get from second equality to third equality? – user111548 May 17 '14 at 09:16
  • The second equality is using the fact that the random variables are independent and the density function of a normal distribution with mean $0$ and variance $\sigma^{2}$...the third equality is a shift from rectangular to polar coordinates - this type of substitution is learned in a multivariable calculus course (usually required for an intro probability theory course). Recall that $\text{d}x\text{d}y$ is substituted with $r\text{d}r\text{d}\theta$ and that $r^2 = x^2+y^2$. – afedder May 17 '14 at 09:25
  • can you expand your last sentence a little bit? – user111548 May 17 '14 at 09:56
  • Edit: not the cumulative distribution function; he actually gives you the complement of the cdf. But since the cdf determines the distribution (it is one of many functions that do), so does this complement. My process for identifying the distribution is, however, still as I have described. – afedder May 17 '14 at 09:58
  • I'd prefer not to go into detail about polar coordinate substitutions over comment, you can look up very good explanations of this process on the internet...just type in "polar coordinates used in integration" – afedder May 17 '14 at 10:00
  • also, how does he get from fourth equality to fifth equality, especially the stuff outside the exp term disappear? – user111548 May 17 '14 at 10:04
  • if you know how to use the integration technique of substitution, make the substitution $u=-\frac{r^2}{2\sigma^2}$. – afedder May 17 '14 at 10:08
  • if I just match the stuff inside the exp, I get $\lambda = 1/2\sigma^2$...However, the pdf for an exponential distribution is $f(y) = \frac{1}{\lambda} e^{-y/\lambda}$, which means the last equality does not match the exponential pdf...does this follow an exponential distribution? – user111548 May 17 '14 at 10:36
  • Look at my addition/edit to my answer and see if this helps you. You will be required to do some calculations using whatever method you see fit (cdf method, convolution integral, etc.) – afedder May 17 '14 at 10:53
  • First of all, you are using a different convention for the parameter of an exponential function than he is...he's not using $1/\lambda$ as the parameter, but $\lambda$ – afedder May 17 '14 at 10:56
  • This means the pdf is given by $$f(y)=\lambda e^{-\lambda y}$$ for $y>0$ – afedder May 17 '14 at 10:57
  • My question is: after matching the stuff inside the exp, I got $\lambda = 1/2\sigma^2$, which I then put into the exponential pdf, giving me $2\sigma^2 e^{-z/2\sigma^2}$...This is not equal to the stuff after the last equality. Does this mean the distribution is not exponential with parameter $1/2\sigma$?...I do not see how your hint helps with this... – user111548 May 17 '14 at 10:59
  • This doesn't really matter though, you don't need to consult the pdf of an exponential random variable at all here. All you need to do is compare the constants, this is sufficient since the survival function (complement of cdf) determines the distribution. – afedder May 17 '14 at 11:00
  • What do you mean in your last answer after "since survival...? – user111548 May 17 '14 at 11:02
  • If the cdf of a RV $X$ is denoted $F(x)=P(X \leq x)$, then what is called the survival function is given by $S(x) = 1 - F(x) = P(X > x)$. He has given you the survival function (complement of cdf). You can also do what he did, except deriving the cdf rather than the survival function of $X+Y$ – afedder May 17 '14 at 11:06
  • If I understand you correctly, I should use $1 - 2\sigma e^{1/2\sigma}$ to find the pdf? If so, why we do not need match the constants? – user111548 May 17 '14 at 11:08
  • You are not understanding me correctly at all - YOU DON'T NEED THE PDF, YOU DON'T NEED TO DERIVE THE PDF (with his solution, however, you might with mine)...you have all you need with exactly what he has written – afedder May 17 '14 at 11:11
  • If I do not need to pdf, I am wondering what is the stuff after the last equality? How do you know that follows a exponential distribution? – user111548 May 17 '14 at 11:15
  • The expression that @Dilip Sawarte has derived is NOT the pdf of the random variable $X^2+Y^2$, it is the survival function of $X^2+Y^2$. Comparing constants with the general survival function for an exponential random variable with parameter $\lambda$ allows you to recognize the distribution of $X^2+Y^2$..this is what your original question was – afedder May 17 '14 at 11:15
  • Is the cdf version a similar length? – user111548 May 17 '14 at 11:19
  • for future reference, the pdf of a random variable $X$ is $P(X = x)$ for some $x$ in the domain of $X$...he didn't calculate $P(X^2 + Y^2 = z)$, he calculated $P(X^2 + Y^2 > z)$. – afedder May 17 '14 at 11:19
  • Yes, it is, just replace $>$ in the region that appears below the integral with $\leq$ and then change your computations accordingly – afedder May 17 '14 at 11:21
  • is that still involved double integration, change from rectangle to polar coordinates? – user111548 May 17 '14 at 11:23
  • I haven't worked it out, but I imagine so. There are other ways to compute the integral that he used, but the method with polar coordinates is one of, if not the easiest way to do it – afedder May 17 '14 at 11:25
  • thanks...I have a more question on my other post listed below...JPi has answer my questions but I still have several question remains. My question over there is how to find $f_X(g^{-1}(y))$? If you have time, please take a look and we can chat under that post. http://math.stackexchange.com/questions/792554/find-the-pdf-of-prod-i-1n-x-i-where-x-is-are-independent-uniform-0-1 – user111548 May 17 '14 at 11:30
  • before getting there, just want to make sure $X^2 + Y^2$ follows an exponential distribution with parameter $1/2\sigma^2$, right? – user111548 May 17 '14 at 11:36
  • I have answered two of your questions so far on this forum and you only accept/upvote answers after I have spent three hours taking you step by step through the problem...that's not what this site is for, it's for learning and helping others help themselves. You seem more interested in someone writing solutions for you. I can tutor you for pay, of course. – afedder May 17 '14 at 11:37
  • Yes, that is correct – afedder May 17 '14 at 11:38
  • My suggestion is to derive the answer to this problem in another way than the method of @DilipSarwate for practice...first, derive the pdfs of the $X_i^2$ using the cdf method or other means (like transformations of RVs) (note: they are the same) and then use the convolution integral to find the pdf of the sum – afedder May 17 '14 at 11:40
  • This will help you fully understand what is going on and give you more practice with a fairly complicated transformation for a beginner. – afedder May 17 '14 at 11:41
  • My personal method (not everyone likes this) is to find the distribution of $Z= (X_1/\sigma)^2 + (X_2/\sigma)^2$...this is $\chi_2^2$ from what I said before. Then check what the pdf of a linear transformation of a chi-squared random variable with a general $n$ degrees of freedom looks like (this can be done in many ways). Using this information, you can conclude the distribution of $\sigma^2 Z = X_1^2 + X_2^2$, which is what you wanted. – afedder May 17 '14 at 11:47
  • How does this approach give you that $X_1^2 + X_2^2$ follows an exponential distribution with parameter $1/2\sigma^2$? – user111548 May 17 '14 at 12:29
  • Use the hints I added to my answer, I need to go to bed, it's 8:30 am here. – afedder May 17 '14 at 12:32
  • Here's my final parting thought process to you: (1) Find the distribution of $(X_1/\sigma)^2 + (X_2/\sigma)^2$. (2) Call this sum a random variable $Z$. (3) Use my third hint to find the distribution of $aZ$ for $a=\sigma^2$. This will give you the pdf you want. Remember, an exponential random variable with parameter $\lambda$ has pdf $$f(y)=\lambda e^{-\lambda}$$ and recognize this form in your derivation. – afedder May 17 '14 at 12:38
  • No, I've stayed up the whole night helping you. Need to get a little sleep. – afedder May 17 '14 at 12:39
  • I REALLY appreciate that, you are one of the most patient helper! Our world will be much better with more people like you! Good sleep! – user111548 May 17 '14 at 12:39
  • No problem, just hopefully you understand it better! – afedder May 17 '14 at 12:40
  • The pdf should read $f(y)=\lambda e^{-\lambda y}$ as well, above. Thanks for the kind words. – afedder May 17 '14 at 12:51
  • @afedder: I think I am really close... I just need to make sure I know how to prove several assumptions...First, how do I prove from $X \sim N(0, \sigma^2)$ that $\frac{X}{\sigma} \sim N(0,1)$? – user111548 May 17 '14 at 19:39
  • You shouldn't even be in a statistics class if you don't know how to show that, no offense. You don't know how to find a z-score? - standardize – afedder May 17 '14 at 19:49
  • Thanks! You have $F_Z (z) = F_X (\frac{z}{a})$ in your post. My second question is what do Z and X here represent as I know X is random variable does it follow chi-squared distribution or normal distribution? My guess is Z is normal and X is chi-squared, right? – user111548 May 17 '14 at 19:52
  • No, not at all. My hint was for a general $X$ and thus a general $Z$ (since $Z$ is a multiple of $X$ in this context and $X$ is general). This applies to ANY (continuous) random variable. You need to figure out where in your solution to apply this, but I am helping you by saying that you will apply it (I also gave you another hint when I was talking about the only case we need to check and why). – afedder May 17 '14 at 20:00
  • My biggest hint so far was when I was explaining to @DilipSarwate my intentions at the beginning of the comments of this answer (the third comment). – afedder May 17 '14 at 20:03
  • $cU \sim \chi_2^2$ should still be a chi-squared distribution with 2 degrees of freedom, right, as multiplying by a constant should not change the distribution or the degrees of freedom, right? – user111548 May 17 '14 at 20:12
  • incorrect...look at the last hint to figure out what multiplying a random variable by a constant does to its density, in general – afedder May 17 '14 at 20:13
  • you mean your final hint in the post? – user111548 May 17 '14 at 20:15
  • technically, the dialogue right before "final hint" – afedder May 17 '14 at 20:15
  • The only thing I pick up that I think is relevant is Z = aX, but as you say before Z is not normal distributed and X is not chi-squared distributed . So I assume they are the same distributions? If not, what could they be? – user111548 May 17 '14 at 20:18
  • I can't really answer any more of your questions without completely giving away the answer to you...the hints I have supplied are not literal random variables (unless otherwise specified, like in the first hint), they are general random variables and you can use any random variable in place of them and the equations will be satisfied... – afedder May 17 '14 at 20:22
  • This is the last thing I will say. In the application of this specific hint, let $X=\frac{1}{\sigma^2}(X_1^2 + X_2^2)$ and $a= \sigma^2$, so that $Z=aX$. What is $Z$? (This substitution is a big hint.) – afedder May 17 '14 at 20:40
  • You already know the distribution of $X$ from our previous discussions. – afedder May 17 '14 at 20:42
  • By the way, when you say "the only thing that I pick up that is relevant is $Z=aX$", this is very disappointing indeed after this long, drawn out discussion...it is all relevant, every single thing I said – afedder May 17 '14 at 20:59

You have $$ X_{i}^{2}=\sigma^{2}Z^{2}=\sigma^{2}\,\Gamma\left(\frac{1}{2},2\right)=\Gamma\left(\frac{1}{2},2\sigma^{2}\right), $$ where $Z \sim N(0,1)$ and $\Gamma(\alpha,\theta)$ is taken in the shape-scale parametrization.

Therefore we have $$ X_{1}^{2}+X_{2}^{2}=\Gamma(1,2\sigma^{2}) $$

where we used the fact that a sum of independent $\Gamma$ random variables with a common scale parameter is again $\Gamma$, with the shape parameters added.
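This additivity can be checked by simulation. Reading $\Gamma(\alpha,\theta)$ in the shape-scale convention, $X_i^2 \sim \Gamma(\frac{1}{2}, 2\sigma^2)$, so the sum should be $\Gamma(1, 2\sigma^2)$, i.e. exponential with mean $2\sigma^2$. A standard-library Python sketch; `sigma` is an arbitrary choice:

```python
import random

random.seed(2)

sigma = 1.3              # arbitrary choice
scale = 2 * sigma ** 2   # Gamma(1, scale) is Exp with mean `scale`
n = 200_000

# sum of two independent Gamma(1/2, scale) draws (shape-scale convention),
# which is how the answer represents X1^2 and X2^2
s = [random.gammavariate(0.5, scale) + random.gammavariate(0.5, scale)
     for _ in range(n)]

mean_s = sum(s) / n
var_s = sum((v - mean_s) ** 2 for v in s) / n
print(mean_s, var_s)  # an exponential with mean `scale` has variance scale^2
```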

Bombyx mori
  • Remember, $\Gamma(\alpha,\lambda)$ is special for $\alpha=1$. – afedder May 17 '14 at 12:43
  • Also, this isn't a formal proof whatsoever. – afedder May 17 '14 at 12:46
  • No, but I believe it should be readable to the OP, who should be able to do it himself/herself. I have no intention of filling in the details of his/her homework. – Bombyx mori May 17 '14 at 12:59
  • Also, most intro probability classes don't teach the properties of the gamma distribution to the extent that you are using here - the most important property for this is the constant multiple of a chi-squared random variable of $n$ degrees of freedom (or $\Gamma(\frac{n}{2},\frac{1}{2})$ random variable), which you seem to use as if it is a trivial leap. – afedder May 17 '14 at 13:14
  • 2nd and 3rd equalities (and even 1st in terms of notation) would be very confusing to me at the level of the OP – afedder May 17 '14 at 13:19
  • @afedder: I assume that if his/her professor has assigned this problem, then he/she should be able to follow my derivation. You are right that these properties are usually not stated explicitly in intro-level textbooks. Thanks for the comment. – Bombyx mori May 17 '14 at 13:29

Not really - recall that the mgf is $$ m_X(t) = \mathbb{E}\left[e^{tX}\right] $$ and if you rescale $X$ by a constant $\sigma$, what happens to the result?
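To make the hint concrete, one can estimate $m_{X^2}(t) = \mathbb{E}[e^{tX^2}]$ for $X \sim N(0,\sigma^2)$ by Monte Carlo and compare it with $(1-2\sigma^2 t)^{-1/2}$, the rescaled version of $(1-2t)^{-1/2}$. A standard-library Python sketch; `sigma` and `t` are arbitrary choices subject to $2\sigma^2 t < 1$:

```python
import math
import random

random.seed(3)

sigma, t = 1.5, 0.1  # arbitrary choices; need 2 * sigma**2 * t < 1
n = 400_000

# Monte Carlo estimate of the MGF of X^2 at t, for X ~ N(0, sigma^2)
mgf_est = sum(math.exp(t * random.gauss(0, sigma) ** 2)
              for _ in range(n)) / n

# rescaling X by sigma stretches X^2 by sigma^2, so t is replaced by sigma^2 * t
mgf_exact = (1 - 2 * sigma ** 2 * t) ** -0.5
print(mgf_est, mgf_exact)
```

With the $\sigma^2$ factor dropped, the comparison fails unless $\sigma = 1$, which is exactly the point of the hint.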

gt6989b

You can approach it like this:

1) Calculate the distribution of $X_1^2$ and $X_2^2$ individually. Call their densities $f$ and $g$.

2) Calculate the distribution of the sum by applying the convolution integral to $f$ and $g$.

This can be one way of calculating what you are asking!

Note:

To calculate the distribution of $X_1^2$ (e.g.), you can use the CDF method: $F_{X_1^2}(x)=P(X_1^2 \le x)$. Express it in terms of $X_1$ as $P(-\sqrt{x} \le X_1 \le \sqrt{x}) = F_{X_1}(\sqrt{x}) - F_{X_1}(-\sqrt{x})$, and then differentiate to get the PDF.
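These two steps can be sketched numerically. For $X_1 \sim N(0,\sigma^2)$, the CDF method gives the density of $X_1^2$ as $f(x) = e^{-x/(2\sigma^2)}/\sqrt{2\pi\sigma^2 x}$ for $x>0$, and a midpoint-rule convolution of that density with itself should reproduce the exponential density $\frac{1}{2\sigma^2}e^{-z/(2\sigma^2)}$ identified elsewhere in the thread. A standard-library Python sketch; `sigma` and `z` are arbitrary choices:

```python
import math

sigma = 1.0  # arbitrary choice
z = 2.0      # point at which to evaluate the density of the sum
n = 200_000  # midpoint cells; midpoints avoid the endpoint singularities

def f(x):
    # density of X1^2 for X1 ~ N(0, sigma^2), obtained by the CDF method
    return math.exp(-x / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2 * x)

# convolution (f * f)(z) by the midpoint rule
h = z / n
conv = sum(f((k + 0.5) * h) * f(z - (k + 0.5) * h) for k in range(n)) * h

# exponential density with rate 1 / (2 sigma^2), evaluated at z
exact = math.exp(-z / (2 * sigma ** 2)) / (2 * sigma ** 2)
print(conv, exact)
```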

afedder
  • What a perfectly dreadful way of finding the answer where no advantage is being taken of the fact that $X_1$ and $X_2$ are zero-mean independent normal random variables with the same variance. This approach would "work" in general for arbitrary independent random variables but why bother trying to touch your ear after wrapping your arm around your head two times? – Dilip Sarwate May 14 '14 at 23:04
  • I would be interested to learn if you have something better to add. – user3001408 May 14 '14 at 23:09
  • Sure. Look at this answer to the question which I have already pointed (see my comment on the main question) as a duplicate of this one. – Dilip Sarwate May 14 '14 at 23:22
  • @dilip Yes, it is good! I agree the convolution integral is terrible when you have to add, say, 5 or 50 independent random variables. But the convolution integral is somewhat intuitive to start with and helps you understand what is going on. In addition, it has a nice application if you can connect it with the Fourier transform! – user3001408 May 15 '14 at 16:02

$\newcommand{\+}{^{\dagger}} \newcommand{\angles}[1]{\left\langle\, #1 \,\right\rangle} \newcommand{\braces}[1]{\left\lbrace\, #1 \,\right\rbrace} \newcommand{\bracks}[1]{\left\lbrack\, #1 \,\right\rbrack} \newcommand{\ceil}[1]{\,\left\lceil\, #1 \,\right\rceil\,} \newcommand{\dd}{{\rm d}} \newcommand{\down}{\downarrow} \newcommand{\ds}[1]{\displaystyle{#1}} \newcommand{\expo}[1]{\,{\rm e}^{#1}\,} \newcommand{\fermi}{\,{\rm f}} \newcommand{\floor}[1]{\,\left\lfloor #1 \right\rfloor\,} \newcommand{\half}{{1 \over 2}} \newcommand{\ic}{{\rm i}} \newcommand{\iff}{\Longleftrightarrow} \newcommand{\imp}{\Longrightarrow} \newcommand{\isdiv}{\,\left.\right\vert\,} \newcommand{\ket}[1]{\left\vert #1\right\rangle} \newcommand{\ol}[1]{\overline{#1}} \newcommand{\pars}[1]{\left(\, #1 \,\right)} \newcommand{\partiald}[3][]{\frac{\partial^{#1} #2}{\partial #3^{#1}}} \newcommand{\pp}{{\cal P}} \newcommand{\root}[2][]{\,\sqrt[#1]{\vphantom{\large A}\,#2\,}\,} \newcommand{\sech}{\,{\rm sech}} \newcommand{\sgn}{\,{\rm sgn}} \newcommand{\totald}[3][]{\frac{{\rm d}^{#1} #2}{{\rm d} #3^{#1}}} \newcommand{\ul}[1]{\underline{#1}} \newcommand{\verts}[1]{\left\vert\, #1 \,\right\vert} \newcommand{\wt}[1]{\widetilde{#1}}$ Let's $\ds{X \equiv X_{1}^{2} + X_{2}^{2}}$:

\begin{align} &\color{#00f}{\large\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} {\expo{-X_{1}^{2}/\pars{2\sigma^{2}}} \over \root{2\pi}\sigma} \,{\expo{-X_{2}^{2}/\pars{2\sigma^{2}}} \over \root{2\pi}\sigma} \delta\pars{X - X_{1}^{2} - X_{2}^{2}}\,\dd X_{1}\,\dd X_{2}} \\[3mm]&={1 \over 2\pi\sigma^{2}}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \expo{-\pars{X_{1}^{2} + X_{2}^{2}}/\pars{2\sigma^{2}}} \delta\pars{X - X_{1}^{2} - X_{2}^{2}}\,\dd X_{1}\,\dd X_{2} \\[3mm]&=\Theta\pars{X}\,{\expo{-X/\pars{2\sigma^{2}}} \over 2\pi\sigma^{2}}\times \\[3mm]&\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \!\!\!\!\!\Theta\pars{\root{X} - \verts{X_{1}}}\bracks{% {\delta\pars{X_{2} + \root{X - X_{1}^{2}}} \over 2\verts{X_{2}}} +{\delta\pars{X_{2} - \root{X - X_{1}^{2}}} \over 2\verts{X_{2}}}} \,\dd X_{1}\,\dd X_{2} \\[3mm]&=\Theta\pars{X}\,{\expo{-X/\pars{2\sigma^{2}}} \over 2\pi\sigma^{2}} \int_{-\root{X}}^{\root{X}}{\dd X_{1} \over \root{X - X_{1}^{2}}} =\Theta\pars{X}\, {\expo{-X/\pars{2\sigma^{2}}} \over 2\pi\sigma^{2}}\bracks{2\arcsin\pars{1}} \\[3mm]&=\color{#00f}{\large\Theta\pars{X}\, {\expo{-X/\pars{2\sigma^{2}}} \over 2\sigma^{2}}} \end{align}

$\ds{\Theta\pars{x}}$ is the Heaviside Step Function. $\ds{\delta\pars{x}}$ is the Dirac Delta Function.
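The collapse of the double integral above relies on $\int_{-\sqrt{X}}^{\sqrt{X}} \frac{\mathrm{d}X_1}{\sqrt{X - X_1^2}} = 2\arcsin(1) = \pi$, which can be checked numerically with a midpoint rule. A standard-library Python sketch; the value of `X` is an arbitrary choice:

```python
import math

X = 2.5      # arbitrary choice; any X > 0 gives the same value
n = 200_000
a = math.sqrt(X)
h = 2 * a / n

# midpoint rule over (-sqrt(X), sqrt(X)), avoiding the endpoint singularities
integral = sum(1 / math.sqrt(X - (-a + (k + 0.5) * h) ** 2)
               for k in range(n)) * h
print(integral, math.pi)
```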

Felix Marin
  • What is $\Theta(X)$? – Dilip Sarwate May 15 '14 at 03:32
  • @DilipSarwate I added two links at the answer end. I guess we are not allowed to put more than one link in a comment. Thanks a lot. – Felix Marin May 15 '14 at 03:56
  • Thanks for clarifying the notation. You might want to take a look at this answer of mine to a question which is a duplicate of this question. – Dilip Sarwate May 15 '14 at 04:14
  • This is like killing a kitten with a bazooka, in my opinion. – afedder May 15 '14 at 06:26
  • @afedder It's likely true but it's a nice technique in more complicated situations. It avoids having to think about limits, since those functions take care of that. For me it's quite useful since I'm pretty bad at drawing. It can save your life when the real mess shows up. I saw other solutions somewhere with inequalities and a derivative at the end. The $\Theta$ can handle the inequalities, and the derivative yields the $\delta$, so we avoid one step and "the drawing". We have a saying in Spanish for your idea (translated): "To kill a cockroach with a machine gun". Thanks a lot. – Felix Marin May 15 '14 at 06:51