Ok, so the chi-squared distribution with $n$ degrees of freedom is the distribution of the sum of the squares of $n$ independent standard Gaussian random variables (zero mean, unit variance).
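Just to fix notation (the symbols $Z_i$ here are my own, purely for illustration):
$$Z_1,\dots,Z_n \overset{\text{iid}}{\sim} \mathcal{N}(0,1) \quad\Longrightarrow\quad \sum_{i=1}^n Z_i^2 \sim \chi^2_n.$$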
The trouble is, my Gaussian random variables are not independent. They do, however, all have zero mean and the same variance. Their covariance matrix is therefore not diagonal, but all of its diagonal entries are equal (same variance); in fact, the covariance matrix is a symmetric Toeplitz matrix. I'm not saying the Toeplitz structure is important to the solution, if there is one, but if it's a necessary property to get anywhere, by all means use it. Is there some way to decompose the sum of squares of these Gaussian random variables into, say, a sum of chi-squared random variables and possibly Gaussian random variables? In other words, I can't just square them all, add them up, and call the result chi-squared, because a chi-squared distribution is a sum of squares of independent standard Gaussians, and mine aren't independent.
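To write the setup out formally (again, the symbols $X$, $\Sigma$, $\sigma^2$, and $Q$ are just my labels, not anything standard): collect the variables into a vector $X = (X_1,\dots,X_n)^\top \sim \mathcal{N}(0,\Sigma)$, where $\Sigma$ is a symmetric Toeplitz covariance matrix with constant diagonal $\Sigma_{ii} = \sigma^2$. The quantity I care about is
$$Q \;=\; \sum_{i=1}^n X_i^2 \;=\; X^\top X,$$
and the question is whether $Q$ admits a decomposition in terms of chi-squared (and possibly Gaussian) random variables.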
I know how to find a linear transformation of the Gaussian random variables that produces $n$ independent Gaussians, but that's no help, because the transformed variables aren't the things being squared.
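Concretely, with the notation above and taking the transformation to be whitening by the symmetric inverse square root (just one choice of such a transformation): setting $Y = \Sigma^{-1/2} X$ gives $Y \sim \mathcal{N}(0, I)$, so
$$\sum_{i=1}^n Y_i^2 \;=\; X^\top \Sigma^{-1} X \;\sim\; \chi^2_n,$$
but that is a different quantity from the $\sum_{i=1}^n X_i^2 = X^\top X$ that I actually want.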