I am trying to calculate the Jacobian determinant of the Fourier transform, which I stumbled upon when studying the path integral in quantum field theory. I know the answer should be $1$ but I don't know how to show it. The transform is \begin{equation} x _n = \sum _k \frac{1}{ \sqrt{ N} }e ^{ - i 2 \pi k n/N } \tilde{x} _k \end{equation} I know that the Jacobian matrix is given by \begin{equation} J _{ n,k} = \frac{ dx _n }{ d\tilde{x} _k } = \frac{1}{ \sqrt{ N}} e ^{ - i 2 \pi k n/N} \end{equation} and the determinant is then \begin{equation} \det J = \frac{1}{ N^{N/2}}\epsilon _{ k _1 k _2 \dots } e ^{ - i 2 \pi k _1 \cdot 1/N } e ^{ - i 2 \pi k _2 \cdot 2 /N} \dots \end{equation} but I'm not sure how to show this is equal to one. I found a link online which says that to calculate the determinant you should perform the Fourier transform twice, but I wasn't able to figure out the steps.
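As a quick numerical sanity check of the claim (a sketch, not part of the derivation; it builds the matrix $J_{n,k}$ for a small $N$ and evaluates $|\det J|$ directly from the Leibniz permutation formula):

```python
# Sanity check: |det J| = 1 for J_{n,k} = e^{-2*pi*i*k*n/N} / sqrt(N).
import cmath
from itertools import permutations

N = 4
J = [[cmath.exp(-2j * cmath.pi * k * n / N) / N**0.5 for k in range(N)]
     for n in range(N)]

def det(M):
    """Leibniz expansion: sum over permutations of sign * product of entries."""
    n = len(M)
    total = 0
    for perm in permutations(range(n)):
        # parity of the permutation via its inversion count
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if perm[i] > perm[j])
        sign = -1 if inversions % 2 else 1
        prod = 1
        for i in range(n):
            prod *= M[i][perm[i]]
        total += sign * prod
    return total

print(abs(det(J)))  # ≈ 1.0
```

Note that $\det J$ itself is a complex phase; it is the modulus that equals $1$.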
There might be another idea. The Fourier transform is an isometry (up to a constant factor depending on the conventions used in the definition of the FT), therefore it's a sort of orthogonal transform, and hence its determinant is $\pm 1$. – TZakrevskiy Jan 30 '14 at 15:26
1 Answer
Let $Q$ be an orthonormal matrix. By definition, orthonormal matrices are the matrices which satisfy the following conditions:
($i$) $Q^{-1}=Q^T$. From here it follows that $QQ^T=QQ^{-1}=I$ where $I$ is the identity matrix, a square matrix with $1$s at the diagonal and else $0$.
($ii$) All rows and columns of an orthonormal matrix satisfy the inner product rules $<q_i,q_j>=0$ for $i\neq j$ and $<q_i,q_i>=1$.
A matrix is called orthogonal (but not necessarily orthonormal) when $<q_i,q_j>=0$ holds. This means the angle between any two distinct vectors $q_i$ and $q_j$ from $Q$ is $90$ degrees. This follows simply from $$\cos(\theta)=\frac{<q_i,q_j>}{||q_i||\,||q_j||}$$ where $||q_i||=\sqrt{<q_i,q_i>}$ is the norm induced by the inner product in Euclidean space (the $L^2$ norm).
The absolute value of the determinant of an orthonormal matrix is always $1$. This can be proven in at least two different ways:
$1$st way: $$1=\det(I)=\det(QQ^T)=\det(Q)\det(Q^T)=(\det(Q))^2$$ The third equality uses the multiplicativity of the determinant, $\det(AB)=\det(A)\det(B)$, and the fourth uses $\det(Q^T)=\det(Q)$. Hence $(\det(Q))^2=1$, so $\det(Q)=\pm 1$.
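This chain of equalities can be illustrated numerically. The sketch below (assuming a $2\times 2$ rotation matrix as the example $Q$, which is orthonormal for any angle) checks both $\det(QQ^T)=1$ and $(\det Q)^2=1$:

```python
# A concrete instance of the 1st way: for a rotation matrix Q,
# det(Q Q^T) = det(Q)^2 = 1, so det(Q) = ±1.
import math

theta = 0.7  # arbitrary angle
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

# Q Q^T, computed entrywise
QQt = [[sum(Q[i][k] * Q[j][k] for k in range(2)) for j in range(2)]
       for i in range(2)]

print(det2(QQt))     # ≈ 1.0  (Q Q^T is the identity)
print(det2(Q) ** 2)  # ≈ 1.0
```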
$2$nd way:
($i$) For every orthonormal matrix $Q$, all singular values $\sigma_i$ of the matrix are equal to $1$.
($ii$) The absolute value of the determinant of any real matrix is given by $|\det(A)|=\prod_i \sigma_i$.
From ($i$) and ($ii$) we conclude that $|\det(Q)|=\prod_i 1\Longrightarrow |\det(Q)|=1$.
The proof of ($i$):
For every real matrix $Q$ we have the singular value decomposition given by $Q=U\Sigma V^T$ where $U$ and $V$ are orthonormal matrices. See here. One can select $U=Q$ and $V=I$. From here we get $\Sigma=I$. Since $\Sigma$ is uniquely determined for singular value decomposition, the proof is complete. Note that $U$ and $V$ are not unique.
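A minimal numerical illustration of ($i$) (again assuming a rotation matrix as the example $Q$): since $Q^T Q = I$, the eigenvalues of $Q^T Q$ are all $1$, and the singular values, being their square roots, are all $1$ as well.

```python
# Sketch of (i): for an orthonormal Q, Q^T Q = I, so every eigenvalue of
# Q^T Q is 1 and hence every singular value sigma_i = sqrt(1) = 1.
import math

theta = 1.2  # arbitrary angle
Q = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

# Q^T Q, computed entrywise
QtQ = [[sum(Q[k][i] * Q[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]

identity = [[1.0, 0.0], [0.0, 1.0]]
max_err = max(abs(QtQ[i][j] - identity[i][j])
              for i in range(2) for j in range(2))
print(max_err)  # ≈ 0: Q^T Q = I, so all singular values are 1
```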
The proof of ($ii$):
For this proof one can see either this (Proposition C.3.7) or this question and its answers.
It is easy to check that the columns of $J_{n,k}$ are orthonormal; since $J$ is complex, this is orthonormality with respect to the complex inner product, i.e. $J$ is unitary, and the arguments above carry over with $Q^T$ replaced by the conjugate transpose $Q^\dagger$. Therefore, we can conclude that $|\det J|=1$.
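As a check of this last step, the sketch below verifies numerically that the DFT matrix $J$ from the question satisfies $J^\dagger J = I$ (unitarity, the complex analogue of condition ($i$)):

```python
# Check that the DFT matrix J is unitary: J^dagger J = I.
import cmath

N = 8
J = [[cmath.exp(-2j * cmath.pi * k * n / N) / N**0.5 for k in range(N)]
     for n in range(N)]

# (J^dagger J)_{ab} = sum_n conj(J[n][a]) * J[n][b]; compare to the identity
max_err = max(
    abs(sum(J[n][a].conjugate() * J[n][b] for n in range(N))
        - (1 if a == b else 0))
    for a in range(N) for b in range(N)
)
print(max_err)  # ≈ 0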
