
I just learned about the Kronecker delta and the Levi-Civita symbol $\epsilon_{ijk}$ for the first time, and I still can't wrap my mind around how to prove that $$\epsilon_{ijk} = \det\begin{pmatrix} \delta_{i1} & \delta_{i2} & \delta_{i3} \\ \delta_{j1} & \delta_{j2} & \delta_{j3} \\ \delta_{k1} & \delta_{k2} & \delta_{k3} \end{pmatrix}$$ I understand that for $(i,j,k)=(1,2,3)$ the RHS is $$\det\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}=1$$ but how is that related to the fact that the $\epsilon$ tensor yields $1$ for even permutations of $i,j,k$, $-1$ for odd ones, and $0$ when an index repeats? And how can it be proven?

The only "proof" I could think of is expanding the determinant: $$ \delta_{i1}\delta_{j2}\delta_{k3} + \delta_{i2}\delta_{j3}\delta_{k1} + \delta_{i3}\delta_{j1}\delta_{k2} - \delta_{i3}\delta_{j2}\delta_{k1} - \delta_{i2}\delta_{j1}\delta_{k3} - \delta_{i1}\delta_{j3}\delta_{k2} \\ = \begin{cases} 1 & (ijk) \in \{(123),(312),(231)\} \\ -1 & (ijk) \in \{(213),(321),(132)\} \\ 0 & \text{otherwise} \end{cases} \\ = \epsilon_{ijk} $$ but that doesn't look sound to me.
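As a sanity check (not a proof), the six-term expansion can be compared against the permutation rule numerically over all 27 index triples. A minimal Python sketch; the helper names `delta`, `levi_civita`, and `cofactor_expansion` are mine, and `levi_civita` uses the closed form $(j-i)(k-i)(k-j)/2$, which is valid only for indices in $\{1,2,3\}$:

```python
from itertools import product

def delta(a, b):
    """Kronecker delta."""
    return 1 if a == b else 0

def levi_civita(i, j, k):
    """Levi-Civita symbol via the closed form (j-i)(k-i)(k-j)/2,
    valid for i, j, k in {1, 2, 3}."""
    return (j - i) * (k - i) * (k - j) // 2

def cofactor_expansion(i, j, k):
    """The six-term expansion of det of the 3x3 matrix of deltas."""
    return (delta(i, 1) * delta(j, 2) * delta(k, 3)
          + delta(i, 2) * delta(j, 3) * delta(k, 1)
          + delta(i, 3) * delta(j, 1) * delta(k, 2)
          - delta(i, 3) * delta(j, 2) * delta(k, 1)
          - delta(i, 2) * delta(j, 1) * delta(k, 3)
          - delta(i, 1) * delta(j, 3) * delta(k, 2))

# the expansion agrees with the symbol on every index combination
assert all(cofactor_expansion(i, j, k) == levi_civita(i, j, k)
           for i, j, k in product((1, 2, 3), repeat=3))
```

Of course this only confirms the 27 cases; it doesn't explain *why* the identity holds.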

I am aware of this question, but it's kind of like two steps ahead when I'd like to be able to explain just the first.

Can somebody explain with an Einstein mindset?

cirko
  • I don't know if I get your question correctly, but it seems to me that the order of $i,j,k$ in $\epsilon_{ijk}$ determines the row order of the associated matrix. If you, for instance, choose $\epsilon_{jik}$, then you should interchange the first and second rows of the matrix, which will change the sign of the associated determinant. Did I address your question? – IamWill May 12 '20 at 01:19
  • I kinda get what you're getting at and sense it might be the "proof" I've been looking for, if only it could somehow be written in equations/matrix form similar to my example above – cirko May 12 '20 at 19:50
  • I will elaborate an answer! – IamWill May 12 '20 at 20:07

3 Answers


A property of the determinant is:

  • Exchanging two rows while leaving everything else unchanged changes the sign of the determinant, but not the magnitude.

This can be proven from the formula $\det(AB) = (\det A)(\det B)$. The operation of exchanging rows in a matrix $M$ is the same as taking $EM$, where $E$ is the same row exchange on the identity matrix $I$. One can easily show all such matrices $E$ have determinant $-1$.

Now note that if two rows of a matrix $M$ are identical, then exchanging them makes no change to the matrix. Therefore $\det M = -\det M$, from which it follows that $\det M = 0$:

  • If a matrix has two identical rows, then its determinant is $0$.

Now define $$d_{ijk} = \det \begin{pmatrix} \delta_{i1} & \delta_{i2} & \delta_{i3} \\ \delta_{j1} & \delta_{j2} & \delta_{j3} \\ \delta_{k1} & \delta_{k2} & \delta_{k3} \end{pmatrix}$$

The matrix for $d_{123}$ is the identity matrix, so $d_{123} = 1$. If $i = j$, $j = k$, or $i = k$, the matrix has two identical rows, so $d_{ijk} = 0$ in this case. And by the row-exchange rule for determinants, exchanging the values of any two of $i, j, k$ changes the sign of $d_{ijk}$.

These three facts completely determine all values of $d_{ijk}$. But $\epsilon_{ijk}$ obeys exactly the same set of rules. So the two of them must be equal.
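If it helps to see the three facts concretely, they can be verified by brute force over all index triples. A small Python sketch; the helper `d` is just one way to compute the $3\times 3$ determinant explicitly:

```python
from itertools import product

def d(i, j, k):
    """det of the 3x3 matrix whose rows are delta_{i.}, delta_{j.}, delta_{k.}."""
    m = [[1 if a == b else 0 for b in (1, 2, 3)] for a in (i, j, k)]
    # explicit cofactor expansion along the first row
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# fact 1: the base case is the identity matrix
assert d(1, 2, 3) == 1

for i, j, k in product((1, 2, 3), repeat=3):
    # fact 2: a repeated index means a repeated row, hence determinant 0
    if len({i, j, k}) < 3:
        assert d(i, j, k) == 0
    # fact 3: swapping any two indices flips the sign
    assert d(j, i, k) == -d(i, j, k)
    assert d(i, k, j) == -d(i, j, k)
    assert d(k, j, i) == -d(i, j, k)
```

Since $\epsilon_{ijk}$ satisfies exactly these three properties, the check illustrates (though of course does not replace) the argument above.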

Paul Sinclair
  • OK, I understand the first property, and the second one, too (det=0). I also got how you set up the matrix for $d_{ijk}$, and its properties. But I do not see how $\epsilon_{ijk}$ obeys the same set of rules. How would I demonstrate this for $d_{ijk}$ or $d_{123}$? – cirko May 15 '20 at 11:09
  • What is your definition of $\epsilon_{ijk}$ then? Because I usually see it defined by $\epsilon_{123} = 1$ and being anti-symmetric (which, just as with the matrix argument above, also requires it to be $0$ for repeated indices). – Paul Sinclair May 15 '20 at 12:23
  • I think I got my wires crossed here, I'm only used to calculating with specific vectors and matrices, but not to abstractly thinking about their indices and how and why to swap them. I somehow thought that it might be possible for me to show by simple equations that $\epsilon$ and the kronecker delta determinant lead to the same result, not just by comparing the rules they use, as I can "merely" speak them out rather than calculating – cirko May 15 '20 at 12:54
  • There are various ways to prove it. I could have developed equations, but I think this method shows better why they are equal - because $d_{ijk}$ obeys the defining conditions of $\epsilon_{ijk}$. – Paul Sinclair May 16 '20 at 01:48

As we know, if $A = (a_{ij})$ is an $n\times n$ matrix, its determinant is given by: $$\det A = \sum_{\sigma \in S_{n}}\epsilon_{\sigma}\prod_{m=1}^{n}a_{m,\sigma(m)}$$ where $S_{n}$ is the set of all permutations $\sigma$ of the set $I_{n}:=\{1,...,n\}$ and $\epsilon_{\sigma}$ is the sign of the permutation $\sigma$. In your case, we have $n=3$.

Let us consider the matrix: $$\Delta_{ijk} := \begin{pmatrix} \delta_{i1} & \delta_{i2} & \delta_{i3} \\ \delta_{j1} & \delta_{j2} & \delta_{j3} \\ \delta_{k1} & \delta_{k2} & \delta_{k3} \end{pmatrix} $$ For simplicity, let us consider only the case where $i,j,k$ are all different: if two of them coincide, at least two rows of $\Delta_{ijk}$ are equal, so $\det \Delta_{ijk} = 0$, and $\epsilon_{ijk} = 0$ as well.

For the base case $(i,j,k) = (1,2,3)$, the determinant formula gives: $$\det \Delta_{123} = \sum_{\sigma \in S_{3}}\epsilon_{\sigma}\prod_{m=1}^{3}\delta_{m,\sigma(m)} = 1$$ since only the identity permutation contributes a nonzero product.

Now, let $\eta$ be a fixed permutation in $S_{3}$. Every permutation is a bijection from $I_{3}$ to itself, so every permutation has an inverse $\eta^{-1}$. Also, it is easy to prove that $\epsilon_{\eta^{-1}} = \epsilon_{\eta}$. Applying the determinant formula to $\Delta_{\eta(1)\eta(2)\eta(3)}$, note that: $$\epsilon_{\eta(1),\eta(2),\eta(3)} = \sum_{\sigma \in S_{3}}\epsilon_{\sigma}\prod_{m=1}^{3}\delta_{\eta(m),\sigma(m)} $$ If we set $\eta(m) = l$, then $m = \eta^{-1}(l)$ and: $$\sum_{\sigma \in S_{3}}\epsilon_{\sigma}\prod_{m=1}^{3}\delta_{\eta(m),\sigma(m)} = \sum_{\sigma \in S_{3}}\epsilon_{\sigma}\prod_{l=1}^{3}\delta_{l,(\sigma\circ \eta^{-1})(l)} $$ As $\sigma$ ranges over every permutation in $S_{3}$, the composite $\sigma \circ \eta^{-1}$ also ranges over every permutation of $S_{3}$, so we can substitute $\tilde{\sigma} = \sigma\circ\eta^{-1}$: $$\sum_{\sigma \in S_{3}}\epsilon_{\sigma}\prod_{l=1}^{3}\delta_{l,(\sigma\circ \eta^{-1})(l)} = \sum_{\tilde{\sigma} \in S_{3}}\epsilon_{\eta}\epsilon_{\tilde{\sigma}}\prod_{l=1}^{3}\delta_{l,\tilde{\sigma}(l)} =\epsilon_{\eta}\det\Delta_{123}$$ where I've used that $\epsilon_{\tilde{\sigma}} = \epsilon_{\sigma\circ\eta^{-1}} = \epsilon_{\sigma}\epsilon_{\eta^{-1}} = \epsilon_{\sigma}\epsilon_{\eta}$ and $\epsilon_{\eta}^{2} = 1$, so that: $$\epsilon_{\sigma} = \epsilon_{\eta}\epsilon_{\tilde{\sigma}}.$$ The conclusion is that: $$\epsilon_{\eta(1),\eta(2),\eta(3)}= 
\epsilon_{\eta}\overbrace{\det\Delta_{123}}^{=1} = \epsilon_{\eta}$$ Thus, $\epsilon_{\eta(1),\eta(2),\eta(3)}$ coincides with the sign of the permutation $\eta$, i.e. it is $1$ if $\eta$ is an even permutation and $-1$ otherwise.
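If it helps, the two key facts used above — $\epsilon_{\eta^{-1}} = \epsilon_{\eta}$ and the collapse of the permutation sum to $\epsilon_{\eta}$ — can be checked by brute force over $S_{3}$. A rough Python sketch; the helpers `sign` and `inverse` are my own, with `sign` counting inversions:

```python
from itertools import permutations

def sign(p):
    """Sign of a permutation p of (1, ..., n), computed by counting inversions."""
    s = 1
    for a in range(len(p)):
        for b in range(a + 1, len(p)):
            if p[a] > p[b]:
                s = -s
    return s

def inverse(p):
    """Inverse permutation: if p sends m+1 to p[m], the inverse sends p[m] to m+1."""
    q = [0] * len(p)
    for m, v in enumerate(p):
        q[v - 1] = m + 1
    return tuple(q)

for eta in permutations((1, 2, 3)):
    # fact 1: the sign of the inverse equals the sign
    assert sign(inverse(eta)) == sign(eta)
    # fact 2: the permutation sum for det Delta_{eta(1) eta(2) eta(3)}
    # collapses to sign(eta), since the delta product vanishes unless sigma = eta
    total = 0
    for sigma in permutations((1, 2, 3)):
        prod = 1
        for m in range(3):
            prod *= 1 if eta[m] == sigma[m] else 0
        total += sign(sigma) * prod
    assert total == sign(eta)
```

The inner loop is exactly the sum $\sum_{\sigma}\epsilon_{\sigma}\prod_{m}\delta_{\eta(m),\sigma(m)}$: only the term with $\sigma = \eta$ survives.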

IamWill
  • I appreciate your effort, but this did not address the last sentence of my question - can you simplify your explanations so much that a person with little background knowledge can understand it, too? In other words, your proof is probably comprehensible to someone who already has grasped the topic, but is there an "easy language" version for a novice, too? – cirko May 15 '20 at 11:04
  • @cirko I get your point, but I think we are stuck on two different matters. You defined the Levi-Civita symbol $\epsilon_{ijk}$ and I'm assuming you understand how it works. It seems that your confusion is how the $\epsilon$ tensor changes sign according to the choices of $i,j,k$ in $\epsilon_{ijk}$ or, putting it another way, how the determinant of the associated matrix changes sign as the indices are permuted. Let me continue below. – IamWill May 15 '20 at 15:52
  • The quick justification is the one I mentioned in my first comment on your post: an exchange of two rows in a matrix leads to a change of sign in the associated determinant. This is a known result of linear algebra and it is basically the approach in PaulSinclair's answer. I think the only easier approach would be to explicitly evaluate the determinants of the associated matrices for all permutations of $i,j,k$ in $\epsilon_{ijk}$. This can be done, but you have to do a lot of tedious calculations. – IamWill May 15 '20 at 15:58
  • You are correct when you say that my approach is not the most elementary one. What I tried to do is to justify the property of determinants I mentioned above in a rather direct way, to convince you of the result and save you some time. Do you get my point? – IamWill May 15 '20 at 16:00
  • To summarize: I can try to edit my post and try to turn it into a little more basic explanation, but I don't know if my answer would differ much from PaulSinclair's answer. – IamWill May 15 '20 at 16:02

Both sides are $1$ if $i=1,\,j=2,\,k=3$, for the reason you gave. Both sides are also multiplied by $-1$ if any two indices are exchanged. (For the determinant in particular, exchanging two indices swaps two rows of the matrix.) In the special case where two indices are equal, both expressions must be $0$, since a quantity unchanged by a swap that also flips its sign must vanish; in the case where all three indices are distinct, the sign change on each side preserves equality of nonzero values. Since $\epsilon_{ijk}$ is completely specified by full antisymmetry together with $\epsilon_{123}=1$, we're done.

J.G.