
In this question...

Geometric interpretation of the cofactor expansion theorem

...Grigory explained (beautifully, in my opinion) why the cofactor expansion for calculating determinants works: he broke it up into the dot product of the vector $\vec{u}$ with the product $\vec{v} \otimes \vec{w}$ (the cross product, more commonly written $\vec{v} \times \vec{w}$).

However, I still don't understand the equation for $\vec{v} \otimes \vec{w}.$

Why should... $$\left|\begin{matrix}1&0&0\\v_1&v_2&v_3\\w_1&w_2&w_3\end{matrix}\right| \vec{e_1} + \left|\begin{matrix}0&1&0\\v_1&v_2&v_3\\w_1&w_2&w_3\end{matrix}\right| \vec{e_2} + \left|\begin{matrix}0&0&1\\v_1&v_2&v_3\\w_1&w_2&w_3\end{matrix}\right| \vec{e_3}$$ ...or alternatively, $$\left|\begin{matrix}\vec{e_1}&\vec{e_2}&\vec{e_3}\\v_1&v_2&v_3\\w_1&w_2&w_3\end{matrix}\right|$$ ...give us a vector orthogonal to $\vec{v}, \vec{w}$ but whose magnitude is equal to the area of the parallelogram they create? (Note: with full standard-basis rows, each $3\times 3$ determinant already carries its cofactor sign, so all three terms enter with a plus; the alternating signs only appear when the expansion is written with $2\times 2$ minors.)
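For concreteness, here is a small NumPy sketch (the vectors are arbitrary examples, chosen just for illustration) that builds the vector from those three determinants and checks both properties numerically:

```python
# Build v x w from the three 3x3 determinants above (assumes NumPy), then
# check orthogonality and the parallelogram-area property.
import numpy as np

v = np.array([1.0, 2.0, 3.0])   # arbitrary example vectors
w = np.array([4.0, 5.0, 6.0])

u = np.zeros(3)
for k in range(3):
    top = np.eye(3)[k]                             # e_k as the first row
    u[k] = np.linalg.det(np.vstack([top, v, w]))   # signs come from the det

print(np.dot(u, v), np.dot(u, w))       # both ~ 0 (orthogonality)
print(np.linalg.norm(u))                # area of the parallelogram
print(np.linalg.norm(np.cross(v, w)))   # same number via NumPy's cross
```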

How come we can add the vectors in such a way? What does Grigory mean by "linearity"?

Thanks!

joshuaronis

1 Answer


One way of defining the cross product is as the unique vector $a \times b$ that satisfies $\langle a \times b, x \rangle = \det \begin{bmatrix} a & b & x\end{bmatrix}$ for all $x$; such a vector exists and is unique precisely because the right-hand side is linear in $x$, which is the "linearity" the question asks about.

Then it is clear that $a \perp (a \times b)$, since $\langle a \times b, a \rangle = \det \begin{bmatrix} a & b & a\end{bmatrix} = 0$ (a repeated column), and similarly for $b$. Furthermore, $\|a \times b \|^2 = \det \begin{bmatrix} a & b & a \times b\end{bmatrix}$, or, assuming $a \times b \neq 0$, $\|a \times b \| = \det \begin{bmatrix} a & b & {a \times b \over \|a \times b \|}\end{bmatrix}$, and the latter is the area of the parallelogram created by $a,b$.
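A quick numerical sketch of both claims (assuming NumPy; the vectors are arbitrary examples):

```python
# Check (assumes NumPy): <a x b, x> = det[a b x] for a random x, and
# ||a x b||^2 = det[a b (a x b)].
import numpy as np

a = np.array([1.0, 0.0, 2.0])   # arbitrary example vectors
b = np.array([0.0, 3.0, 1.0])
c = np.cross(a, b)

x = np.random.default_rng(0).normal(size=3)       # random test vector
print(np.isclose(np.dot(c, x),
                 np.linalg.det(np.column_stack([a, b, x]))))  # True

print(np.isclose(np.dot(c, c),
                 np.linalg.det(np.column_stack([a, b, c]))))  # True
```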

Note: Observe that $\langle a \times b, x \rangle = \sum_k x_k \langle a \times b, e_k \rangle$ (write $x = \sum_k x_k e_k$ and use linearity of the inner product), so you can 'recover' the components of $a \times b$ using this formula (by choosing $x=e_1,e_2,e_3$). In particular, this gives $a \times b = \langle a \times b, e_1 \rangle e_1 + \langle a \times b, e_2 \rangle e_2 + \langle a \times b, e_3 \rangle e_3$.
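The recovery step as a short sketch, under the same assumptions:

```python
# Recover the components of a x b one at a time via (a x b)_k = det[a b e_k]
# (assumes NumPy; same arbitrary example vectors as above).
import numpy as np

a = np.array([1.0, 0.0, 2.0])
b = np.array([0.0, 3.0, 1.0])

recovered = np.array([np.linalg.det(np.column_stack([a, b, e_k]))
                      for e_k in np.eye(3)])
print(recovered)         # [-6. -1.  3.]
print(np.cross(a, b))    # matches
```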

copper.hat
  • copper.hat thank you! 3Blue1Brown actually has a video that explains that really nicely! The problem is that understanding the above involves understanding how to compute determinants using the cofactor theorem, and understanding how to compute determinants is being explained (everywhere I've seen, including the explanation I linked to) using the dot product with a cross product. I'm going in circles! – joshuaronis Jul 05 '19 at 20:06
  • @JoshuaRonis: I'm not sure it helps, but I have expanded slightly above. Note that $\det M = \det M^T$ so it is immaterial whether you use the row or column version. – copper.hat Jul 05 '19 at 20:44
  • Got you! Just to tie up what you're saying for me to refer back to in the future: Let $\vec{h}$ be the vector such that $\forall \vec{x}: (\vec{h} \bullet \vec{x} = det|\vec{a}, \vec{b}, \vec{x}|)$. Through copper.hat's and 3Blue1Brown's explanations, that means that $\left\Vert \vec{h} \right\Vert = (\vec{a}, \vec{b})$, where $(\vec{a}, \vec{b})$ denotes the area of the parallelogram created by $\vec{a}$ & $\vec{b}$. Additionally, $\vec{h}$ must be orthogonal to that parallelogram. That's the first part. Now, for the second part, note that: $(\vec{h} \bullet \hat{e_1})(\hat{e_1})$ is ... – joshuaronis Jul 06 '19 at 21:22
  • ...the $\hat{e_1}$ component of $\vec{h}$, and so on for the other two basis vectors. That means we can "recover" $\vec{h}$ by writing it as a linear combo of its three components: $\vec{h} = (\vec{h} \bullet \hat{e_1})(\hat{e_1}) + (\vec{h} \bullet \hat{e_2})(\hat{e_2}) + (\vec{h} \bullet \hat{e_3})(\hat{e_3})$. But, from our first rule about $\vec{h}$, that can be rewritten as $\vec{h} = (det|\vec{a}, \vec{b}, \hat{e_1}|)(\hat{e_1}) + (det|\vec{a}, \vec{b}, \hat{e_2}|)(\hat{e_2}) + (det|\vec{a}, \vec{b}, \hat{e_3}|)(\hat{e_3})$, and from there arises the cofactor expansion! – joshuaronis Jul 06 '19 at 21:24
  • @copperhat I just wish there was some way we could animate and "see" this...like now that I see it mathematically, I still can't visualize the three components adding up to the final vector at all...I wish someone used Manim or some other graphics library to create a really nice animation for the cofactor theorem, both for finding the cross product and for finding the 3D determinant. :( – joshuaronis Jul 06 '19 at 21:27
  • And just one more thing, the reason the cofactor expansion arises from that last thing is because of the determinants: $$det|\vec{a}, \vec{b}, \hat{e_1}| = \begin{vmatrix} a_1 & b_1 & 1 \\ a_2 & b_2 & 0 \\ a_3 & b_3 & 0 \end{vmatrix},$$ in which case obviously $a_1, b_1$ don't matter, which means we're calculating the determinants of the "smaller matrices", aka the cofactor expansion!!! (See the numerical sketch after these comments.) – joshuaronis Jul 06 '19 at 21:39
  • @copperhat to clean all this up a bit, is it okay if I add it to your answer? Oh, and also lol, I forgot to say that $\vec{h} = \vec{a} \otimes \vec{b}$, which was like the most important thing to say!!! :) – joshuaronis Jul 06 '19 at 21:40
  • @JoshuaRonis: I glossed over how you get the parallelogram area. It is straightforward, but tedious. Pick a basis as follows: choose ${a\over |a|}$, then apply one step of Gram-Schmidt to $b$, and take ${a \times b \over |a \times b|}$ as the last vector. Then it is fairly straightforward to show that the determinant is the area of the parallelogram whose edges are $a,b$. – copper.hat Jul 07 '19 at 23:02
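To tie the comment thread together, here is a small numerical sketch (assuming NumPy; the vectors are arbitrary examples) checking that each $det|\vec{a}, \vec{b}, \hat{e_k}|$ collapses to a signed $2\times 2$ minor, which is exactly the cofactor expansion the comments describe:

```python
# The observation from the comments, checked numerically (assumes NumPy):
# det[a b e_k] equals the signed 2x2 minor obtained by deleting row k,
# i.e. the k-th term of the cofactor expansion, and the assembled vector
# is the cross product itself.
import numpy as np

a = np.array([1.0, 0.0, 2.0])   # arbitrary example vectors
b = np.array([0.0, 3.0, 1.0])

h = np.zeros(3)
for k in range(3):
    full = np.linalg.det(np.column_stack([a, b, np.eye(3)[k]]))
    keep = [i for i in range(3) if i != k]            # delete row k
    minor = np.linalg.det(np.column_stack([a, b])[keep])
    assert np.isclose(full, (-1) ** k * minor)        # cofactor sign
    h[k] = full

print(h)                 # [-6. -1.  3.]
print(np.cross(a, b))    # the same vector
```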