
Sometimes formulas in linear algebra are not easy to remember. Mnemonics can be of some use in the process of remembering them.

  • Do you know some useful mnemonics for this purpose?

I'll give two examples:

  • For finding the inverse of a matrix one can use the mnemonic Detminstra, which can be unpacked as: 1. calculate the determinant, 2. for every entry find the minor with its sign, 3. transpose the obtained matrix.
  • The other example is Avvedia, which is shorthand for the eigenvector formula $AV=VD$. Knowing this formula we can easily obtain the formula $A=VDV^{-1}$ or its twin formula for the diagonal matrix, $D=V^{-1}AV$. Since $V$ and $V^{-1}$ are sometimes erroneously interchanged, Avvedia makes it easier to check the correctness of a formula (a small numerical sketch of both mnemonics follows this list).
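
A minimal numerical sketch of the two mnemonics, assuming numpy is available (the matrix below is just an arbitrary invertible example):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Detminstra, step 1 (Det): calculate the determinant.
det_A = np.linalg.det(A)

# Step 2 (min + sign): for every entry, the minor with its sign, i.e. the cofactor.
n = A.shape[0]
cof = np.zeros_like(A)
for i in range(n):
    for j in range(n):
        minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
        cof[i, j] = (-1) ** (i + j) * np.linalg.det(minor)

# Step 3 (tra): transpose the cofactor matrix and divide by the determinant.
A_inv = cof.T / det_A
assert np.allclose(A_inv, np.linalg.inv(A))

# Avvedia: AV = VD, hence A = V D V^{-1} and D = V^{-1} A V.
eigvals, V = np.linalg.eig(A)
D = np.diag(eigvals)
assert np.allclose(A @ V, V @ D)
assert np.allclose(A, V @ D @ np.linalg.inv(V))
```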

What other mnemonics could be useful in linear algebra?

Added later

  • Furorri: concerning the existence of a right inverse for a full row rank matrix (analogously, for a full column rank matrix it would be Fucorlin, for the left inverse). These two inverses are easy to interchange erroneously; a small numerical check follows below.
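
For concreteness, a minimal numpy sketch of the two one-sided inverses (the matrices below are arbitrary full-rank examples): for a full row rank matrix $A$ a right inverse is $A^T(AA^T)^{-1}$, and for a full column rank matrix $B$ a left inverse is $(B^TB)^{-1}B^T$.

```python
import numpy as np

# Furorri: full row rank -> right inverse A_R = A^T (A A^T)^{-1}, so A @ A_R = I.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0]])           # 2 x 3, full row rank
A_right = A.T @ np.linalg.inv(A @ A.T)
assert np.allclose(A @ A_right, np.eye(2))

# Fucorlin: full column rank -> left inverse B_L = (B^T B)^{-1} B^T, so B_L @ B = I.
B = A.T                                    # 3 x 2, full column rank
B_left = np.linalg.inv(B.T @ B) @ B.T
assert np.allclose(B_left @ B, np.eye(2))
```
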
Widawensen
  • If you've internalized the fact that $A \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix} = \begin{bmatrix} A v_1 & \cdots & A v_n \end{bmatrix}$, then the formula $AV = VD$ is easy to remember because it tells us that $A v_i = d_i v_i$ for all $i$. Gilbert Strang emphasizes various useful ways of looking at matrix multiplication in his linear algebra books. If Cramer's rule is forgotten, it can be derived quickly using the approach here: http://math.stackexchange.com/a/1941606/40119 I could be wrong but I'm a bit skeptical of mnemonics in math education; it seems better to focus on deriving. – littleO Mar 28 '17 at 08:46
  • @littleO However, deriving always takes longer than recalling a single word. In the case of the inverse of a matrix, for example, it is practically impossible to derive the formula quickly. I agree that mnemonics can play a more important role in rather more complicated cases.... – Widawensen Mar 28 '17 at 08:52
  • Linear algebra does not really have any complicated formula that requires a device to remember it. Actually, a problem in mathematics is considered greatly simplified if one can reduce it to a linear algebra problem. Initially I found the formulas in Gram-Schmidt orthogonalization messy, but after realising that we repeatedly subtract projections onto the previously found partial set of orthonormal basis vectors, it was easy. – P Vanchinathan Mar 28 '17 at 09:11
  • @PVanchinathan Hmm, then congratulations on having such a level of understanding that you can easily grasp all the details... maybe practice makes a master... however, sometimes we don't have as much time as we should have.. – Widawensen Mar 28 '17 at 09:24
  • @Widawensen: Trying to learn mathematics by rote remembering of formulas is the slow way to go about it -- so slow that many who attempt it end up believing they are "bad at math" because they're stuck with an unfeasible approach. If you're pressed for time you should focus your efforts on understanding what you're doing, and do practice problems with a view to developing that understanding rather than just learning procedures by rote. – hmakholm left over Monica Mar 28 '17 at 09:43
  • It may be true that rote learning is quicker if your specific goal is to reproduce a specific formula. However, that is generally not your real goal -- your actual goal is to solve a larger problem, and an understanding-based approach will at once help you decide which procedure will be useful for that particular problem and remember how the procedure goes. – hmakholm left over Monica Mar 28 '17 at 09:47
  • One example right here: your focus on memorizing the formula for inverting a matrix by adjugates/Cramer can easily blind you to the fact that this is not a very good method for finding inverse matrices in practice. It is much faster, say, to extend the matrix to the right by $I$ and do Gaussian elimination. The fact that it can be done by determinants is sometimes useful to know for theoretical purposes, but in practice it is only helpful in extremely special cases. – hmakholm left over Monica Mar 28 '17 at 09:52
  • @HenningMakholm Yes, your remarks are valuable, but generally I don't expect to replace learning with mnemonic formulas but rather to support learning with them, especially in the places where it is easy to make a mistake. Of course, for professional mathematicians such methods can seem unnecessary, but linear algebra is now used by many people with very different levels of understanding, so maybe in some cases the mnemonic technique is useful.. and finally, for different methods of inverting there could be different mnemonics.. why not? – Widawensen Mar 28 '17 at 11:20
  • I'm adding "ADiCol" for scaling Columns when a diagonal matrix is the right multiplier, and "DiArow" for scaling Rows when it is the left multiplier. – Widawensen Oct 03 '17 at 09:05
  • I'm adding Paa Tamat for the projection matrix $P= A(A^TA)^{-1}A^T$ (a numerical check of this and of a few other formulas from these comments follows below). – Widawensen Oct 29 '18 at 08:29
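
A quick numerical sanity check of several formulas mentioned in the comments above (a numpy sketch; the matrices are arbitrary random examples, so the generic full-rank assumptions hold almost surely):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))          # generic square example

# Column view of matrix multiplication: A [v1 ... vn] = [A v1 ... A vn].
V = rng.standard_normal((3, 3))
assert np.allclose(A @ V, np.column_stack([A @ V[:, j] for j in range(3)]))

# Inverting via elimination on [A | I]: solving A X = I is what that
# elimination amounts to, and it avoids the adjugate/Cramer formula.
A_inv = np.linalg.solve(A, np.eye(3))
assert np.allclose(A @ A_inv, np.eye(3))

# ADiCol / DiArow: a diagonal matrix as right multiplier scales Columns,
# as left multiplier it scales Rows.
d = np.array([2.0, 3.0, 5.0])
D = np.diag(d)
assert np.allclose(A @ D, A * d)              # columns scaled by d
assert np.allclose(D @ A, A * d[:, None])     # rows scaled by d

# Paa Tamat: projection onto the column space of B, here P = B (B^T B)^{-1} B^T.
B = rng.standard_normal((4, 2))               # tall, full column rank
P = B @ np.linalg.inv(B.T @ B) @ B.T
assert np.allclose(P @ P, P)                  # projections are idempotent
assert np.allclose(P @ B, B)                  # the column space is fixed

# Gram-Schmidt: repeatedly subtract projections onto the previously
# found orthonormal vectors, then normalise.
def gram_schmidt(cols):
    Q = []
    for a in cols.T:
        v = a - sum((q @ a) * q for q in Q)
        Q.append(v / np.linalg.norm(v))
    return np.column_stack(Q)

Q = gram_schmidt(B)
assert np.allclose(Q.T @ Q, np.eye(B.shape[1]))
assert np.allclose(Q @ Q.T, P)                # same column space as B
```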

2 Answers

6

(Too long for a comment.)

I agree with some commenters here. Before you have built up muscle memory, it is often easier, or even faster, to derive what you need than to recall mnemonics. Derivation also improves understanding. At least this is my own experience where linear algebra is concerned.

In recent years, the only formula for which I have almost needed a mnemonic is the formula for calculating the determinant of a block-$2\times2$ matrix when two adjacent sub-blocks commute. Consider $$ M=\pmatrix{A&B\\ C&D}, $$ where the four sub-blocks are square submatrices of identical sizes over some commutative ring. When some two adjacent sub-blocks of $M$ commute, we have (cf. John Silvester, Determinants of Block Matrices) $$ \det M= \begin{cases} \det(AD-BC) & \text{ if } C,D \text{ commute},\\ \det(DA-CB) & \text{ if } A,B \text{ commute},\\ \det(DA-BC) & \text{ if } B,D \text{ commute},\\ \det(AD-CB) & \text{ if } A,C \text{ commute}. \end{cases} $$ This is analogous to the formula $\det\pmatrix{a&b\\ c&d}=ad-bc$, but care must be taken here because the orders of $A,B,C,D$ in the polynomials above (i.e. $AD-BC$ etc.) depend on which sub-block commutes with which.

Kind of messy, right? But if you truly understand how they are derived, you don't need any mnemonics. First, we use Gaussian elimination to eliminate the off-diagonal block among the pair of commuting sub-blocks. E.g. in the first case above, i.e. when $C$ and $D$ commute, we have $$ \pmatrix{A&B\\ C&D}\pmatrix{D&0\\ -C&I}=\pmatrix{AD-BC&B\\ 0&D}.\tag{1} $$ Taking determinants on both sides, we get $\det(M)\det(D)=\det(AD-BC)\det(D)$. Cancelling out $\det(D)$ (which is justified in general by a density or polynomial-identity argument, even when $D$ is singular), we get the result.
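
As a quick numerical sanity check of $(1)$ and of the resulting formula (a numpy sketch; here $C$ and $D$ are made to commute by taking both as polynomials in the same random matrix):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
X = rng.standard_normal((n, n))
C = X @ X + 2 * X             # C and D are polynomials in X,
D = 3 * X + np.eye(n)         # so they commute with each other
assert np.allclose(C @ D, D @ C)

M = np.block([[A, B], [C, D]])
E = np.block([[D, np.zeros((n, n))], [-C, np.eye(n)]])

# Identity (1): M E = [[AD - BC, B], [0, D]].
assert np.allclose(M @ E, np.block([[A @ D - B @ C, B],
                                    [np.zeros((n, n)), D]]))

# Hence det(M) det(D) = det(AD - BC) det(D); after cancelling det(D):
assert np.isclose(np.linalg.det(M), np.linalg.det(A @ D - B @ C))
```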

At this point, the derivation still looks tedious. However, note that in our derivation, the second block column of $(1)$ does not really matter to our end result. So, to find the right polynomial, all we need to calculate is $$ \pmatrix{A&B\\ C&D}\pmatrix{D\\ -C}. $$ In other words, when we have a row of commuting sub-blocks, we use a block column vector to kill off the off-diagonal commuting block ($C$ in this example), and the only thing that you need to memorise is the following:

It is the off-diagonal commuting sub-block that has a negative sign in the killer block vector.

With this in mind, it is now dead easy to see what polynomial to use in each of the above four cases: $$ \begin{cases} \pmatrix{A&B\\ C&D}\pmatrix{D\\ -C}=\pmatrix{AD-BC\\ 0} & \text{ if } C,D \text{ commute},\\ \\ \pmatrix{A&B\\ C&D}\pmatrix{-B\\ A}=\pmatrix{0\\ DA-CB} & \text{ if } A,B \text{ commute},\\ \\ \pmatrix{D&-B}\pmatrix{A&B\\ C&D}=\pmatrix{DA-BC&0} & \text{ if } B,D \text{ commute},\\ \\ \pmatrix{-C&A}\pmatrix{A&B\\ C&D}=\pmatrix{0&AD-CB} & \text{ if } A,C \text{ commute}. \end{cases} $$
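
A similar numerical check of all four cases (again a numpy sketch; in each case the pair that has to commute is built from polynomials in a shared random matrix, which is just one arbitrary way of producing commuting blocks):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
I = np.eye(n)

def commuting_pair():
    # Two polynomials in the same random matrix necessarily commute.
    X = rng.standard_normal((n, n))
    return X @ X + X, 2 * X + I

def R():
    return rng.standard_normal((n, n))

# Case 1: C, D commute  ->  det M = det(AD - BC).
A, B = R(), R(); C, D = commuting_pair()
M = np.block([[A, B], [C, D]])
assert np.isclose(np.linalg.det(M), np.linalg.det(A @ D - B @ C))

# Case 2: A, B commute  ->  det M = det(DA - CB).
A, B = commuting_pair(); C, D = R(), R()
M = np.block([[A, B], [C, D]])
assert np.isclose(np.linalg.det(M), np.linalg.det(D @ A - C @ B))

# Case 3: B, D commute  ->  det M = det(DA - BC).
B, D = commuting_pair(); A, C = R(), R()
M = np.block([[A, B], [C, D]])
assert np.isclose(np.linalg.det(M), np.linalg.det(D @ A - B @ C))

# Case 4: A, C commute  ->  det M = det(AD - CB).
A, C = commuting_pair(); B, D = R(), R()
M = np.block([[A, B], [C, D]])
assert np.isclose(np.linalg.det(M), np.linalg.det(A @ D - C @ B))
```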

user1551
  • You have found a convincing example where understanding is better than some abstract mnemonic, and in this case it is even hard to find a mnemonic for the mentioned statement to memorize: .... offDiaComBloNS? – Widawensen Mar 28 '17 at 11:35
  • @Widawensen For a brief period of time, I did use the phrase "column first, row second" to memorise the ordering. If the two commuting blocks appear in a column, put them first among the multiplicands; if they lie in a row, put them after the non-commuting ones. That did help me to decide the ordering quickly, but I couldn't get rid of the worry that I had remembered the formula wrongly. In contrast, thinking in terms of Gaussian elimination is much more reassuring. – user1551 Mar 28 '17 at 11:49
0

I made a mnemotechnic (quick and dirty, attached in the image) for remembering the dimensions of the iMage space and Null space and their relation to matrix-vector multiplication by an M x N matrix; a small numerical check follows below the link. I plan on making a more beautiful LaTeX version when I find the time.

Mnemotechnic for the dimensions of the null space and image space (first post, so a link instead of an image...)
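
For reference, a minimal numpy sketch of what the mnemonic encodes (the matrix is an arbitrary $3 \times 4$ example): for an M x N matrix, the iMage space is a subspace of $\mathbb{R}^M$, the Null space is a subspace of $\mathbb{R}^N$, and their dimensions add up to N.

```python
import numpy as np

# An arbitrary example with M = 3 rows and N = 4 columns (rank 2).
A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],
              [1.0, 0.0, 1.0, 0.0]])

M_rows, N_cols = A.shape
U, s, Vt = np.linalg.svd(A)
tol = 1e-10

rank = int(np.sum(s > tol))              # dim of the iMage space, inside R^M
null_basis = Vt[rank:].T                 # orthonormal basis of the Null space, inside R^N
nullity = null_basis.shape[1]

assert np.allclose(A @ null_basis, 0)    # these really are null vectors
assert rank + nullity == N_cols          # rank-nullity: the dimensions add up to N
print(M_rows, N_cols, rank, nullity)     # 3 4 2 2
```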