
I am a high school student, and my formal Linear Algebra education consisted merely of the definition of a matrix as a list of numbers, followed by some assorted properties. While the BetterExplained article on Linear Algebra and some of 3Blue1Brown's videos on the topic did help, I still often face difficulties while solving problems, and when I look up the solutions, all I can think is, "Wait. You're allowed to do that to matrices too?"

So my question is: what can we do to these lists of numbers that we can also do to individual real and complex numbers?

To clarify, I do know that we can add matrices by adding their corresponding elements, multiply matrices row by column, and so on. My doubts are along the lines of the ones listed below (a numerical sanity check follows the list):

  1. Does $AB = CD$ imply that $B^{-1}A^{-1}=D^{-1}C^{-1}$? (where $A$, $B$, $C$ and $D$ are four non-singular matrices of appropriate orders)
  2. Is $A \times A^n = A^n \times A$ valid? (where $A$ is a square matrix and $n$ is a natural number)
  3. Is multiplication of matrices commutative only when a matrix is being multiplied by a null matrix or a unit matrix of the appropriate order?
  4. Is the inverse of the matrix $A^n$ the same as the inverse of $A$ multiplied by itself $n$ times, i.e. is $(A^n)^{-1} = (A^{-1})^n$?
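
For concreteness, here is a minimal numerical sanity check of these four doubts (a sketch, not a proof), assuming Python with NumPy; the matrix size, random seed, and variable names are arbitrary choices made for illustration.

```python
# A numerical sanity check (not a proof!) of the four doubts above,
# using random matrices, which are invertible with probability 1.
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

inv = np.linalg.inv
power = np.linalg.matrix_power

# Doubt 1: if AB = CD, taking inverses of both sides gives
# (AB)^-1 = (CD)^-1, and (AB)^-1 = B^-1 A^-1 ("shoes and socks"),
# so B^-1 A^-1 = D^-1 C^-1 does follow.
print(np.allclose(inv(A @ B), inv(B) @ inv(A)))         # True

# Doubt 2: A commutes with its own powers.
print(np.allclose(A @ power(A, 3), power(A, 3) @ A))    # True

# Doubt 3: generic matrices do NOT commute...
print(np.allclose(A @ B, B @ A))                        # False (almost surely)
# ...but pairs other than null/unit matrices can commute,
# e.g. any two diagonal matrices of the same order.
D1 = np.diag([1.0, 2.0, 3.0, 4.0])
D2 = np.diag([5.0, 6.0, 7.0, 8.0])
print(np.allclose(D1 @ D2, D2 @ D1))                    # True

# Doubt 4: (A^n)^-1 = (A^-1)^n.
print(np.allclose(inv(power(A, 3)), power(inv(A), 3)))  # True
```

Of course, `np.allclose` only compares floating-point results, so this is evidence rather than proof.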

I am not asking for proofs of the statements listed above. Instead, I would greatly appreciate it if someone who has noticed a pattern in the doubts listed above could point me to a resource that I can study to clear up all such gaps in my conceptual understanding. Alternatively, how should I approach matrix multiplication and matrix inverses in general so as to resolve these and other similar problems that I may be having?

Thank you.


Edit: These issues have been solved adequately and then some by Dave L. Renfro's comments on this question.

sonicsid
  • @DietrichBurde do you have such a text in mind? I have indeed read my high school math textbook and another book focused on problem solving but these did not prove to be enough. Thanks. – sonicsid Sep 09 '21 at 10:42
  • Concerning your second question, can't you see that both $A^n\cdot A$ and $A\cdot A^n$ are equal to $$\overbrace{A\cdot A\cdots A}^{n+1\text{ times}}?$$ – José Carlos Santos Sep 09 '21 at 10:44
  • Yes, I have many texts in mind. This site has a huge amount of book recommendations for linear algebra; see for example here. Also useful is the term group, in your case $GL_n(K)$. In a group we always have $a^n\cdot a=a\cdot a^n$ and $(a^n)^{-1}=(a^{-1})^n$, and so on. – Dietrich Burde Sep 09 '21 at 10:44
  • For 2 and 4, note that matrix multiplication is associative. So, for example, $(A^{-1}A^{-1})(AA)=A^{-1}(A^{-1}A)A=A^{-1}(I)A=A^{-1}(IA)=A^{-1}A=I$, which shows that $(A^{-1})^2$ is the inverse of $A^2$. – Joe Sep 09 '21 at 10:46
  • @JoséCarlosSantos Indeed. I had doubts about whether pre-multiplying and post-multiplying would yield the same result or not. – sonicsid Sep 09 '21 at 10:46
  • @DietrichBurde thank you, I will definitely look up more about groups. Having foundations as weak as the ones provided by my formal education is indeed frustrating. Could you suggest a specific text on linear algebra that would be helpful at my novice level? – sonicsid Sep 09 '21 at 10:48
  • Yes, see the link above. But in principle it is the same situation as learning English: which grammar or vocabulary book should I pick? It doesn't matter too much; you just have to start doing things yourself. Also work through more examples with matrices yourself. – Dietrich Burde Sep 09 '21 at 10:52
  • @DietrichBurde I understand. Thanks a lot! – sonicsid Sep 09 '21 at 10:54
  • Keith Matthews has a very nice online linear algebra text, and maybe the chapter on matrices will help (e.g. see p. 31). Off-hand, I can think of two subtle issues that sometimes get overlooked (by accident) or ignored (on purpose, for pedagogical reasons) in treatments for beginners. (1) How do we know we can multiply both sides of an equation on the left (or on the right), take inverses of both sides of an equation, and do similar things? (continued) – Dave L. Renfro Sep 09 '21 at 12:14
  • More generally, what exactly are the possibilities for what we can do to both sides of an equation? For this, see In a group $G$ with operation $\star$, can I apply $\star$ to both sides of an equation?, including my comment to amWhy's answer. (2) The usual associativity rule for a binary operation, namely $a(bc) = (ab)c$ (here the operation is denoted by juxtaposition of letters, which represent arbitrarily chosen elements on which the operation applies) allows unlimited higher-order associativity. (continued) – Dave L. Renfro Sep 09 '21 at 12:28
  • For example, in the case of an ordered sequence of $4$ elements there are $5$ different ways -- $[(ab)c]d$ and $(ab)(cd)$ and $[a(bc)]d$ and $a[(bc)d]$ and $a[b(cd)]$ -- to evaluate their "product" (instead of just $2$ ways in the case of $3$ elements), and all $5$ ways can be proved to result in the same element by using only the assumption that the $2$ ways for $3$ elements give the same element. For more about these kinds of issues, see this 12 September 2006 sci.math post. – Dave L. Renfro Sep 09 '21 at 12:34
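    Concretely, here is a small NumPy sketch (the random $3\times 3$ matrices are an arbitrary choice for illustration) verifying that all five parenthesisations of a four-matrix product agree:

    ```python
    # All 5 parenthesizations of a 4-element product give the same matrix.
    import numpy as np

    rng = np.random.default_rng(1)
    a, b, c, d = (rng.standard_normal((3, 3)) for _ in range(4))

    products = [
        ((a @ b) @ c) @ d,   # [(ab)c]d
        (a @ b) @ (c @ d),   # (ab)(cd)
        (a @ (b @ c)) @ d,   # [a(bc)]d
        a @ ((b @ c) @ d),   # a[(bc)d]
        a @ (b @ (c @ d)),   # a[b(cd)]
    ]
    print(all(np.allclose(products[0], p) for p in products[1:]))   # True
    ```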
  • @DaveL.Renfro You identified those subtle issues extremely, extremely, and I cannot emphasise this enough, extremely well. Thank you. The post you linked in your second comment fixed Issue 1 for me, and your third comment cleared up many of my doubts. I'll definitely be reading that chapter on matrices and that sci.math post soon to fix other gaps in my knowledge that I cannot quite put my finger on at the moment. Thank you once again! – sonicsid Sep 09 '21 at 14:31
  • Thanks! For practice you might want to try to prove that two (or more) of these five expressions are equal by only making use of ordinary "$x(yz) = (xy)z$" associativity. (I used different letters to avoid confusion between what you can assume and what you're to prove.) – Dave L. Renfro Sep 09 '21 at 18:16
  • @sonicsid With matrix multiplication, it may help to remember how it came about in the first place: $m \times n$ matrices of real numbers represent linear transformations from $\mathbb{R}^n$ to $\mathbb{R}^m$ and the product of matrices is defined precisely so that the product of matrix representations of two linear maps is the matrix representation of the composition of the linear maps. In symbols, if $S, T$ are linear transformations, and $M$ gives their matrix representations, then $M(S)M(T) = M(S \circ T)$. – Mason Sep 09 '21 at 20:00
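    As a hedged illustration of this correspondence, assuming NumPy (the shapes and random maps below are made up for the example): composing the two maps as functions gives the same result as multiplying their matrices.

    ```python
    # Composition of linear maps corresponds to matrix multiplication:
    # M(S)M(T) = M(S o T), checked here on a random input vector.
    import numpy as np

    rng = np.random.default_rng(2)
    MS = rng.standard_normal((2, 3))   # M(S) for some S: R^3 -> R^2
    MT = rng.standard_normal((3, 4))   # M(T) for some T: R^4 -> R^3

    def S(x):
        # the linear map S applied as a function
        return MS @ x

    def T(x):
        # the linear map T applied as a function
        return MT @ x

    x = rng.standard_normal(4)
    print(np.allclose(S(T(x)), (MS @ MT) @ x))   # True: (S o T)(x) = M(S)M(T) x
    ```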
  • (next day) In case anyone is interested, here's one way to carry out my last comment. Letting $x=ab$ and $y=c$ and $z=d,$ then $x(yz)=(xy)z$ gives $(ab)(cd)=[(ab)c]d.$ Letting $x=a$ and $y=b$ and $z=cd,$ then $x(yz)=(xy)z$ gives $a[b(cd)]=(ab)(cd).$ This shows 3 of the 5 expressions are equal. For the other two expressions, use $(ab)c=a(bc)$ to get $[(ab)c]d=[a(bc)]d,$ and use $b(cd)=(bc)d$ to get $a[b(cd)]=a[(bc)d].$ – Dave L. Renfro Sep 10 '21 at 10:19