4

In a linear algebra text, the following is the definition of the inverse of a matrix:

An $n\times n$ matrix $A$ is invertible when there exists an $n \times n$ matrix $B$ such that $$AB = BA = I_n$$

And likewise, in an abstract algebra textbook, the definition of the inverse of a group element is:

Given that $G$ is a group with operation $*$, for each $a \in G$, there exists an element $a^{-1}$ such that $$a*a^{-1} = a^{-1}*a = e,$$ where $e$ is the identity element in $G$. Such an element is called the inverse of $a$ in $G$.

Unfortunately, the second semester of abstract algebra didn't run due to low enrollment, so I'm independently studying topics I missed in linear algebra as preparation. Here's my question:

Is showing $AB = I_n$ by itself sufficient to conclude that $B = A^{-1}$ and $A = B^{-1}$? Or must you check both $AB = I_n$ and $BA = I_n$ to conclude that $A = B^{-1}$ and $B = A^{-1}$?

I remember that on an exam I had to prove that for a group homomorphism $\phi: G\to H$ and any $a \in G$, $\phi(a^{-1}) = [\phi(a)]^{-1}$. I proved it by asking the reader to observe that $$\phi(a)\phi(a^{-1}) = \phi(aa^{-1}) = \phi(e_G) = e_H,$$ and since $\phi(a)\phi(a^{-1}) = e_H$, this can only mean that $\phi(a^{-1}) = [\phi(a)]^{-1}$ by the definition of the inverse. I got full points for it, but it leaves me wondering: am I supposed to check both arrangements to make the strongest possible argument?

Decaf-Math
  • 4,522
  • http://math.stackexchange.com/questions/3852/if-ab-i-then-ba-i – mvw Feb 08 '16 at 04:52
  • For the second problem concerning the homomorphism, you already know that the inverse exists because you are in a group. Proving $\phi (a^{-1})=[\phi(a)]^{-1}$ is not equivalent to proving an inverse exists. – Oliver Jones Feb 08 '16 at 04:57
  • In general, if we are following the equivalent conditions for a nonsingular matrix, then since $A$ is nonsingular, we know that $A^{-1}$ exists. I suppose a better way to ask my question is: suppose we have two elements $A$, $B$ and $A$ has an inverse; does it suffice to show that just one arrangement (i.e., $AB = e$) gives the identity in order to conclude that $B$ is the inverse of $A$? From what I gather from the other answers, the left inverse and the right inverse might not be the same, which is why you must check both $AB = e$ and $BA = e$, no? – Decaf-Math Feb 08 '16 at 05:02

4 Answers

6

If you already know that $G$ is a group, then to prove that $a$ and $b$ are inverses, it is enough to check $ab = e$.

If $G$ is not a group or you don't yet know that $G$ is a group (for example, if you are trying to prove that $G$ is a group), then it is not enough to show $ab =e$. You also need to show that $ba = e$.
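
One standard example of this failure: in the monoid of all functions $\mathbb{N} \to \mathbb{N}$ under composition (identity element $\mathrm{id}$), take $f(n) = n+1$ and $g(n) = \max(n-1,\,0)$. Then $g \circ f = \mathrm{id}$, but $(f\circ g)(0) = 1$, so $f \circ g \neq \mathrm{id}$: $g$ is a left inverse of $f$ without being a right inverse.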

For matrices over a field, $AB = I$ automatically implies that $BA = I$ if $A$ and $B$ are square. Otherwise, if $A$ is $m \times n$ and $B$ is $n \times m$, where $m < n$, then $BA = I$ is impossible. This can be shown by an argument using ranks.
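
A rank-style sketch for the square case, over a field: if $A$ and $B$ are $n\times n$ with $AB = I_n$, then $B$ is injective, since
$$Bx = 0 \;\implies\; x = I_n x = (AB)x = A(Bx) = 0.$$
An injective $n\times n$ matrix over a field has rank $n$, so it has a (two-sided) inverse $B^{-1}$, and then
$$A = A(BB^{-1}) = (AB)B^{-1} = B^{-1},$$
which gives $BA = BB^{-1} = I_n$.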

David
  • 6,306
4

For square matrices over many rings, it is indeed sufficient to check that $AB=I$ to conclude $A=B^{-1}$. It is certainly true for matrices over fields like the real or complex numbers. You can read about it in the linked question If $AB = I$ then $BA = I$.

The class of rings for which this works is called the class of stably finite rings. It is a very broad class of rings, and my guess is that you are working with such a ring.

But in the broadest generality, if you are working in some wild monoid (like a matrix ring over a non-stably-finite ring), then it becomes necessary to check both $ab = e$ and $ba = e$.

rschwieb
  • 153,510
3

In any group (including groups of invertible matrices) it is sufficient to check that something is either a left inverse or a right inverse. This is because in any group, the inverse necessarily exists (by definition of something being a group) and is unique (since $ag=e$ implies $g=a^{-1}e=a^{-1}$ by multiplying on the left by $a^{-1}$, and this implies $ga=e$ by multiplying on the right by $a$).
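
Written out, using only associativity and the existence of $a^{-1}$ (which $G$ being a group guarantees): if $ag = e$, then
$$g = eg = (a^{-1}a)g = a^{-1}(ag) = a^{-1}e = a^{-1},$$
and therefore $ga = a^{-1}a = e$.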

However, in an arbitrary algebraic structure, a right inverse and a left inverse may not agree.
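
For instance, with rectangular matrices: if $A = \begin{pmatrix} 1 & 0 \end{pmatrix}$ and $B = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$, then $AB = I_1$ but $BA = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \neq I_2$, so $B$ is a right inverse of $A$ that is not a left inverse.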

Ben Sheller
  • 4,085
2

To exhibit the inverse of a (square!) matrix or of a group element, you must show that it is both a left inverse and a right inverse, since matrix multiplication and general group operations are not commutative.

For homomorphisms sending inverses to inverses, you have more information, which you employed.