In a linear algebra text, the following is the definition of the inverse of a matrix:
An $n\times n$ matrix $A$ is invertible when there exists an $n \times n$ matrix $B$ such that $$AB = BA = I_n.$$ In that case $B$ is called the inverse of $A$ and written $A^{-1}$.
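For concreteness, here is a small $2\times 2$ example I worked out myself (it is not from the text): $$\begin{pmatrix}1 & 1\\ 0 & 1\end{pmatrix}\begin{pmatrix}1 & -1\\ 0 & 1\end{pmatrix} = \begin{pmatrix}1 & -1\\ 0 & 1\end{pmatrix}\begin{pmatrix}1 & 1\\ 0 & 1\end{pmatrix} = I_2,$$ so each of these two matrices is the inverse of the other.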
And likewise, in an abstract algebra textbook, the definition of the inverse of an element of a group is:
Given a group $G$ with operation $*$, for each $a \in G$ there exists an element $a^{-1}$ such that $$a*a^{-1} = a^{-1}*a = e,$$ where $e$ is the identity element of $G$. Such an element is called the inverse of $a$ in $G$.
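For instance (again my own example, not the textbook's), in the group $(\mathbb{Z}, +)$ the identity is $e = 0$ and the inverse of any $a$ is $-a$, since $$a + (-a) = (-a) + a = 0.$$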
Unfortunately, the second semester of abstract algebra didn't end up running due to low enrollment, so I'm doing independent self-study of topics I missed in linear algebra (as preparation). Here's my question:
Is showing $AB = I_n$ alone sufficient to conclude that $B = A^{-1}$ and $A = B^{-1}$? Or must you check both $AB = I_n$ and $BA = I_n$ to conclude that $A = B^{-1}$ and $B = A^{-1}$?
I remember that on an exam I had to prove that for a group homomorphism $\phi: G\to H$ and any $a \in G$, we have $\phi(a^{-1}) = [\phi(a)]^{-1}$. I proved it by observing that $$\phi(a)\,\phi(a^{-1}) = \phi(aa^{-1}) = \phi(e_G) = e_H,$$ and since $\phi(a)\,\phi(a^{-1}) = e_H$, I concluded that $[\phi(a)]^{-1} = \phi(a^{-1})$ by the definition of the inverse. I got full points for it, but it leaves me wondering: am I supposed to check both arrangements to make the strongest possible argument?
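To be concrete, by "both arrangements" I mean also verifying the product in the opposite order, i.e. the analogous computation $$\phi(a^{-1})\,\phi(a) = \phi(a^{-1}a) = \phi(e_G) = e_H,$$ which I did not include in my exam proof.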