
Let $A\in \mathbb R^{n\times n}$ have a right inverse $B\in \mathbb R^{n\times n}$, i.e. $AB=E_{n\times n}$. Is it possible to show, using only the definition of the matrix product, that $BA=E_{n\times n}$?

Please do not tell me anything about the rank or endomorphisms of finite-dimensional vector spaces, or even the Gauss algorithm. Only the definition $[AB]_{j,\ell} = \sum\limits_{k=1}^n a_{j,k} b_{k,\ell}$ (which, by assumption, is $1$ for $j=\ell$ and $0$ otherwise) is allowed.
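As a quick numerical illustration (not a proof, and not part of the question's allowed toolkit), one can implement this definition directly and observe the phenomenon for a concrete pair of matrices; the helper `matmul` and the sample matrices below are chosen purely for demonstration:

```python
# Implement the matrix product straight from the definition
# [AB]_{j,l} = sum_k a_{j,k} * b_{k,l}  (square matrices only).
def matmul(A, B):
    n = len(A)
    return [[sum(A[j][k] * B[k][l] for k in range(n)) for l in range(n)]
            for j in range(n)]

# Example pair (assumed for illustration): B is a right inverse of A.
A = [[1, 1], [0, 1]]
B = [[1, -1], [0, 1]]
I = [[1, 0], [0, 1]]

print(matmul(A, B) == I)  # True: AB = I by construction
print(matmul(B, A) == I)  # True: BA = I as well, as the question predicts
```

Of course, checking one example says nothing about why this holds for every square $A$ and $B$, which is exactly what the question asks for.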

Jochen
  • I assume basic algebra isn't allowed either? As in: let $B$ be a right inverse of $A$ and $C$ a left inverse. Then $$B = IB = (CA)B = C(AB) = CI = C$$ Thus $B$ is also a left inverse of $A$. –  Nov 22 '16 at 15:34
  • @Bye_World That isn't sufficient for a proof; you would need to show that $A$ has a left inverse first. – Ben Grossmann Nov 22 '16 at 15:35
  • While it doesn't answer the question directly, here is a post where proofs of the desired conclusion are given (using more than just matrix multiplication, though). – Ben Grossmann Nov 22 '16 at 15:36
  • Note: $E_{n \times n}$ refers to the identity matrix. – Ben Grossmann Nov 22 '16 at 15:37
  • @Omnomnomnom I could prove that too if need be. But I wanted to ask OP if he'd consider that type of argument a satisfactory proof or not. –  Nov 22 '16 at 15:38
  • @Bye_World Okay, associativity of the matrix product is allowed. But from where do you get a left inverse if it is not $B$? – Jochen Nov 22 '16 at 16:10
  • The question @Omnomnomnom is referring to has very informative answers but as far as I see none of them only uses matrix multiplication. – Jochen Nov 22 '16 at 16:22
  • @Jochen if we could somehow conclude that for all column vectors $x$: $Ax = 0 \implies x =0$, then from there we could conclude that $A$ has a left inverse. – Ben Grossmann Nov 22 '16 at 16:30
  • Please see my answer here. I give a totally elementary proof (which was downvoted for some cryptic reason), which, however, relies on row reduced echelon form. – Artem Nov 22 '16 at 16:52
  • @Artem Your proof (as I interpret it) relies on Gauss' algorithm which is "not allowed". – Jochen Nov 23 '16 at 08:11
  • @Jochen The fact is that any result in basic linear algebra relies on the Gauss algorithm, sometimes explicitly, sometimes in some disguise (e.g., counting dimensions is nothing other than counting pivots). On the other hand, this algorithm is just multiplication by invertible matrices, which is what you are allowed to do. – Artem Nov 23 '16 at 12:57

0 Answers