
I have proved that if a square $n \times n$ matrix $A$ has both a right and a left inverse, then these are equal and give the inverse of $A$.

However I'm interested in the following implication:

Suppose a left inverse $B$ of a square $n \times n$ matrix $A$ exists. Does this imply that a right inverse $C$ of $A$ exists?

Also, if this is true, does the implication also hold in the other direction, i.e. starting from a right inverse $B$?
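
As a quick numerical sanity check of the statement (a sketch assuming NumPy is available; the matrix below is an arbitrary invertible example, not part of the question):

```python
import numpy as np

# Arbitrary invertible 2x2 matrix (hypothetical example).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

B = np.linalg.inv(A)  # numerically, inv(A) satisfies both BA = I and AB = I

print(np.allclose(B @ A, np.eye(2)))  # True: B is a left inverse
print(np.allclose(A @ B, np.eye(2)))  # True: B is a right inverse as well
```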

Thanks.

  • Yes, and yes. The easiest way to see it is probably to consider the linear maps induced by the matrices. – Daniel Fischer Nov 18 '13 at 09:26
  • At the risk of repeating other answers already given, a function has a left inverse iff it's injective; a function has a right inverse iff it's surjective; and a linear transformation between finite dimensional vector spaces of the same dimension is injective iff it's surjective. – littleO Nov 18 '13 at 10:09
  • Duplicate: https://math.stackexchange.com/q/3852/472818 – mr_e_man Jan 31 '23 at 18:55

2 Answers


Yes. If $A$ has a left inverse $B$, then $BA = I$, and so $A$ is injective (as a linear map). But an injective linear operator on a finite-dimensional space is also surjective (by the rank-nullity theorem), and so it has a right inverse.
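
For illustration, a small numerical sketch of this argument (assuming NumPy; the random matrix is an arbitrary stand-in): find a $B$ with $BA = I$, and observe that $AB = I$ holds as well.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))       # generic square matrix (almost surely invertible)

# Find a left inverse B by solving BA = I: transpose to A^T B^T = I,
# solve for B^T column by column, then transpose back.
B = np.linalg.solve(A.T, np.eye(4)).T

print(np.allclose(B @ A, np.eye(4)))  # True: BA = I by construction
print(np.allclose(A @ B, np.eye(4)))  # True: AB = I follows, as the answer argues
```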

  • Hi Prahlad. I've never thought of matrices like this. If $A$ were an ordinary function, then yes, being surjective it would indeed have a right inverse. But how do I construct the right inverse of $A$ in the matrix case? For an ordinary function I would choose, for each element of the codomain, some element of the domain that maps to it. – Shuzheng Nov 18 '13 at 16:32
  • Once you know it has a right inverse, then you can check that the right inverse must equal the left inverse, and so $B$ is the right inverse! – Prahlad Vaidyanathan Nov 18 '13 at 16:34
  • But you have just proved that it is a matrix. In fact, you have proved that it is $B$! – Prahlad Vaidyanathan Nov 18 '13 at 16:43
  • So I can define $f: \mathbb{R}^{n \times n} \rightarrow \mathbb{R}^{n \times n}$ and then $BX = B(Af(X)) = (BA)f(X) = If(X) = f(X)$, where $\mathbb{R}^{n \times n}$ denotes the set of $n \times n$ matrices and $X \in \mathbb{R}^{n \times n}$. The value of the theoretical right inverse $f(X)$ is the same as the matrix product $BX$ for every $X \in \mathbb{R}^{n \times n}$. Hence the linear map $X \mapsto BX$ and $f$ are equal and can be substituted for one another. What I proved before needed some work to show that $f$ is in fact $B$; is this what I've done now? Also, are there other ways of proving this? If my proof is right, please tell me. – Shuzheng Nov 18 '13 at 17:17
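
A small numerical sketch of the identity $BX = B(Af(X)) = f(X)$ from the last comment (assuming NumPy; `f` here is a hypothetical stand-in for the right inverse, implemented by solving $AY = X$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))       # assumed to have a left inverse
B = np.linalg.inv(A)                  # stand-in for the given left inverse: BA = I

def f(X):
    """Stand-in for the hypothetical right inverse: returns f(X) with A f(X) = X."""
    return np.linalg.solve(A, X)

X = rng.standard_normal((n, n))
# The chain BX = B(A f(X)) = (BA) f(X) = f(X) says B and f agree on every X:
print(np.allclose(B @ X, f(X)))       # True
```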

Suppose that the matrix $A$ has a left inverse $B$, so that $BA=I$. The determinant of a matrix product is the product of the factors' determinants, so $\left\lvert BA \right\rvert=\left\lvert B \right\rvert \left\lvert A \right\rvert=\left\lvert I \right\rvert=1$. Thus both the left inverse and the original matrix must be non-singular (in the sense of having nonzero determinant). Since $A$ is non-singular, we can build $$D=\frac{1}{\left\lvert A \right\rvert}A^{adj}$$ where $A^{adj}$ is the adjugate of $A$, i.e. the transpose of the matrix of cofactors of $A$. It is not difficult to check that $AD=I$. So $A$ has a right inverse, given by $D$.
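
For illustration, a sketch of this construction (assuming NumPy; the `adjugate` helper and the example matrix are illustrative additions, not part of the answer):

```python
import numpy as np

def adjugate(A):
    """Adjugate of A: the transpose of its matrix of cofactors."""
    n = A.shape[0]
    C = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])       # arbitrary non-singular example, det(A) = 3
D = adjugate(A) / np.linalg.det(A)    # the right inverse D built in the answer
print(np.allclose(A @ D, np.eye(3)))  # True: AD = I
```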

Now assume $A$ has a right inverse $R$, so that $AR=I$. Again, by the product rule for determinants, both $A$ and $R$ must be non-singular. Thus we can find a right inverse $R^{-1}$ for $R$, just as we found $D$ before. Then $$A R = I\\ R A R = R I = R\\ R A R R^{-1} = R R^{-1} = I\\ R A = I$$ and so $R$ is also a left inverse.
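
A quick numerical sketch of this direction as well (assuming NumPy; the random matrix is an arbitrary stand-in): construct a right inverse $R$ and observe that it is also a left inverse.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))       # generic square matrix with a right inverse
R = np.linalg.solve(A, np.eye(4))     # right inverse: solves AR = I

print(np.allclose(A @ R, np.eye(4)))  # True: AR = I by construction
print(np.allclose(R @ A, np.eye(4)))  # True: RA = I, as the manipulation above shows
```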