
I tried to prove that if $A$ and $B$ are both $n\times n$ matrices and $AB = I_n$, then $BA = I_n$ (i.e. the matrix $A$ is invertible). First I managed to show that if there exist both $B$ and $C$ such that $AB = I_n$ and $CA = I_n$, then trivially $B=C$. However, to conclude the proof I need to show that if such a right inverse exists, then a left inverse must exist too.
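For completeness, the uniqueness step is just the standard one-line chain:

$$C = CI_n = C(AB) = (CA)B = I_nB = B.$$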

I have no idea how to proceed. All I am allowed to use are the definitions of matrices, matrix multiplication, sums, transposes, and rank.

(I saw proofs of this in other questions, but they used tools like determinants or vector spaces; I need a proof without those.)

Ruben a

2 Answers


A matrix $A\in M_n(\mathbb{F})$ has a right inverse $B$ (which means $AB=I$) if and only if it has rank $n$; I assume you know that. So now you need to prove that $BA=I$. Well, let's multiply the equation $AB=I$ by $A$ from the right side. We get $A(BA)=A$ and hence $A(BA-I)=0$. Now we can split the matrix $BA-I$ into columns; call them $v_1,v_2,\dots,v_n$, so that $Av_1=0,\ Av_2=0,\ \dots,\ Av_n=0$. But because the rank of $A$ is $n$, the system $Ax=0$ has only the trivial solution. Hence $v_1=v_2=\dots=v_n=0$, so $BA-I$ is the zero matrix and therefore $BA=I$.
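Not a substitute for the proof, but the argument is easy to sanity-check numerically. Here is a minimal sketch assuming Python with numpy (the matrix size `n` and the seed are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# A random Gaussian matrix has rank n with probability 1,
# so a right inverse B with A @ B = I exists.
A = rng.standard_normal((n, n))

# Solve A @ B = I for B column by column: B is a right inverse.
B = np.linalg.solve(A, np.eye(n))

assert np.allclose(A @ B, np.eye(n))  # AB = I by construction
assert np.allclose(B @ A, np.eye(n))  # the claim: BA = I as well

# Key step of the argument: A(BA - I) is the zero matrix.
print(np.max(np.abs(A @ (B @ A - np.eye(n)))))  # ~0 up to rounding
```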

Mark

Let $B$ be a right inverse of $A$, so that $AB=I$. The columns of $I$ are in the column space of $A$ ($\operatorname{Col}A$). Since $Ix=0$ only if $x=0$, the column vectors of $I$ are linearly independent, and since there are $n$ of them they form a basis of $\mathbb{R}^n$. This means that $\mathbb{R}^n = \operatorname{Col}A$. Since the $n$ column vectors of $A$ span $\mathbb{R}^n$, they must be independent, hence $Av=0$ implies $v=0$. Now take any $x$ in $\mathbb{R}^n$ and let $y=BAx$, also a vector in $\mathbb{R}^n$. Then $Ay = A(BA)x = (AB)Ax = Ax$, so $A(x-y)=0$, which implies $x=y$, i.e. $BAx=x$ for every $x$. Hence $BA=I$.

$B$ is also unique, for if $AB=AC=I$, then $BAB=BAC$, and since $BA=I$ this implies $B=C$.
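Written out, the cancellation in the uniqueness claim uses the identity $BA=I$ proved above:

$$B = (BA)B = B(AB) = B(AC) = (BA)C = C.$$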