1

It is easily proved by using the notion of determinant. But how can one prove it without using determinants? Actually, this proposition is from Section 2.3 of Linear Algebra and Its Applications by D. C. Lay. The notion of "rank" is not taught before Section 2.4. What has been taught so far is the basic theory of linear transformations (including surjectivity and injectivity) and matrix multiplication. Thanks in advance!

Man Big
  • 91
  • Depends on what you know: for example $rank(AB)\leq \min(rank(A), rank(B))$? Or another proof: multiplication corresponds to composition of linear maps: if $A\circ B=id$, then clearly $A$ is surjective and $B$ injective, which means... – Peter Franek Feb 28 '16 at 07:38
  • @PeterFranek How do we know that $A$ is surjective and $B$ is injective from $AB=I$? – Man Big Feb 28 '16 at 08:17
  • @ManBig If $B(x)=B(y)$, then $x=AB(x)=AB(y)=y$. That proves injectivity of $B$. For surjectivity, any $y$ equals $AB(y)$ so it is $A$ applied to $B(y)$ and hence $A$ is surjective. – Peter Franek Feb 28 '16 at 13:05

2 Answers

1

One way to prove this result using linear transformations: let $f$ and $g$ be the two linear transformations from $\Bbb R^n$ to $\Bbb R^n$ whose matrices are $A$ and $B$, respectively. We have $f\circ g=\operatorname{id}$. If $x\in\ker g$, then $(f\circ g)(x)=f(g(x))=f(0)=0$, so $x\in\ker(f\circ g)=\ker\operatorname{id}=\{0\}$; hence $$\ker g\subset\ker(f\circ g)\implies \ker g=\{0\},$$ so $g$ is injective, and by the rank-nullity theorem $g$ is bijective. Since $g$ is bijective and $f\circ g=\operatorname{id}$, we get $f=g^{-1}$, so both $f$ and $g$ are bijective.

Edit$\quad$ To prove that $g$ is surjective without using rank, and hence without using the rank-nullity theorem, we can argue as follows:

Let $(e_1,\ldots,e_n)$ be a basis of $\Bbb R^n$, so these vectors are linearly independent. Since $g$ is injective, one checks easily that the vectors $g(e_1),\ldots,g(e_n)$ are also linearly independent, and since there are $n$ of them they form a basis of $\Bbb R^n$. Now let $y\in\Bbb R^n$; then there are $y_1,\ldots,y_n\in\Bbb R$ such that

$$y=y_1g(e_1)+\cdots+y_ng(e_n)=g(y_1e_1+\cdots+y_ne_n)=g(x)$$ hence $g$ is surjective.
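
For concreteness, here is a small $2\times 2$ illustration of this argument (my own example, not part of the original answer): take
$$B=\begin{pmatrix}1&1\\0&1\end{pmatrix},\qquad A=\begin{pmatrix}1&-1\\0&1\end{pmatrix},\qquad AB=I_2.$$
If $Bx=0$ then $x_1+x_2=0$ and $x_2=0$, so $x=0$ and $g$ is injective; moreover $g(e_1)=(1,0)^T$ and $g(e_2)=(1,1)^T$ are linearly independent and hence form a basis of $\Bbb R^2$, so $g$ is surjective. Indeed one checks that $BA=I_2$ as well.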

user296113
  • 7,570
1

An alternative version of user296113's answer, using linear systems instead of transformations:

Consider the homogeneous system $BX=0$. If $X$ is a solution, then $X = (AB)X = A(BX) = A0 = 0$. Hence the homogeneous system $BX=0$ has only the trivial solution $X=0$ $\iff$ $\operatorname{rref}(B)$ has a pivot in every column $\iff$ $\operatorname{rref}(B) = I_n$ (because $B$ is a square matrix) $\iff$ $B$ is invertible. Multiply $AB=I_n$ on the right by $B^{-1}$ to get $A = B^{-1}$, hence $A$ is invertible as well.
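
To see the row reduction concretely, here is a made-up $2\times 2$ instance (purely illustrative, not from the answer above): take $B=\begin{pmatrix}1&1\\0&1\end{pmatrix}$ and reduce the augmented matrix $[\,B\mid I_2\,]$:
$$\left(\begin{array}{cc|cc}1&1&1&0\\0&1&0&1\end{array}\right)\xrightarrow{\;R_1\to R_1-R_2\;}\left(\begin{array}{cc|cc}1&0&1&-1\\0&1&0&1\end{array}\right),$$
so $\operatorname{rref}(B)=I_2$ has a pivot in every column and $B^{-1}=\begin{pmatrix}1&-1\\0&1\end{pmatrix}$; any $A$ with $AB=I_2$ must equal this $B^{-1}$.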

Catalin Zara
  • 6,187