
Theorem 2.2. Let $A$ be an $n\times n$ matrix, and let $A^1,\dots,A^n$ be its columns. Then $A$ is invertible $\iff$ $A^1,\dots,A^n$ are linearly independent.
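(A quick illustration of the statement, my own and not from Lang: for
$$A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}$$
the columns satisfy $A^2 = 2A^1$, so they are linearly dependent, and indeed $A$ is not invertible since $\det A = 0$.)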

Here is the proof given by Serge Lang, but I think there may be a flaw where he proves that linear independence implies invertibility (the step marked by $\clubsuit$).

Suppose $A^1,\dots,A^n$ are linearly independent; then $\{A^1,\dots,A^n\}$ is a basis of $K^n$. Let $E^1,\dots,E^n$ be the unit vectors of $K^n$. He then applies the following theorem:

Theorem. Let $V$ and $W$ be vector spaces, let $\{v_1,\dots,v_n\}$ be a basis of $V$, and let $w_1,\dots,w_n$ be arbitrary elements of $W$. Then there exists a unique linear mapping $T:V\rightarrow W$ s.t. $T(v_i)=w_i$ for each $i$.

Applying this with $V = W = K^n$, $v_j = A^j$, and $w_j = E^j$, there exists a matrix $B$ (the matrix of the resulting linear map) such that $BA^j=E^j$ for each $j$, which is equivalent to saying that $BA=I$. He then concludes that $A$ is invertible. ($\clubsuit$) I think he should also show that $AB=I$.
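For comparison, here is the argument I would expect at this point; it is my own sketch, not part of Lang's text, and it uses the fact that an injective linear map of $K^n$ into itself is surjective (by rank–nullity). From $BA=I$: if $Ax=0$, then $x=(BA)x=B(Ax)=0$, so the map $x\mapsto Ax$ is injective, hence bijective on $K^n$, so $A^{-1}$ exists. Then
$$B = B(AA^{-1}) = (BA)A^{-1} = A^{-1},$$
and therefore $AB = AA^{-1} = I$.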

  • If it was proved earlier that row rank equals column rank, then you know the rows are also linearly independent, and you can use the same argument to get $AC = I$ for some $C$; when inverses exist on both sides they must be equal. Without using the independence of the rows, I am not sure it is possible to argue it. – Tony Pizza Oct 14 '23 at 07:27
  • https://math.stackexchange.com/questions/3852/if-ab-i-then-ba-i – tommy1996q Oct 14 '23 at 09:31

0 Answers