Theorem 2.2. Let $A$ be an $n\times n$ matrix, and let $A^1,\cdots,A^n$ be its columns. Then $A$ is invertible $\iff$ $A^1,\cdots,A^n$ are linearly independent.
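For a concrete instance of the statement (my own example, not Lang's): the columns of
$$A=\begin{pmatrix}1&1\\0&1\end{pmatrix}$$
are linearly independent, and indeed $A$ is invertible with $A^{-1}=\begin{pmatrix}1&-1\\0&1\end{pmatrix}$, as one can check directly.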
Here is the proof given by Serge Lang, but I think there might be a gap where he proves that linear independence implies invertibility (marked by $\clubsuit$).
Let $A^1,\cdots,A^n$ be linearly independent. Then $\{A^1,\cdots,A^n\}$ is a basis of $K^n$, since $n$ linearly independent vectors in the $n$-dimensional space $K^n$ form a basis. Let $E^1,\cdots,E^n$ be the standard unit vectors of $K^n$. Then he applies the following theorem:
Theorem. Let $V$ and $W$ be vector spaces. Let $\{v_1,\cdots,v_n\}$ be a basis of $V$, and let $w_1,\cdots,w_n$ be arbitrary elements of $W$. Then there exists a unique linear mapping $T:V\rightarrow W$ s.t. $T(v_i)=w_i$ for each $i$.
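As I understand the application: take $V=W=K^n$, $v_i=A^i$ (legitimate precisely because $\{A^1,\cdots,A^n\}$ is a basis), and $w_i=E^i$. This yields a unique linear mapping $T:K^n\rightarrow K^n$ with
$$T(A^j)=E^j\quad\text{for each }j,$$
and the matrix $B$ below is the matrix associated with $T$ relative to the standard basis.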
Thus there exists a matrix $B$ s.t. $BA^j=E^j$ for each $j$, and this is equivalent to saying that $BA=I$ (spelled out below). Then he concludes that $A$ is invertible. ($\clubsuit$) I think he should also show that $AB=I$: by definition $B$ is an inverse of $A$ only if $AB=BA=I$, and it is not obvious that a left inverse of $A$ is automatically a right inverse.
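For completeness, here is the middle equivalence as I read it, using only the fact that $AE^j=A^j$ (multiplying a matrix by the $j$-th unit vector picks out its $j$-th column):
$$(BA)E^j=B(AE^j)=BA^j=E^j=IE^j\quad\text{for }j=1,\cdots,n,$$
so $BA$ and $I$ agree on the standard basis of $K^n$, and therefore $BA=I$.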