Is there a way of proving this from the ground up, namely by brute-force computation, entry by entry?
Hint: row rank = column rank, and if all columns are linearly independent, then column rank = $n$ for $A\in \mathbb{R}^{n \times n}$. – YukiJ May 15 '18 at 10:11
2 Answers
Maybe this is what you mean with "from ground up"
If the rows of $A$ are linearly independent, then row-reducing $A$ yields the identity matrix, so the only solution of $Av=0$ is $v=0$.
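To make this row-reduction step concrete in the "brute force" spirit of the question, here is a sketch of plain Gaussian elimination in Python (the $3\times3$ matrix is a hypothetical example, and NumPy is assumed available): a square matrix with independent rows reduces all the way to the identity.

```python
import numpy as np

def rref(M, tol=1e-10):
    """Reduce M to reduced row-echelon form by plain Gaussian elimination."""
    A = M.astype(float).copy()
    rows, cols = A.shape
    r = 0
    for c in range(cols):
        # find a usable pivot in column c at or below row r
        pivot = r + np.argmax(np.abs(A[r:, c]))
        if abs(A[pivot, c]) < tol:
            continue
        A[[r, pivot]] = A[[pivot, r]]   # swap the pivot row up
        A[r] /= A[r, c]                 # scale the pivot to 1
        for i in range(rows):           # clear the rest of the column
            if i != r:
                A[i] -= A[i, c] * A[r]
        r += 1
        if r == rows:
            break
    return A

# a hypothetical matrix with linearly independent rows
A = np.array([[2, 1, 0],
              [1, 3, 1],
              [0, 1, 4]])
print(rref(A))  # the 3x3 identity, so Av = 0 forces v = 0
```

Since the reduced form is the identity, back-substitution in $Av=0$ leaves no free variables, which is exactly the claim above.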
If the columns of $A$ are linearly dependent, say, $$a_1c_1+a_2c_2+\cdots+a_nc_n=0$$ where the $c_i$ are the columns and the $a_i$ are not all zero, then $Av=0$ where $$v=(a_1,a_2,\dots,a_n)\ne0$$
So, if the columns are dependent, then so are the rows.
Now apply the same argument to $A^T$ to conclude that if the rows of $A$ are dependent then so are the columns.
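As a quick numerical sanity check of this equivalence (an illustration, not a proof), one can verify on random matrices that $\operatorname{rank}(A)=\operatorname{rank}(A^T)$, so the columns are dependent exactly when the rows are; this sketch assumes NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(100):
    n = int(rng.integers(2, 6))
    A = rng.integers(-3, 4, size=(n, n))
    # row rank equals column rank, so the rows of A are dependent
    # exactly when the columns of A (the rows of A^T) are
    assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)
print("row rank == column rank on 100 random integer matrices")
```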

I guess I would like to be convinced of your first statement: why does the rows being linearly independent imply, via row reduction, that $v=0$ is the only solution to $Av = 0$? – Ecotistician May 15 '18 at 18:41
Let $K$ be a field. $\renewcommand\Im{\operatorname{Im}}$ Consider an $n\times m$-matrix as a $K$-linear mapping $\varphi\colon K^m\to K^n$. The column rank is the dimension $\dim_K(\Im\varphi)$, while the row rank is the dimension $\dim_K(\Im\varphi^\ast)$, where $\varphi^\ast$ is the transpose of $\varphi$. Recall that we have a $K$-linear isomorphism
\begin{align}
&(\Im\varphi)^\ast\xrightarrow\sim\Im\varphi^\ast& &\lambda\mapsto\lambda\circ\varphi
\end{align}
which implies
\begin{align*}
\dim(\Im\varphi)&=\dim(\Im\varphi)^\ast\\
&=\dim(\Im\varphi^\ast)
\end{align*}
thus proving our assertion.
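For a concrete illustration of $\dim(\Im\varphi)=\dim(\Im\varphi^\ast)$ on a rectangular matrix (a hypothetical $3\times4$ example over $\mathbb{R}$, checked numerically with NumPy rather than proved):

```python
import numpy as np

# phi: R^4 -> R^3 given by A; column rank is dim Im(phi),
# row rank is dim Im(phi^T). Third row = row 1 + row 2,
# so both ranks should be 2.
A = np.array([[1, 2, 0, 1],
              [0, 1, 1, 1],
              [1, 3, 1, 2]])

col_rank = np.linalg.matrix_rank(A)    # dimension of the column space
row_rank = np.linalg.matrix_rank(A.T)  # dimension of the row space
print(col_rank, row_rank)  # prints: 2 2
```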
