
I was reading a textbook and saw the following:

Let $A \in M_{n \times m}(\mathbb{R})$ and $A^t$ the transposed matrix of $A$. It can be proven that $\operatorname{rk}(A) = \operatorname{rk}(AA^t)$.

There is no further information here and I was wondering how to prove this. And also, does this only work for $\mathbb{R}$ or does it work for other fields, e.g. $\mathbb{C}$?

Trevor Gunn
Johny Hunter
  • This holds for $\mathbb{C}$ if you take the conjugate transpose instead of the transpose. – Sahiba Arora Jun 08 '17 at 16:46
  • Can you give a counterexample in $\mathbb{C}$ when I take the transpose? – Johny Hunter Jun 08 '17 at 16:51
  • @JohnyHunter $\begin{pmatrix}1&i\\i&-1\end{pmatrix}$. – Angina Seng Jun 08 '17 at 16:57
  • Thanks a lot! And sorry for the duplicate, guys. – Johny Hunter Jun 08 '17 at 16:58
  • Usually "transpose" over $\mathbb{C}$ means the conjugate transpose $A^\dagger$ rather than just $A^t$; the latter doesn't correspond to the inner product on $\mathbb{C}$ and isn't as interesting. – anomaly Jun 08 '17 at 17:15
  • This sort of thing also fails for fields with characteristic $p \gt 0$. For example, over the field $\mathbb{Z}/2\mathbb{Z}$ the symmetric matrix $\begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}$ has rank one, but its square has rank zero. – hardmath Jun 08 '17 at 17:38
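The complex counterexample from the comments is easy to check numerically. The following sketch (using NumPy, purely for illustration; the matrix is the one Angina Seng gave) shows that the plain transpose fails over $\mathbb{C}$ while the conjugate transpose works:

```python
import numpy as np

# Counterexample over C from the comments: A = [[1, i], [i, -1]].
A = np.array([[1, 1j], [1j, -1]])

print(np.linalg.matrix_rank(A))               # 1
print(np.linalg.matrix_rank(A @ A.T))         # 0: A A^t is the zero matrix
print(np.linalg.matrix_rank(A @ A.conj().T))  # 1: conjugate transpose preserves rank
```

Here $AA^t = A^2 = 0$ because the rows of $A$ are isotropic for the (non-conjugated) bilinear form, which is exactly why the real-case proof breaks down over $\mathbb{C}$.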

2 Answers


This is usually proven as follows:

Consider the bilinear form defined by $\langle x,y \rangle = Ax \cdot Ay = x^T A^T A y$. Because the dot product is positive definite, $\langle x,x \rangle = 0$ only when $Ax = 0$. Now if $x \in \ker A^TA$, then $\langle x,x \rangle = x^T (A^TAx) = x^T \cdot 0 = 0$, so $Ax = 0$. Therefore $\ker A^TA \subseteq \ker A$. Conversely, $\ker A \subseteq \ker BA$ holds for any matrix $B$, and taking $B = A^T$ gives $\ker A^TA = \ker A$. Since $A$ and $A^TA$ have the same domain $\mathbb{R}^m$, the rank-nullity theorem shows that $A$ and $A^TA$ have the same rank.

Replacing $A$ with $A^T$ shows that $AA^T$ has the same rank as $A^T$, and $A^T$ has the same rank as $A$.
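A quick numerical sanity check of the equal-ranks conclusion above (the matrix here is an arbitrary example chosen for this illustration, not taken from the question):

```python
import numpy as np

# An arbitrary 5x3 real matrix of rank 2: row 2 is twice row 1,
# row 5 is row 3 + row 4, and row 1 is row 3 + 2*(row 4).
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.],
              [0., 1., 1.],
              [1., 1., 2.]])

r = np.linalg.matrix_rank(A)
print(r)                                    # 2
print(np.linalg.matrix_rank(A.T @ A) == r)  # True: rk(A^T A) = rk(A)
print(np.linalg.matrix_rank(A @ A.T) == r)  # True: rk(A A^T) = rk(A)
```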

Aaron

Note that $$ \mathrm{Rank}(A)=\dim \{Ax:x\in\mathbb R^m \} $$ and $$ \mathrm{Rank}(A^tA)=\dim \{A^tAx:x\in\mathbb R^m \}\le \dim \{Ax:x\in\mathbb R^m \}, $$ since $\{A^tAx\}$ is the image of $\{Ax\}$ under the linear map $A^t$, which cannot increase dimension.

For the reverse inequality, assume that $u_1,\ldots,u_k$ is a basis of $\{Ax:x\in\mathbb R^m \}$ with $\,u_1=Ax_1,\ldots, u_k=Ax_k\,$ for some $x_1,\ldots, x_k\in \mathbb R^m$. Since $A^tu_1,\ldots,A^tu_k$ lie in the range of $A^tA$, it suffices to show that they are linearly independent. Assume that $$ 0=c_1A^tu_1+\cdots+c_kA^tu_k=c_1A^tAx_1+\cdots+c_kA^tAx_k. $$ Then $$ 0=\langle c_1A^tAx_1+\cdots+c_kA^tAx_k,\, c_1x_1+\cdots+c_kx_k\rangle\\= \langle c_1Ax_1+\cdots+c_kAx_k,\, c_1Ax_1+\cdots+c_kAx_k\rangle, $$ which implies that $$ 0=c_1Ax_1+\cdots+c_kAx_k=c_1u_1+\cdots+c_ku_k, $$ and since $u_1,\ldots,u_k$ are linearly independent, $c_1=\cdots=c_k=0$.
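The key step above, that $A^tu_1,\ldots,A^tu_k$ stay linearly independent, can be checked numerically. The sketch below (with an arbitrary rank-$2$ example matrix, chosen for illustration) builds a basis of the column space from the SVD and verifies the independence:

```python
import numpy as np

# An arbitrary 3x3 example of rank 2 (row 3 = row 1 + row 2).
A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [1., 1., 2.]])

k = np.linalg.matrix_rank(A)      # k = 2
# The first k left singular vectors form a basis u_1, ..., u_k
# of the column space {Ax : x in R^m}.
U = np.linalg.svd(A)[0][:, :k]

# Stack the vectors A^t u_i as columns; full column rank
# is exactly linear independence.
print(np.linalg.matrix_rank(A.T @ U) == k)  # True
```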