Consider $A \in \mathbb{K}^{m\times n}$. I will use the following notation:
- column space of $A \equiv \text{col}(A) = \{ x \in \mathbb{K}^m: x=Ay, \text{ for some } y\in \mathbb{K}^n \}$;
- null space of $A \equiv \text{ker}(A) = \{ y \in \mathbb{K}^n: Ay=0\}$;
- rank of $A \equiv \text{rk}(A) = \text{dim} \; \text{col}(A)$.
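To fix ideas, here is a small numerical illustration of this notation — a sketch only, assuming NumPy and SciPy are available; the matrix is an arbitrary example over $\mathbb{K}=\mathbb{R}$ and plays no role in the argument.

```python
import numpy as np
from scipy.linalg import orth, null_space

# Arbitrary 3x3 example over K = R (second row is twice the first, so rank 2)
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])

col_A = orth(A)                    # orthonormal basis of col(A)
ker_A = null_space(A)              # orthonormal basis of ker(A)
rk_A  = np.linalg.matrix_rank(A)   # rk(A)

print(col_A.shape[1] == rk_A)                 # dim col(A) = rk(A)        -> True
print(ker_A.shape[1] == A.shape[1] - rk_A)    # dim ker(A) = n - rk(A)    -> True
```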
Let's start by proving a very useful equality.
Theorem 1
If $A \in \mathbb{K}^{m\times n}$ and $B \in \mathbb{K}^{n\times r}$ then
$$\text{rk}(AB)=\text{rk}(B) - \text{dim} \; ({ \text{ker}(A) \cap \text{col}(B)}).$$
Proof:
Take $S=\{x_1,\ldots,x_s\}$ as a basis for $\text{ker}(A) \cap \text{col}(B)$ and note that $({ \text{ker}(A) \cap \text{col}(B)}) \subseteq \text{col}(B)$.
If $\text{dim} \; \text{col}(B) = s+t$, then we can find an extension set $S_e=\{ z_1,\ldots,z_t \}$ such that $U = \{ x_1, \ldots, x_s,z_1, \ldots, z_t\}$ is a basis for $\text{col}(B)$. Since $\text{rk}(B)=s+t$ and $\text{dim} \; ({ \text{ker}(A) \cap \text{col}(B)})=s$, the claimed identity reduces to proving that $\text{dim}\;\text{col}(AB) = t$, which we can do by showing that $T=\{ Az_1, \ldots, Az_t \}$ is a basis for $\text{col}(AB)$.
In fact, we have that:
- If $b \in \text{col}(AB)$ then $\exists y\in \mathbb{K}^r: b=ABy$. Now,
$$By \in \text{col}(B) \Rightarrow By= \sum\limits_{i=1}^{s} {\xi_i x_i} + \sum\limits_{i=1}^{t} {\eta_i z_i}$$
so
$$b=A \left( {\sum\limits_{i=1}^{s} {\xi_i x_i} + \sum\limits_{i=1}^{t} {\eta_i z_i}} \right) = \sum\limits_{i=1}^{s} {\xi_i \underbrace{Ax_i}_{={\bf{0}}}} + \sum\limits_{i=1}^{t} {\eta_i Az_i} = \sum\limits_{i=1}^{t} {\eta_i Az_i}.$$
Hence, $T$ spans $\text{col}(AB)$.
- If ${\bf{0}} = \sum\limits_{i=1}^{t} {\alpha_i Az_i} = A \left( \sum\limits_{i=1}^{t} {\alpha_i z_i} \right)$, then $\sum\limits_{i=1}^{t} {\alpha_i z_i} \in \text{ker}(A) \cap \text{col}(B)$, so there are scalars $\beta_j$ such that
$$\sum\limits_{i=1}^{t} {\alpha_i z_i} = \sum\limits_{j=1}^{s} {\beta_j x_j} \Leftrightarrow \sum\limits_{i=1}^{t} {\alpha_i z_i} - \sum\limits_{j=1}^{s} {\beta_j x_j} = {\bf{0}}.$$
Hence, since $U$ is a basis for $\text{col}(B)$ and therefore a linearly independent set, all the $\alpha_i$ and $\beta_j$ vanish; in particular $\alpha_i=0$ for every $i$, so we conclude that $T$ is also a linearly independent set.
Thus $T$ is a basis for $\text{col}(AB)$, so $t= \text{dim} \; \text{col}(AB) = \text{rk}(AB)$, and we finally get
$$\text{rk}(B) = \text{dim} \; \text{col}(B) = s + t = \text{dim} \; ({ \text{ker}(A) \cap \text{col}(B)}) + \text{rk}(AB).$$
Q.E.D.
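As a sanity check (not a substitute for the proof), Theorem 1 can be tested numerically. The sketch below, assuming NumPy and SciPy, computes $\text{dim} \; (\text{ker}(A) \cap \text{col}(B))$ via the dimension formula $\text{dim}(U \cap V) = \text{dim}\,U + \text{dim}\,V - \text{dim}(U+V)$ applied to the bases returned by `null_space` and `orth`; the matrices are arbitrary, with one column of $B$ chosen inside $\text{ker}(A)$ so that the intersection is nontrivial.

```python
import numpy as np
from scipy.linalg import orth, null_space

rng = np.random.default_rng(0)
m, n, r = 4, 6, 3
A = rng.integers(-2, 3, size=(m, n)).astype(float)    # arbitrary A in K^{m x n}

ker_A = null_space(A)                                  # orthonormal basis of ker(A)
B = rng.integers(-2, 3, size=(n, r)).astype(float)
B = np.hstack([B, ker_A[:, :1]])                       # append a kernel vector: ker(A) ∩ col(B) != {0}

col_B = orth(B)                                        # orthonormal basis of col(B)
dim_sum = np.linalg.matrix_rank(np.hstack([ker_A, col_B]))   # dim(ker(A) + col(B))
dim_int = ker_A.shape[1] + col_B.shape[1] - dim_sum          # dim(ker(A) ∩ col(B))

print(np.linalg.matrix_rank(A @ B))                    # rk(AB)
print(np.linalg.matrix_rank(B) - dim_int)              # rk(B) - dim(ker(A) ∩ col(B)); the two agree
```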
i)
Now, let's prove that $\text{rk}(AB) \leq \text{min} \{ \text{rk}(A),\text{rk}(B) \}$.
Resorting to Theorem 1, we have
$$\tag{1} \text{rk}(AB)=\text{rk}(B) - \text{dim} \; ({ \text{ker}(A) \cap \text{col}(B)}) \leq \text{rk}(B).$$
Recalling that transposition does not alter rank, and again using Theorem 1, we get
$$\tag{2} \text{rk}(AB)=\text{rk}\left((AB)^T\right) = \text{rk}( B^T A^T) = \underbrace{\text{rk}(A^T)}_{=\text{rk}(A)} - \text{dim} \; ({ \text{ker}(B^T) \cap \text{col}(A^T)}) \leq \text{rk}(A).$$
From (1) and (2), we're able to conclude
$$\text{rk}(AB) \leq \text{min} \{ \text{rk}(A),\text{rk}(B) \}.$$
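This inequality is easy to spot-check numerically as well; here is a minimal randomized check, assuming NumPy and arbitrary matrix sizes:

```python
import numpy as np

rng = np.random.default_rng(1)
rk = np.linalg.matrix_rank
for _ in range(1000):
    A = rng.integers(-3, 4, size=(4, 5)).astype(float)
    B = rng.integers(-3, 4, size=(5, 3)).astype(float)
    assert rk(A @ B) <= min(rk(A), rk(B))   # rk(AB) <= min{rk(A), rk(B)}
```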
ii)
To prove $\text{rk}(A) + \text{rk}(B) - n \leq \text{rk}(AB)$, recall that if $X$ and $Y$ are vector spaces such that $X \subseteq Y$ then $\text{dim} \;X \leq \text{dim} \;Y$, and note that $\text{ker}(A) \cap \text{col}(B) \subseteq \text{ker}(A)$. We then have
$$\text{dim} \; (\text{ker}(A) \cap \text{col}(B)) \leq \text{dim} \; \text{ker}(A) \mathop{=}^{\text{R-N}} n - \text{rk}(A)$$
where we have resorted to the Rank-Nullity Theorem (R-N) to get the last equality.
Plugging the last expression into Theorem 1, we arrive at
$$\text{rk}(AB)=\text{rk}(B) - \text{dim} \; (\text{ker}(A) \cap \text{col}(B)) \geq \text{rk}(B) + \text{rk}(A) - n.$$
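Finally, a matching randomized spot check of this lower bound (Sylvester's rank inequality), under the same assumptions as before:

```python
import numpy as np

rng = np.random.default_rng(2)
rk = np.linalg.matrix_rank
n = 6                                        # inner dimension shared by A and B
for _ in range(1000):
    A = rng.integers(-3, 4, size=(4, n)).astype(float)
    B = rng.integers(-3, 4, size=(n, 5)).astype(float)
    assert rk(A) + rk(B) - n <= rk(A @ B)    # Sylvester's rank inequality
```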