If $A$ is an $m \times n$ matrix and $B$ an $n \times k$ matrix, prove that
$$\text{rank}(AB)\ge\text{rank}(A)+\text{rank}(B)-n.$$
Also show when equality occurs.
We claim that $\dim \ker A+\dim\ker B \geq \dim\ker (AB)$. Using the rank–nullity theorem, we can then deduce that $\text{rank}(AB) \geq \text{rank}(A) + \text{rank}(B) - n$.
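For completeness, here is that deduction spelled out ($B$ and $AB$ are defined on a $k$-dimensional space, so $\text{rank}(B)=k-\dim\ker B$ and $\text{rank}(AB)=k-\dim\ker(AB)$, while $\dim\ker A=n-\text{rank}(A)$):
$$\text{rank}(AB)=k-\dim\ker(AB)\ \geq\ k-\dim\ker B-\dim\ker A=\text{rank}(B)-\bigl(n-\text{rank}(A)\bigr)=\text{rank}(A)+\text{rank}(B)-n.$$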
Let $\beta=\{\alpha_1,\dots,\alpha_r \}$ be a basis for $\ker B$. It is not hard to see that $\ker B\subseteq \ker (AB)$, so we can extend $\beta$ to a basis $\{\alpha_1,\dots,\alpha_r,\alpha_{r+1},\dots,\alpha_s \}$ of $\ker (AB)$, where $s=\dim\ker(AB)$. Note that $B(\alpha_{i})\neq 0$ for $r< i\leq s$, since otherwise $\alpha_i$ would lie in $\ker B=\operatorname{span}(\beta)$, contradicting linear independence.
We show that $\{B(\alpha_{r+1}),\dots,B(\alpha_{s})\}$ is linearly independent. Each $B(\alpha_i)$ lies in $\ker A$, because $A\bigl(B(\alpha_i)\bigr)=(AB)(\alpha_i)=0$; so once independence is shown, we get $\dim\ker A\geq s-r$. Thus, assume that $\displaystyle\sum_{i=r+1}^s\gamma_iB(\alpha_i)=0$. Since $B$ is linear, we have $B\Bigl(\displaystyle\sum_{i=r+1}^s\gamma_i\alpha_i\Bigr)=0$, so $\displaystyle\sum_{i=r+1}^s\gamma_i\alpha_i$ belongs to $\ker B$. On the other hand, we already know that $\beta=\{\alpha_1,\dots,\alpha_r \}$ is a basis for $\ker B$. Since the set $\{\alpha_1,\dots,\alpha_r,\alpha_{r+1},\dots,\alpha_s \}$ is linearly independent, we infer that $\gamma_i$ must be zero for all $i = r+1,\dots,s$.
Now one can see that $$\dim\ker A+\dim\ker B \geq (s-r)+r =s \Longrightarrow\dim\ker A+\dim\ker B \geq \dim\ker (AB).$$
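As a quick sanity check (a small illustrative example, not part of the argument above): over any field, take $m=n=k=2$, $A=\begin{pmatrix}1&0\\0&0\end{pmatrix}$ and $B=\begin{pmatrix}0&0\\0&1\end{pmatrix}$. Then $AB=0$, so $\text{rank}(AB)=0=1+1-2$ and the bound is attained with equality; here $\ker B=\operatorname{span}\{e_1\}$, $\ker(AB)$ is the whole plane, and $B(e_2)=e_2$ indeed lies in $\ker A$.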
As noted in the other answer, it suffices to show $\dim\operatorname{Ker}(A)+\dim\operatorname{Ker}(B) \geq \dim\operatorname{Ker}(AB)$. Since $\operatorname{Ker}(B)\subseteq\operatorname{Ker}(AB)$, this is equivalent to showing that $\dim\bigl(\operatorname{Ker}(AB)/\operatorname{Ker}(B)\bigr) \leq \dim\operatorname{Ker}(A)$. To do this, apply the first isomorphism theorem for vector spaces to the linear map $\operatorname{Ker}(AB) \rightarrow \operatorname{Ker}(A)$ defined by $x \mapsto Bx$ (this map lands in $\operatorname{Ker}(A)$ because $A(Bx)=(AB)x=0$, and its kernel is exactly $\operatorname{Ker}(B)$). This shows that $\operatorname{Ker}(AB)/\operatorname{Ker}(B)$ is isomorphic to a subspace of $\operatorname{Ker}(A)$, which proves the inequality.
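Spelled out, the dimension count this gives is
$$\dim\operatorname{Ker}(AB)=\dim\operatorname{Ker}(B)+\dim\bigl(\operatorname{Ker}(AB)/\operatorname{Ker}(B)\bigr)\leq\dim\operatorname{Ker}(B)+\dim\operatorname{Ker}(A).$$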
Because I can't comment (yet), I will answer @user185640's question, asked in a comment on @Babak Miraftab's answer, namely why $\{B(\alpha_{r+1}),\ldots,B(\alpha_s)\}$ is linearly independent.
Claim: $\{B(\alpha_{r+1}),\ldots,B(\alpha_s)\}$ is linearly independent.
Proof. Suppose $\sum_{i=r+1}^{s}c_i B(\alpha_i) = 0$ for some scalars $c_i \in K$, where $K$ is the scalar field.
This would imply that $$B \left(\sum_{i=r+1}^{s}c_i\alpha_i\right) = 0,$$
or equivalently, $\sum_{i=r+1}^{s}c_i\alpha_i \in \ker B$. But we know $\beta$ is a basis for $\ker B$ (from @Babak Miraftab's proof). So we can write
$$\sum_{i=r+1}^{s}c_i\alpha_i = \sum_{i=1}^{r}d_i\alpha_i,$$
for some $d_i \in K$, or equivalently,
$$\sum_{i=r+1}^{s}c_i\alpha_i - \sum_{i=1}^{r}d_i\alpha_i = 0.$$
Expanding this sum, we can write it in the following way:
$$c_{r+1} \alpha_{r+1} + \ldots + c_{s} \alpha_{s} + (-d_1)\alpha_1 + \ldots + (-d_r)\alpha_r = 0.$$
To be completely explicit, let $e_i = -d_i$ for all $i$, which gives us
$$c_{r+1} \alpha_{r+1} + \ldots + c_{s} \alpha_{s} + e_1\alpha_1 + \ldots + e_r\alpha_r = 0.$$
Now, since $\{\alpha_1, \ldots, \alpha_s\}$ is linearly independent (as it is a basis for $\ker (AB)$), we must have
$$c_{r+1} = \ldots = c_s = e_1 = \ldots = e_r = 0.$$
In particular, $$c_{r+1} = \ldots = c_s = 0,$$
which shows that $\{B(\alpha_{r+1}),\ldots,B(\alpha_s)\}$ is indeed a linearly independent set.
Recall Linear Transformations Isomorphic to Matrix Space.
Using the rank–nullity theorem, $\operatorname{rank}(A)+\operatorname{nullity}(A)=n$, $\operatorname{rank}(B)+\operatorname{nullity}(B)=k$, and $\operatorname{rank}(AB)+\operatorname{nullity}(AB)=k.$
So, $\operatorname{rank}(A)+\operatorname{rank}(B)+\operatorname{nullity}(A)+\operatorname{nullity}(B)=n+\operatorname{rank}(AB)+\operatorname{nullity}(AB)$
$\implies \operatorname{rank}(AB)-\operatorname{rank}(A)-\operatorname{rank}(B)+n=\operatorname{nullity}(A)+\operatorname{nullity}(B)-\operatorname{nullity}(AB)$
$\geq 0$, since $\operatorname{nullity}(AB)\leq\operatorname{nullity}(A)+\operatorname{nullity}(B)$: the map $v\mapsto Bv$ sends $\operatorname{Ker}(AB)$ into $\operatorname{Ker}(A)$ (as $(AB)v=0$ means $A(Bv)=0$), and its kernel is $\operatorname{Ker}(B)$ [note $Bv_2=0$ for $v_2\in Mat_{k\times 1}(F)\implies ABv_2=0$, so $\operatorname{Ker}(B)\subseteq\operatorname{Ker}(AB)$], whence $\operatorname{nullity}(AB)-\operatorname{nullity}(B)\leq\operatorname{nullity}(A).$
Proof: Consider the column space of $A$, namely $V(A)=\{AX\in\mathbb{R}^m \mid X\in\mathbb{R}^n\}$; then $\dim V(A) = \operatorname{rank}(A)$.
The inequality to be proved is equivalent to $\operatorname{rank}(A) \leq \operatorname{rank}(AB) + \bigl(n-\operatorname{rank}(B)\bigr)$.
Likewise consider the column space of $B$, namely $V(B)=\{BY\in\mathbb{R}^n \mid Y\in\mathbb{R}^k\}$; then $\dim V(B) = \operatorname{rank}(B)$.
Choose a complement $W(B)$ of $V(B)$, so that $\mathbb{R}^n = V(B)\oplus W(B)$ and $\dim W(B) = n-\operatorname{rank}(B)$.
Any $x\in\mathbb{R}^n$ can be written as $x = v + w$ with $v\in V(B)$ and $w\in W(B)$.
Hence $Ax = Av + Aw = ABy + Aw$, where $y\in\mathbb{R}^k$ is chosen with $v = By$.
Now compare the dimensions of $V(AB)$ (the column space of $AB$) and of $A\bigl(W(B)\bigr)$.
Since $A\bigl(W(B)\bigr)$ is spanned by the images under $A$ of a basis of $W(B)$, we have $\dim A\bigl(W(B)\bigr) \leq \dim W(B)$.
It follows that $V(A) \subseteq V(AB) + A\bigl(W(B)\bigr)$, so $\dim V(A) \leq \dim V(AB) + \dim W(B)$.
Then $\operatorname{rank}(A) \leq \operatorname{rank}(AB) + \bigl(n - \operatorname{rank}(B)\bigr)$,
so $\operatorname{rank}(A) + \operatorname{rank}(B) - n \leq \operatorname{rank}(AB)$.
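Summarizing the argument in one display (in the notation above, noting that $A\bigl(V(B)\bigr)=A\bigl(B(\mathbb{R}^k)\bigr)=V(AB)$):
$$V(A)=A\bigl(V(B)\oplus W(B)\bigr)=V(AB)+A\bigl(W(B)\bigr),\qquad \dim A\bigl(W(B)\bigr)\leq\dim W(B)=n-\operatorname{rank}(B).$$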