I have a question about complex conjugation of a matrix. Prove that for any rectangular matrix $A$ the following holds
rank $A = \text{rank} \, A^*$ where $A^*$ is complex conjugate transpose of $A$.
Let $A\in \mathbb{C}^{n\times n}$. If $A$ is invertible, then $\bar A$ is invertible (since $\det \bar A=\overline{\det A}\neq 0$), and therefore $\textrm{rank}(\bar A)=\textrm{rank}(A)$.
Now let $A$ be any complex matrix and write $U=PA$, where $U=\textrm{rref}(A)$ and $P$ is invertible.
Since $\bar U$ has the same pivot columns as $U$, we get $\textrm{rank}(\bar U)=\textrm{rank}(U).$
So $\textrm{rank}(A)=\textrm{rank}(PA)=\textrm{rank}(U)=\textrm{rank}(\bar U)=\textrm{rank}(\overline{PA})=\textrm{rank}(\bar P\bar A)=\textrm{rank}(\bar A),$ where the last equality uses that $\bar P$ is invertible by the first observation.
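If it helps to see this concretely, here is a small SymPy sanity check (a made-up example, purely an illustration and not part of the proof): the pivot columns of $\textrm{rref}(A)$ and $\textrm{rref}(\bar A)$ coincide, so the ranks agree.

```python
# Illustration only (made-up matrix, not a proof): rref(A) and rref(conj(A))
# have the same pivot columns, hence the same rank.
from sympy import Matrix, I

A = Matrix([[1,   I,   2],
            [2*I, -2, 4*I]])          # rank 1: row 2 = 2i * row 1

rref_A,    pivots_A    = A.rref()
rref_Abar, pivots_Abar = A.conjugate().rref()

print(pivots_A, pivots_Abar)           # (0,) (0,)  -- same pivot columns
print(A.rank(), A.conjugate().rank())  # 1 1
```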
Here's a straightforward way to show $\text{rank}(A)=\text{rank}(\overline{A})$. Once this is done, since $A^*=(\overline{A})^T$, if you already know that $\text{rank}(A)=\text{rank}(A^T)$ for any matrix, it follows that
$$\text{rank}(A^*) = \text{rank}\big((\overline{A})^T\big) = \text{rank}(\overline{A}) = \text{rank}(A).$$
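(As a quick numerical spot check of this chain, and nothing more, here is a made-up NumPy example; the proof of $\text{rank}(A)=\text{rank}(\overline{A})$ itself follows below.)

```python
# Numerical illustration only: rank(A) == rank(A*) for a complex matrix,
# where A* = A.conj().T is the conjugate transpose.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 6)) + 1j * rng.standard_normal((4, 6))
A[2] = (1 + 2j) * A[0] - A[1]      # force a dependent row, so rank(A) = 3

print(np.linalg.matrix_rank(A))            # 3
print(np.linalg.matrix_rank(A.conj().T))   # 3
```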
Let $A\in M_{m,n}(\mathbb{C})$, and suppose $\text{rank}(A) = r$, so $A$ has $r$ linearly independent columns. Write $\text{basis}(A) = \{x_j + y_j : j\in J = \{i_1,\cdots,i_r\} \}$ for a basis of the column space, where $x_j$ collects the real parts of the entries of column $j$ and $y_j$ the purely imaginary parts, so that the column is $x_j+y_j$ and its complex conjugate is $x_j-y_j$. Let $U = \{x_{k}+y_{k} : k \in K = \{i_{r+1},\cdots,i_n\} \}$ be the remaining columns. By linear independence,
$$\sum_{j\in J} c_j(x_j+y_j) = 0 \implies c_j=0, \qquad \forall j\in J \tag{1}$$
Since $U \subseteq \text{span}(\text{basis}(A))$
$$x_k+y_k = \sum_{j\in J} c_{k,j}(x_j+y_j), \qquad \forall k \in K \tag{2}$$
Now consider $\overline{A}$, whose columns indexed by $J$ are $\overline{\text{basis}(A)} = \{x_j-y_j : j\in J\}$. We claim that $\overline{\text{basis}(A)}$ is a basis for the column space of $\overline A$. Suppose $\sum_{j\in J} b_j(x_j-y_j) = 0$; conjugating both sides gives $\sum_{j\in J} \overline{b_j} (x_j+y_j) = 0$, which implies $\overline{b_j}=0$ for all $j\in J$ by $(1)$, and hence $b_j=0$ for all $j\in J$. So
$$\sum_{j\in J} b_j(x_j-y_j) = 0 \implies b_j=0, \qquad \forall j\in J $$
which means $\overline{\text{basis}(A)}$ is linearly independent. Now consider the remaining columns $\overline{U} = \{x_{k}-y_{k} : k \in K\}$. Conjugating both sides of $(2)$ gives $x_k-y_k = \sum_{j\in J} \overline{c_{k,j}}(x_j-y_j)$, so $\overline{U} \subseteq \text{span}(\overline{\text{basis}(A)})$. Together these two facts show that $\overline{\text{basis}(A)} = \text{basis}(\overline{A})$.
We conclude that
\begin{align} \text{rank}(A) &= \dim(\text{span}(\text{basis}(A))) \\ &= |\text{basis}(A)| \\ &= |\text{basis}(\overline{A})| \\ &= \dim(\text{span}(\text{basis}(\overline{A}))) \\ &= \text{rank}(\overline{A}) \end{align}
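To make the notation concrete, here is a small made-up example (not part of the argument above): take
$$A=\begin{pmatrix}1 & i\\ i & -1\end{pmatrix},$$
whose second column is $i$ times the first, so $\text{rank}(A)=1$ and $\text{basis}(A)=\{x_1+y_1\}$ with $x_1=(1,0)^T$ and $y_1=(0,i)^T$. Then
$$\overline{A}=\begin{pmatrix}1 & -i\\ -i & -1\end{pmatrix}$$
has second column equal to $-i$ times the first, and $\overline{\text{basis}(A)}=\{x_1-y_1\}=\{(1,-i)^T\}$ is indeed a basis of its column space, so $\text{rank}(\overline{A})=1=\text{rank}(A)$.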
Let $\mathcal M$ be the space of matrices of a certain size.
Prove that $f\colon \mathcal M\to \mathcal M,\ A\mapsto \overline A$ does not change the nullity, i.e. $\dim\ker(\overline A)=\dim\ker(A)$.
Then use the Rank-nullity Theorem.
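A minimal sketch of what this hint suggests (filling in steps the hint leaves out): for $A\in\mathbb{C}^{m\times n}$ and $x\in\mathbb{C}^n$,
$$Ax = 0 \iff \overline{Ax} = 0 \iff \overline{A}\,\overline{x} = 0,$$
and conjugation carries a basis of $\ker A$ to a linearly independent spanning set of $\ker\overline{A}$ (by the same conjugation trick as in the previous answer), so $\dim\ker A = \dim\ker\overline{A}$. By the Rank-nullity Theorem,
$$\text{rank}(A) = n - \dim\ker A = n - \dim\ker\overline{A} = \text{rank}(\overline{A}),$$
and then $\text{rank}(A^*) = \text{rank}\big((\overline{A})^T\big) = \text{rank}(\overline{A}) = \text{rank}(A)$.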