
I would like to ask for your view on two different conditions for two matrices.

I have the two matrices mentioned above, say $\Lambda_1$ and $\Lambda_2$; both have dimension $N\times P$, with $N>P$. In some papers I am reading, I found that the following condition is needed:

$\Lambda_2 \neq \Lambda_1 G$

for all $P\times P$ matrices $G$. Nothing else is said about $G$.

Working on a related problem, I need the condition

$\Lambda_2=\Lambda_1C + \tilde{\Lambda}_1$

where $C$ is some matrix of dimension $P\times P$. I do not have any specific conditions on $C$ (e.g. it may or may not have full rank, it could be nonzero or all zeros, etc.). On the other hand, $\tilde{\Lambda}_1$ is an $N\times P$ matrix whose columns lie in the orthogonal complement of the column space of $\Lambda_1$, by which I mean that

$\Lambda_1^{\prime}\tilde{\Lambda}_1=0$

My question is: how different are the two conditions above? Are they equivalent, does one imply the other, or neither? Any comment would be welcome.

1 Answer


$\newcommand{\LT}{\Lambda_2}$ $\newcommand{\LO}{\Lambda_1}$We first reduce the question to a statement about individual columns. Then, we use standard facts about vectors and subspaces to compare the two conditions.

For a matrix $M$, let $M_i$ denote the $i$th column of $M$, where $i$ runs from $1$ to the number of columns of $M$.

Now, what is $\Lambda_1 G$? Recall that $\Lambda_1 G$ can be computed as follows: we compute $\Lambda_1 G_i$ (as the product of a matrix and a vector) for $i=1$ to the number of columns of $G$, and then we put these columns together as a matrix, so that $(\Lambda_1 G)_i = \Lambda_1 G_i$.
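
As a quick sanity check, here is a minimal numpy sketch of this column-wise view of the product (the dimensions and the random matrices are arbitrary, chosen only for illustration):

```python
import numpy as np

# Hypothetical dimensions, with n > p as in the question.
rng = np.random.default_rng(0)
n, p = 5, 3
Lambda1 = rng.standard_normal((n, p))
G = rng.standard_normal((p, p))

product = Lambda1 @ G
for i in range(p):
    # the i-th column of Lambda1 @ G equals Lambda1 times the i-th column of G
    assert np.allclose(product[:, i], Lambda1 @ G[:, i])
```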

With this, we make the following claim:

There exists a matrix $G$ such that $\Lambda_1 G = \Lambda_2$ if and only if there exist $p$ column vectors $G_1,\dots,G_p$, each of dimension $p \times 1$, such that $\Lambda_1 G_i = (\Lambda_2)_i$ for all $i=1,2,\dots,p$. In other words, there is a $G$ such that $\Lambda_1 G = \Lambda_2$ if and only if each column of $\Lambda_2$ lies in the image of $\Lambda_1$, viewed as a linear transformation $\mathbb R^p \to \mathbb R^n$.

The proof is clear: if $\Lambda_1 G = \Lambda_2$, then the columns of $G$ are such vectors. Conversely, if such vectors exist, we can form the matrix whose columns are these vectors in the correct order, and that matrix satisfies $\Lambda_1 G = \Lambda_2$.
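
If useful, the claim can also be checked numerically: solve the least-squares problem for $G$ and test whether the fit is exact. This is only a sketch; the helper `exists_G`, the dimensions, and the tolerance are my own choices, not part of the argument.

```python
import numpy as np

def exists_G(Lambda1, Lambda2, tol=1e-10):
    """Return True if some G satisfies Lambda1 @ G == Lambda2 (up to tol)."""
    G, *_ = np.linalg.lstsq(Lambda1, Lambda2, rcond=None)
    return np.allclose(Lambda1 @ G, Lambda2, atol=tol)

rng = np.random.default_rng(1)
n, p = 5, 3
Lambda1 = rng.standard_normal((n, p))

# Columns of Lambda2 lie in the image of Lambda1 by construction: a G exists.
print(exists_G(Lambda1, Lambda1 @ rng.standard_normal((p, p))))   # True

# A generic Lambda2 will (almost surely) have columns outside the image: no G.
print(exists_G(Lambda1, rng.standard_normal((n, p))))             # False
```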

With this, we can now investigate the problem.


We will use an important fact: if $A : \mathbb R^l \to \mathbb R^m$ is a linear transformation, then $\mbox{Im}(A') = [\ker(A)]^\perp$. A proof of this can be found here.
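
For what it is worth, this fact can also be illustrated numerically by comparing two orthogonal projectors, one onto $\mbox{Im}(A')$ and one onto $[\ker(A)]^\perp$. A minimal sketch, with sizes chosen (arbitrarily) so that the kernel is non-trivial:

```python
import numpy as np

rng = np.random.default_rng(2)
l, m = 6, 4
A = rng.standard_normal((m, l))            # A : R^l -> R^m; kernel has dimension l - rank(A) = 2 here (almost surely)

# Orthogonal projector onto Im(A') (the row space of A), via the pseudoinverse.
P_image_of_At = np.linalg.pinv(A) @ A

# Orthonormal basis of ker(A) from the SVD, then the projector onto [ker(A)]^perp.
_, s, Vt = np.linalg.svd(A)
kernel = Vt[np.sum(s > 1e-10):].T          # columns span ker(A)
P_kernel_perp = np.eye(l) - kernel @ kernel.T

print(np.allclose(P_image_of_At, P_kernel_perp))   # True, up to round-off
```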

Let's use this to prove:

Suppose that $\LT = \LO C + \widetilde{\LO}$ for a matrix $\widetilde{\LO} \neq 0$ that satisfies $\LO'\widetilde{\LO} = 0$. Then, for no matrix $G$ can it be true that $\LT = \LO G$.

Proof: Since $\widetilde{\LO} \neq 0$, there is a non-zero column of $\widetilde{\LO}$, say $(\widetilde{\LO})_i \neq 0$. Now, suppose that $\LT = \LO G$ for some matrix $G$. Taking the $i$th column on both sides and rearranging, with the help of the claim above, gives $$ (\LT)_i = (\LO G)_i \implies (\LO C + \widetilde{\LO})_i = (\LO G)_i \implies (\LO C)_i + (\widetilde{\LO})_i = (\LO G)_i \\ \implies \LO (G-C)_i = (\widetilde{\LO})_i. $$

Now, in the statement $\mbox{Im}(A') = [\ker(A)]^\perp$, replace $A$ by $A'$: this gives $\mbox{Im}(A) = [\ker(A')]^\perp$. Note that $\LO (G-C)_i \in \mbox{Im}(\LO) = [\ker(\LO')]^\perp$, while $(\widetilde{\LO})_i \in \ker(\LO')$. Therefore, the same vector belongs to both $\ker(\LO')$ and its orthogonal complement, which forces it to be zero. That is a contradiction, since we chose $i$ such that $(\widetilde{\LO})_i \neq 0$.

Therefore, no such matrix $G$ can exist. $\blacksquare$

So we know for sure that the second condition (with $\widetilde{\LO} \neq 0$) implies the first condition.
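
A quick numerical illustration of this implication; the projection used to build a nonzero $\widetilde{\Lambda}_1$ is just one convenient construction, and the dimensions are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 6, 2
Lambda1 = rng.standard_normal((n, p))
C = rng.standard_normal((p, p))

# Build a nonzero W with Lambda1' W = 0 by projecting random columns
# onto the orthogonal complement of Im(Lambda1).
P = Lambda1 @ np.linalg.pinv(Lambda1)              # projector onto Im(Lambda1)
W = (np.eye(n) - P) @ rng.standard_normal((n, p))
assert np.allclose(Lambda1.T @ W, 0) and not np.allclose(W, 0)

Lambda2 = Lambda1 @ C + W
G_best, *_ = np.linalg.lstsq(Lambda1, Lambda2, rcond=None)
print(np.allclose(Lambda1 @ G_best, Lambda2))      # False: no G reproduces Lambda2 exactly
```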


Let us now investigate whether the opposite direction is true. For this, we will use the fact that for any finite-dimensional inner product space $V$ and subspace $W$ of $V$, we have the orthogonal decomposition $V = W \oplus W^\perp$.

In our case, take $W$ to be the subspace $\mbox{Im}(\LO)$, so that $W^{\perp} = \ker(\LO')$. Thus, for any vector $x \in \mathbb R^n$, we can find vectors $v \in \mathbb R^p$ and $w \in \mathbb R^n$ such that $x = \LO v + w$, where $\LO' w = 0$.

Suppose now that $\LT$ is a given matrix. Applying the above to the columns of $\LT$, we know that there exist $v_i \in \mathbb R^p$ and $w_i \in \mathbb R^n$ such that $(\LT)_i = \LO v_i + w_i$, where $\LO' w_i =0$. Putting these together as the columns of a matrix equality gives $$ \LT = \LO V + W $$

for some $p \times p$ matrix $V$ and some $n \times p$ matrix $W$ such that $\LO'W = 0_{p \times p}$. In short, we have shown that for every pair $\LO,\LT$, a decomposition of the form appearing in the second condition always exists (possibly with $W = 0$).
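
Concretely, this decomposition can be computed with a pseudoinverse: $V$ collects the least-squares coefficients and $W$ is the residual, which is automatically orthogonal to the columns of $\Lambda_1$. A minimal sketch with arbitrary matrices:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 6, 2
Lambda1 = rng.standard_normal((n, p))
Lambda2 = rng.standard_normal((n, p))              # any Lambda2 whatsoever

V = np.linalg.pinv(Lambda1) @ Lambda2              # coefficients of the projection onto Im(Lambda1)
W = Lambda2 - Lambda1 @ V                          # the part orthogonal to Im(Lambda1)

print(np.allclose(Lambda1.T @ W, 0))               # True: Lambda1' W = 0
print(np.allclose(Lambda2, Lambda1 @ V + W))       # True: the decomposition holds
```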

This means the following:

Suppose that the second condition is false. Then, the first condition is false.

Proof: Suppose that the second condition is false. Use the decomposition above to write $\LT = \LO V + W$ with $\LO' W = 0$. Since the second condition is false, we must have $W = 0$ (otherwise this decomposition would witness the second condition). But then $\LT = \LO V$, which means the first condition is false. $\blacksquare$
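
Finally, here is a sketch that tests the two conditions against each other on random examples; the helper names `condition_1`/`condition_2` and the tolerances are my own choices, not anything from the papers in question.

```python
import numpy as np

def condition_1(L1, L2, tol=1e-10):
    """No p x p matrix G satisfies L2 = L1 @ G (tested via the least-squares fit)."""
    G, *_ = np.linalg.lstsq(L1, L2, rcond=None)
    return not np.allclose(L1 @ G, L2, atol=tol)

def condition_2(L1, L2, tol=1e-10):
    """L2 = L1 @ V + W with W nonzero and L1' W = 0 (W is the projection residual)."""
    W = L2 - L1 @ (np.linalg.pinv(L1) @ L2)
    return not np.allclose(W, 0, atol=tol)

rng = np.random.default_rng(5)
n, p = 6, 2
for _ in range(100):
    L1 = rng.standard_normal((n, p))
    # mix cases where L2 is / is not in the image of L1
    L2 = L1 @ rng.standard_normal((p, p)) if rng.random() < 0.5 else rng.standard_normal((n, p))
    assert condition_1(L1, L2) == condition_2(L1, L2)
print("the two conditions agree on every trial")
```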


To summarize, we have proven the following.

Theorem: For $n \times p$ matrices $\LO,\LT$, the following are equivalent:

  • For all $p \times p$ matrices $G$, we have $\LT \neq \LO G$.
  • There exist a $p \times p$ matrix $V$ and a non-zero $n \times p$ matrix $W$ such that $\LT = \LO V + W$, with $\LO'W = 0_{p \times p}$.
Comments:

  • thank you ever so much - very very thorough and, if I may, elegant answer which really helps me. really appreciated – Lorenzo Trapani Nov 10 '21 at 07:13
  • @LorenzoTrapani Thank you! To be honest, being thorough helps me as an answer writer; I don't like missing details, however small (although sometimes I use spoilers). Good to have been of help. – Sarvesh Ravichandran Iyer Nov 10 '21 at 08:40