$\newcommand{\LT}{\Lambda_2}$ $\newcommand{\LO}{\Lambda_1}$We first reduce this to a one-vector (column-by-column) problem. Then, we will use the single-vector theory to investigate the problem.
For a matrix $M$, let $M_i$ denote the $i$th column of $M$, where $i$ runs from $1$ to the number of columns of $M$.
Now, what is $\Lambda_1 G$? Recall that $\Lambda_1 G$ can be computed as follows: we compute $\Lambda_1 G_i$ (the product of a matrix and a vector) for $i=1$ to the number of columns of $G$, and then assemble these columns into a matrix, so that $(\Lambda_1 G)_i = \Lambda_1 G_i$.
With this, we make the following claim:
There exists a matrix $G$ such that $\Lambda_1 G = \Lambda_2$ if and only if there exist $p$ column vectors $G_1,\ldots,G_p$, each of dimension $p \times 1$, such that $\Lambda_1 G_i = (\Lambda_2)_i$ for all $i=1,2,\ldots,p$. In other words, there is a $G$ with $\Lambda_1 G = \Lambda_2$ if and only if each column of $\Lambda_2$ lies in the image of $\Lambda_1$, viewed as a linear transformation $\mathbb R^p \to \mathbb R^n$.
The proof is clear: if $\Lambda_1 G = \Lambda_2$, then we can take the $G_i$ to be the columns of such a $G$. Conversely, if such vectors exist, we can form the matrix whose columns are these vectors, in the given order, and this matrix satisfies $\Lambda_1 G = \Lambda_2$.
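If you want to test this columnwise criterion numerically, here is a minimal NumPy sketch (the names `L1`, `L2`, `G` are placeholders I introduce for $\Lambda_1$, $\Lambda_2$ and $G$; any least-squares routine would do the same job):

```python
import numpy as np

# Hypothetical stand-ins for Lambda_1 (n x p) and Lambda_2 (n x p).
n, p = 4, 2
rng = np.random.default_rng(0)
L1 = rng.standard_normal((n, p))
L2 = L1 @ rng.standard_normal((p, p))    # built so that a solution G exists

# Solve L1 @ G_i = (L2)_i column by column, exactly as in the claim.
G = np.empty((p, p))
each_column_solvable = True
for i in range(p):
    g_i, *_ = np.linalg.lstsq(L1, L2[:, i], rcond=None)
    G[:, i] = g_i
    # (L2)_i lies in Im(L1) iff the least-squares solution reproduces it exactly.
    each_column_solvable &= bool(np.allclose(L1 @ g_i, L2[:, i]))

print(each_column_solvable, np.allclose(L1 @ G, L2))   # True True for this construction
```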
With this, we can now investigate the problem.
We will use an important fact: if $A : \mathbb R^l \to \mathbb R^m$ is a linear transformation (with $A'$ denoting its transpose), then $\mbox{Im}(A') = [\ker(A)]^\perp$. This is a standard result in linear algebra.
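As a quick numerical sanity check of this fact (only a sketch with a random matrix; the variable names are mine), one can verify that the columns of $A'$, and hence everything in $\mbox{Im}(A')$, are orthogonal to a computed basis of $\ker(A)$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))          # a map R^5 -> R^3, so ker(A) is nontrivial

# Basis of ker(A) from the SVD: right singular vectors for (numerically) zero singular values.
_, s, Vt = np.linalg.svd(A)
kernel_basis = Vt[np.sum(s > 1e-12):]    # rows spanning ker(A)

# Columns of A.T (i.e. vectors of the form A' y) are orthogonal to ker(A).
print(np.allclose(kernel_basis @ A.T, 0))   # True
```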
Let's use this to prove:
Suppose that $\LT = \LO C + \widetilde{\LO}$ for some matrix $C$ and some matrix $\widetilde{\LO} \neq 0$ satisfying $\LO'\widetilde{\LO} = 0$. Then, for no matrix $G$ can it be true that $\LT = \LO G$.
Proof: Since $\widetilde{\LO} \neq 0$, there is a non-zero column of $\widetilde{\LO}$, say $(\widetilde{\LO})_i \neq 0$. Now, suppose that $\LT = \LO G$ for some matrix $G$. Taking the $i$th column on both sides and rearranging (using the columnwise description of the product above) gives:
$$
\begin{aligned}
(\LT)_i = (\LO G)_i &\implies (\LO C + \widetilde{\LO})_i = (\LO G)_i \\
&\implies (\LO C)_i + (\widetilde{\LO})_i = (\LO G)_i \\
&\implies \LO (G-C)_i = (\widetilde{\LO})_i.
\end{aligned}
$$
Now, in the statement $\mbox{Im}(A') = [\ker(A)]^\perp$, replace $A$ by $A'$ to get $\mbox{Im}(A) = [\ker(A')]^\perp$. Note that $\LO (G-C)_i \in \mbox{Im}(\LO) = [\ker(\LO')]^\perp$, while $(\widetilde{\LO})_i \in \ker(\LO')$ because $\LO'\widetilde{\LO} = 0$. Therefore the vector $(\widetilde{\LO})_i = \LO(G-C)_i$ belongs to both $\ker(\LO')$ and its orthogonal complement, which forces it to be zero. That is a contradiction, since we chose $i$ such that $(\widetilde{\LO})_i \neq 0$.
Therefore, no such matrix $G$ can exist. $\blacksquare$
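To see the obstruction concretely, here is a small sketch: I manufacture a non-zero $\widetilde{\LO}$ orthogonal to $\mbox{Im}(\LO)$ and check that even the best least-squares choice of $G$ fails to reach $\LT$ (all names below are placeholders I made up):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 5, 2
L1 = rng.standard_normal((n, p))
C = rng.standard_normal((p, p))

# Build a non-zero L1_tilde with L1.T @ L1_tilde = 0 by projecting random columns
# onto the orthogonal complement of Im(L1).
P = L1 @ np.linalg.pinv(L1)                         # orthogonal projector onto Im(L1)
L1_tilde = (np.eye(n) - P) @ rng.standard_normal((n, p))
assert not np.allclose(L1_tilde, 0) and np.allclose(L1.T @ L1_tilde, 0)

L2 = L1 @ C + L1_tilde

# The best least-squares attempt at L1 @ G = L2 still misses L2, as the proposition predicts.
G_best, *_ = np.linalg.lstsq(L1, L2, rcond=None)
print(np.allclose(L1 @ G_best, L2))                 # False: no G can work
```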
So we know that if the second condition is true, then the first condition is true.
Let us now investigate whether the converse is true. For this, we will use the fact that for any finite-dimensional inner product space $V$ and subspace $W \subseteq V$, we have the orthogonal decomposition $V = W \oplus W^\perp$.
In our case, take $W$ to be the subspace $\mbox{Im}(\LO) \subseteq \mathbb R^n$, in which case $W^{\perp} = \ker(\LO')$. Thus, for any vector $u \in \mathbb R^n$, we can find $v \in \mathbb R^p$ and $w \in \mathbb R^n$ such that $u = \LO v + w$ where $\LO' w = 0$.
Suppose now that $\LT$ is a given matrix. Applying the above to the columns of $\LT$, we find $v_i \in \mathbb R^p$ and $w_i \in \mathbb R^n$ such that $(\LT)_i = \LO v_i + w_i$ where $\LO' w_i = 0$. Assembling these as the columns of a matrix equality gives us:
$$
\LT = \LO V + W
$$
for some $p \times p$ matrix $V$ (whose columns are the $v_i$) and some $n \times p$ matrix $W$ (whose columns are the $w_i$) such that $\LO'W = 0_{p \times p}$. In short, we have shown that for every $\LO,\LT$, a decomposition of the form given in the second condition always exists, except that $W$ may possibly be zero.
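Computationally, one way (among many) to produce such a decomposition is through the Moore-Penrose pseudoinverse, since $\LO$ times its pseudoinverse is the orthogonal projector onto $\mbox{Im}(\LO)$; the sketch below assumes that route, and the function name `decompose` is my own:

```python
import numpy as np

def decompose(L1, L2):
    """Split L2 as L1 @ V + W with L1.T @ W = 0, by projecting onto Im(L1)."""
    V = np.linalg.pinv(L1) @ L2          # coefficients of the part inside Im(L1)
    W = L2 - L1 @ V                      # remainder, orthogonal to Im(L1)
    return V, W

rng = np.random.default_rng(3)
L1 = rng.standard_normal((5, 2))
L2 = rng.standard_normal((5, 2))
V, W = decompose(L1, L2)
print(np.allclose(L2, L1 @ V + W), np.allclose(L1.T @ W, 0))   # True True
```

A QR factorization or a least-squares solve would give the same projection; the pseudoinverse is just the shortest way to write it.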
This means the following:
Suppose that the second condition is false. Then, the first condition is false.
Proof: Suppose that the second condition is false. Use the decomposition above to write $\LT = \LO V + W$ with $\LO'W = 0$. Since the second condition is false, we must have $W = 0$ (otherwise $V$ and $W$ would witness the second condition). But then $\LT = \LO V$, which means the first condition is false. $\blacksquare$
To summarize, we have proven the following.
Theorem: For $n \times p$ matrices $\LO,\LT$, the following are equivalent:
- For all $p \times p$ matrices $G$, we have $\LT \neq \LO G$.
- There exist a $p \times p$ matrix $V$ and a non-zero $n \times p$ matrix $W$ such that $\LT = \LO V + W$, with $\LO'W = 0_{p \times p}$.
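As a final sanity check, the two conditions of the theorem can be tested numerically and compared; the sketch below is illustrative only, and the helper names (`no_G_exists`, `nonzero_orthogonal_part`) are ones I am introducing here:

```python
import numpy as np

def no_G_exists(L1, L2, tol=1e-10):
    """First condition: L1 @ G != L2 for every G (tested via a least-squares solve)."""
    G, *_ = np.linalg.lstsq(L1, L2, rcond=None)
    return not np.allclose(L1 @ G, L2, atol=tol)

def nonzero_orthogonal_part(L1, L2, tol=1e-10):
    """Second condition: the W in L2 = L1 @ V + W with L1.T @ W = 0 is non-zero."""
    W = L2 - L1 @ (np.linalg.pinv(L1) @ L2)
    return not np.allclose(W, 0, atol=tol)

rng = np.random.default_rng(4)
L1 = rng.standard_normal((5, 2))
for L2 in (L1 @ rng.standard_normal((2, 2)),        # a solvable case
           rng.standard_normal((5, 2))):            # a (generically) unsolvable case
    print(no_G_exists(L1, L2) == nonzero_orthogonal_part(L1, L2))   # True both times
```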