6

I came across the following question:

Assume an $n\times n$ matrix has exactly one $1$ and one $-1$ in each row and each column, and all other entries are $0$. Prove that the rows and columns can be permuted so that the resulting matrix is the negative of the original.

MY TRY: Call such a matrix $A$. All we need to do is find permutation matrices $P_{1}$ and $P_{2}$ such that $$P_{1}AP_{2} = -A.$$ $A$ can be written as a difference of two permutation matrices, i.e. $$A = P-Q,$$ where $P$ and $Q$ are permutation matrices.

Example of one such matrix of order $3\times3$: $$ \begin{pmatrix} 1 & 0 & -1 \\ -1 & 1 & 0 \\ 0 & -1 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}-\begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}.$$

We can first bring any such matrix $A$ into the form $I-R$ by multiplying by an appropriate permutation matrix: $$P^{T}A = P^{T}(P-Q) = I-R.$$ Clearly the permutation matrix $R$ cannot have a $1$ in the same position as $I$ does, so $R$ lies in the class of traceless permutation matrices. Now, if we are able to find permutation matrices $P_{1}$ and $P_{2}$ such that $$P_{1}(I-R)P_{2} = R-I = -(I-R),$$ we'll have
$$P_{1}P^{T}AP_{2} = -P^{T}A \implies PP_{1}P^{T}AP_{2} = -A $$ and we would be done.
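For a quick sanity check, here is the $3\times3$ example above pushed through this reduction (a sketch in numpy, using only the matrices already written out):

```python
import numpy as np

# The 3x3 example: A = P - Q with P = I and Q a cyclic shift.
A = np.array([[ 1,  0, -1],
              [-1,  1,  0],
              [ 0, -1,  1]])
P = np.eye(3, dtype=int)
Q = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]])

assert (A == P - Q).all()      # A is a difference of two permutation matrices

R = P.T @ Q                    # P^T A = P^T (P - Q) = I - P^T Q = I - R
assert (P.T @ A == np.eye(3, dtype=int) - R).all()
assert np.trace(R) == 0        # R is a traceless permutation matrix
```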
But how could I proceed now to find $P_{1}$ and $P_{2}$?
Would we need some extra condition coming from the fact that $R$ is a traceless permutation matrix?
It was great to see the other approaches to this problem by Michael Hoppe and user1551.
But I am curious to see how it would work out if we go this way.

ImBatman
  • 378
  • the missing link for your problem is here: https://math.stackexchange.com/questions/48134/why-are-two-permutations-conjugate-iff-they-have-the-same-cycle-structure really this isn't a linear algebra problem but a combinatorics or group theory problem. – user8675309 Jun 20 '20 at 01:19
  • @user8675309 What did you mean by missing link? Two conjugate permutations might have $1$s at the same positions, e.g. (2 5) and (1 2). Did you mean that two permutation matrices which do not have $1$s at the same positions represent conjugate permutations? – ImBatman Jun 25 '20 at 08:35
  • 1
    in the language of matrices -- that link tells you that a permutation matrix is in the same conjugacy class as another iff the two matrices have the same characteristic polynomial (ref. Newton's identities if helpful). Where I consider the conjugacy only via permutation matrices. Permutation matrix $P$ and $P^T$ have the same char poly so they are conjugate -- thus for permutation matrix Q we have $Q^T PQ = P$ and $Q^T\big(I-P\big) = I -P^T$ – user8675309 Jun 25 '20 at 16:51
  • @user8675309 Did you mean $Q^{T}PQ = P^{T}$ and $Q^{T}(I-P)Q = I-P^{T}$? – ImBatman Jul 03 '20 at 15:58
  • yes -- a couple of bad typos at the end of my comment... – user8675309 Jul 03 '20 at 17:19
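A brute-force check of the corrected identities from the comments above, $Q^{T}PQ = P^{T}$ and $Q^{T}(I-P)Q = I-P^{T}$ (a sketch; the conjugating $Q$ is found by exhaustive search, which is enough for small $n$):

```python
import numpy as np
from itertools import permutations

# P: permutation matrix of the 3-cycle (1 2 3); P and P^T have the same
# cycle structure, so some permutation matrix Q conjugates one to the other.
P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
I = np.eye(3, dtype=int)

for p in permutations(range(3)):
    Q = I[list(p)]                       # a permutation matrix
    if (Q.T @ P @ Q == P.T).all():
        assert (Q.T @ (I - P) @ Q == I - P.T).all()
        break
else:
    raise AssertionError("no conjugating Q found")
```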

4 Answers

4

Call your matrix $A$. If we replace every $-1$ in $A^\top$ with $0$, we obtain a permutation matrix $P$. Then all diagonal entries of $B=PA$ are equal to $1$.

Define a directed graph $G$ with $n$ nodes $1,2,\ldots,n$, such that node $i$ is connected to node $j$ if and only if $b_{ij}=-1$. Since each row and each column of $B$ contains exactly one $-1$, the graph $G$ can be partitioned into some $m$ disjoint cycles of lengths $l_1,l_2,\ldots,l_m$ respectively. That is, there exists a permutation $\sigma\in S_n$ such that $G$ consists of the cycles \begin{aligned} &\sigma(1)\to\sigma(2)\to\cdots\to\sigma(l_1)\to\sigma(1),\\ &\sigma(l_1+1)\to\sigma(l_1+2)\to\cdots\to\sigma(l_1+l_2)\to\sigma(l_1+1),\\ &\sigma\left(\sum_{k=1}^2l_k+1\right)\to\sigma\left(\sum_{k=1}^2l_k+2\right)\to\cdots\to\sigma\left(\sum_{k=1}^2l_k+l_3\right)\to\sigma\left(\sum_{k=1}^2l_k+1\right)\\ &\cdots\\ &\sigma\left(\sum_{k=1}^{m-1}l_k+1\right)\to\sigma\left(\sum_{k=1}^{m-1}l_k+2\right)\to\cdots\to\sigma\left(\sum_{k=1}^{m-1}l_k+l_m\right)\to\sigma\left(\sum_{k=1}^{m-1}l_k+1\right). \end{aligned} It follows that if we define a permutation matrix $Q$ such that $Q_{i,\sigma(i)}=1$ for each $i$, then $D=QBQ^\top=C_1\oplus C_2\oplus\cdots\oplus C_m$, where each $C_i$ is an $l_i\times l_i$ circulant matrix of the following form: $$ C_i=\pmatrix{1&-1\\ &1&-1\\ &&\ddots&\ddots\\ &&&1&-1\\ -1&&&&1}. $$ Flip $I_{l_i-1}$ from left to right to obtain an $(l_i-1)\times(l_i-1)$ matrix $S_i$. Then $$ \pmatrix{1\\ &S_i}C_i\pmatrix{0&1\\ I_{l_i-1}&0}\pmatrix{1\\ &S_i}=-C_i. $$ It follows that there exist two permutation matrices $R_1$ and $R_2$ such that $R_1DR_2=-D$. Thus $$ R_1QPAQ^\top R_2 =R_1QBQ^\top R_2 =R_1DR_2 =-D =-QBQ^\top =-QPAQ^\top, $$ i.e. $$ (P^\top Q^\top R_1QP)A(Q^\top R_2Q)=-A.\tag{1} $$


Illustrative example. Consider the example in Michael Hoppe's answer: $$ A=\begin{pmatrix} -1 & 0 & 1 & 0\\ 0 & -1 & 0 & 1\\ 0 & 1 & -1 & 0\\ 1 & 0 & 0 & -1 \end{pmatrix}. $$ Note that $$ P=\begin{pmatrix}0&0&0&1\\ 0&0&1&0\\ 1&0&0&0\\ 0&1&0&0\end{pmatrix} \Rightarrow B=PA=\pmatrix{1&0&0&-1\\ 0&1&-1&0\\ -1&0&1&0\\ 0&-1&0&1}. $$ The graph $G$ is a single cycle $1\to4\to2\to3\to1$. Let $\sigma(1)=1,\sigma(2)=4,\sigma(3)=2$ and $\sigma(4)=3$. Then $$ Q=\pmatrix{1&0&0&0\\ 0&0&0&1\\ 0&1&0&0\\ 0&0&1&0} \Rightarrow QBQ^\top=D=\pmatrix{1&-1\\ &1&-1\\ &&1&-1\\ -1&&&1}. $$ Finally, $$ \pmatrix{1&0&0&0\\ 0&0&0&1\\ 0&0&1&0\\ 0&1&0&0} D \pmatrix{0&0&0&1\\ 1&0&0&0\\ 0&1&0&0\\ 0&0&1&0} \pmatrix{1&0&0&0\\ 0&0&0&1\\ 0&0&1&0\\ 0&1&0&0}=-D. $$ Thus $(1)$ gives $$ \pmatrix{0&1&0&0\\ 1&0&0&0\\ 0&0&1&0\\ 0&0&0&1} A \pmatrix{0&0&0&1\\ 0&0&1&0\\ 0&1&0&0\\ 1&0&0&0}=-A. $$
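The construction above can also be carried out mechanically. Here is a sketch in numpy (the function and variable names are mine): it builds $P$ and $B$, reads the cycles off the $-1$ graph, forms $Q$, and negates each circulant block.

```python
import numpy as np

def negators(A):
    """Return permutation matrices (L, R) with L @ A @ R == -A,
    following the construction above."""
    n = len(A)
    P = (A.T == 1).astype(int)       # removing the -1s of A^T gives P
    B = P @ A                        # all diagonal entries of B are 1
    # Read the cycles off the -1 graph of B (one -1 per row and column).
    nxt = {i: int(np.where(B[i] == -1)[0][0]) for i in range(n)}
    sigma, lens, seen = [], [], set()
    for s in range(n):
        start = len(sigma)
        i = s
        while i not in seen:
            seen.add(i)
            sigma.append(i)
            i = nxt[i]
        if len(sigma) > start:
            lens.append(len(sigma) - start)
    Q = np.zeros((n, n), dtype=int)
    for i, si in enumerate(sigma):
        Q[i, si] = 1                 # Q @ B @ Q.T is the direct sum of the C_i
    # Negate each l x l block: F C (K F) = -C, where F = diag(1, flip(I))
    # and K is the cyclic shift with K[i, (i-1) mod l] = 1.
    R1 = np.zeros((n, n), dtype=int)
    R2 = np.zeros((n, n), dtype=int)
    pos = 0
    for l in lens:
        F = np.zeros((l, l), dtype=int)
        F[0, 0] = 1
        F[1:, 1:] = np.fliplr(np.eye(l - 1, dtype=int))
        K = np.roll(np.eye(l, dtype=int), -1, axis=1)
        R1[pos:pos + l, pos:pos + l] = F
        R2[pos:pos + l, pos:pos + l] = K @ F
        pos += l
    return P.T @ Q.T @ R1 @ Q @ P, Q.T @ R2 @ Q   # as in equation (1)

# The illustrative example above.
A = np.array([[-1,  0,  1,  0],
              [ 0, -1,  0,  1],
              [ 0,  1, -1,  0],
              [ 1,  0,  0, -1]])
L, R = negators(A)
assert (L @ A @ R == -A).all()
```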

user1551
  • 139,064
  • As far as I can make out from the above, $Q = I-B$ and $QBQ^T = Q(I-Q)Q^T = Q(Q^T-I) = I-Q = B$, so I don't see how you obtain the circulant matrix. – copper.hat Jun 22 '20 at 03:31
  • Perhaps I am missing something, but it is easy to start with $I-R$ where $R$ is a permutation (with no cycles of length one) such that $I-R$ is not circulant. Then $Q=R$. – copper.hat Jun 22 '20 at 03:53
  • @copper.hat Please see the example in my new edit. In that example, $I-B$ is not circulant. – user1551 Jun 22 '20 at 03:57
  • My apologies, I misinterpreted how you defined $\sigma$, in your example above I was defining $\sigma(1) = 4, \sigma(2) = 3, ...$. Sorry to waste your time. – copper.hat Jun 22 '20 at 04:05
  • @user1551 The point $D = QBQ^{T} = C_{1} \oplus C_{2} \oplus\cdots\oplus C_{m}$ is slightly unclear to me. What does $C_{i}$ denote, e.g. what is $C_{2}$? And how did you reach the conclusion that $D$ can be written as a direct sum of such circulant matrices? – ImBatman Jun 25 '20 at 09:01
2

Not a solution, but a direction in which to go

Your idea of "difference of permutations" is a nice one for describing these "good" matrices, but as you observe, it doesn't, in its current form, seem to be leading you anywhere.

You've said that not every difference of permutations is "good", and that's true. And you want to find a property that characterizes the ones that are good. And you've actually identified the property: they never have a $1$ in the same position.

Now if you have a difference of permutations that's "good", and you left-multiply by a permutation, you STILL have a difference of permutations, i.e., $P_1(P-Q) = (P_1P) - (P_1 Q)$. The only question is whether the matrices $P_1P$ and $P_1Q$ still have the "no $1$s in the same position" property.

(You then have to do the same thing for right-multiplying, but that'll be easy if the left-multiply thing works out.)

So here's a lemma to prove:

If $A, B, P$ are permutations, and $A$ and $B$ have no $1$s in the corresponding positions, then $PA$ and $PB$ have no $1$s in corresponding positions either.

That should get you going.
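If you want empirical reassurance first, here is a brute-force check of the lemma for small $n$ (a sketch in numpy; the helper `perm_mats` is mine):

```python
import numpy as np
from itertools import permutations

def perm_mats(n):
    """Yield every n x n permutation matrix."""
    I = np.eye(n, dtype=int)
    for p in permutations(range(n)):
        yield I[list(p)]

# If A and B share no 1-positions, neither do P @ A and P @ B.
n = 4
for A in perm_mats(n):
    for B in perm_mats(n):
        if not (A & B).any():                # no 1s in common
            for P in perm_mats(n):
                assert not ((P @ A) & (P @ B)).any()
```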

John Hughes
  • 93,729
  • @JohnHughes Thanks for such a nice result. Now all that needs to be done is to prove it for matrices of the form $(I-P)$, because all others may be written as a permutation of matrices of that form. But it is still unclear how we could choose $P_{1}$ and $P_{2}$ such that $P_{1}(I-P)P_{2} = (P-I)$. – ImBatman Jun 19 '20 at 06:28
2

As said in the question, it is enough to work with $A=I-R$, where $R$ is a permutation matrix without $1$ on the diagonal. Suppose that $R$ is the matrix of the permutation $p$. We will show below that every permutation is a product of two involutions, that is, we can write $p=fg$ where $f^2=g^2=id$. (Here a product $fg$ maps $i$ to $f(g(i))$ for all $i$.) If $F,G$ are the matrices corresponding to $f,g$, then we have $R=FG$ and $F^2=G^2=I$. The statement then follows from the fact that $$F(I-R)G=FG-F^2G^2=R-I=-(I-R).$$

It remains to show that every permutation $p$ is the product of two involutions. We can write $p=c_1c_2\dots c_k$ as a product of disjoint cycles $c_j$ (see here). Therefore it is sufficient to write cycles as a product of two involutions, and it is enough to do this for the cycle corresponding to the mapping $c:i\mapsto i+1 \bmod m$. Here we can write it as a product $c=fg$ where $f:i\mapsto m+1-i \bmod m$ and $g:i\mapsto m-i \bmod m$. More explicitly, a cycle $c=(a_1\,a_2\,\dots\,a_m)$ is the product $c=fg$ of the involutions $$f=\begin{pmatrix}a_1&a_2&\dots &a_m\\a_m&a_{m-1}&\dots &a_1\end{pmatrix} \mbox{ and }g=\begin{pmatrix}a_1&a_2&\dots&a_{m-1} &a_m\\a_{m-1}&a_{m-2}&\dots &a_1&a_m\end{pmatrix}.$$ The factorisations of the different cycles in the product $p=c_1c_2\dots c_k$ do not interfere with each other, as they concern disjoint sets.

This completes the proof.
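For the skeptical reader, a small numerical check of the factorisation for a single cycle (a sketch in numpy, with indices taken in $\{0,\dots,m-1\}$ so that $f:i\mapsto 1-i$ and $g:i\mapsto -i$ modulo $m$):

```python
import numpy as np

def mat(p):
    """Permutation matrix M with M[p[j], j] = 1, so mat(f) @ mat(g) = mat(f o g)."""
    n = len(p)
    M = np.zeros((n, n), dtype=int)
    for j in range(n):
        M[p[j], j] = 1
    return M

m = 5
c = [(i + 1) % m for i in range(m)]      # the cycle i -> i+1 (mod m)
f = [(1 - i) % m for i in range(m)]      # involution i -> 1-i (mod m)
g = [(-i) % m for i in range(m)]         # involution i -> -i  (mod m)

R, F, G = mat(c), mat(f), mat(g)
I = np.eye(m, dtype=int)
assert (F @ G == R).all() and (F @ F == I).all() and (G @ G == I).all()
assert (F @ (I - R) @ G == -(I - R)).all()
```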

Helmut
  • 4,944
1

Here's an algorithm to transform the matrices; I'll explain it with an example.

We want to transform $$ \begin{pmatrix} -1 & 0 & 1 & 0\\ 0 & -1 & 0 & 1\\ 0 & 1 & -1 & 0\\ 1 & 0 & 0 & -1 \end{pmatrix}\quad\text{to}\quad \begin{pmatrix} 1 & 0 & -1 & 0\\ 0 & 1 & 0 & -1\\ 0 & -1 & 1 & 0\\ -1 & 0 & 0 & 1 \end{pmatrix}. $$ We define the companion of the first matrix as $$\begin{pmatrix} 1 & 2 & 3 & 4\\ 4 & 3 & 1 & 2 \end{pmatrix},$$ where the $j$th column of the companion records the row positions of the $-1$ and the $1$ in the $j$th column of the matrix, respectively; for the first column that is $\left(\begin{smallmatrix}1\\4\end{smallmatrix}\right)$, and so on.

Now exchanging two columns in the matrix exchanges the corresponding columns of the companion; exchanging two rows $j$ and $k$ in the matrix exchanges all occurrences of the values $j$ and $k$ in the companion.

We want to go from $$\begin{pmatrix} 1 & 2 & 3 & 4\\ 4 & 3 & 1 & 2 \end{pmatrix}\quad\text{to}\quad\begin{pmatrix} 4 & 3 & 1 & 2\\ 1 & 2 & 3 & 4 \end{pmatrix}.$$ Start by exchanging the first and last columns of the companion: $$\begin{pmatrix} 4 & 2 & 3 & 1\\ 2 & 3 & 1 & 4 \end{pmatrix}.$$ Now the first column should be $\left(\begin{smallmatrix}4\\1\end{smallmatrix}\right)$, hence we swap rows $1$ and $2$ to obtain $$\begin{pmatrix} 4 & 1 & 3 & 2\\ 1 & 3 & 2 & 4 \end{pmatrix}.$$ Repeat the process with the companion's second column, that is, switch columns $2$ and $3$: $$\begin{pmatrix} 4 & 3 & 1 & 2\\ 1 & 2 & 3 & 4 \end{pmatrix}$$ and we're done already. That was easy; now for another example showing the systematic way.

Take the companion $$\begin{pmatrix} 2 & 3 & 1 & 5 & 4\\ 4 & 5 & 3 & 1 & 2 \end{pmatrix}. $$ Look for cycles in the permutation; there are two, namely $(3,5,1)$ and $(2,4)$. Now first exchange the values $3$ and $5$, that is, exchange row $3$ with row $5$ in the corresponding matrix, to get $$\begin{pmatrix} 2 & 5 & 1 & 3 & 4\\ 4 & 3 & 5 & 1 & 2 \end{pmatrix}, $$ then exchange $5$ and $1$: $$\begin{pmatrix} 2 & 1 & 5 & 3 & 4\\ 4 & 3 & 1 & 5 & 2 \end{pmatrix} $$ and for the first cycle finally $1$ and $3$: $$\begin{pmatrix} 2 & 3 & 5 & 1 & 4\\ 4 & 1 & 3 & 5 & 2 \end{pmatrix}. $$ For the second cycle exchange $2$ and $4$: $$\begin{pmatrix} 4 & 3 & 5 & 1 & 2\\ 2 & 1 & 3 & 5 & 4 \end{pmatrix}.$$ Now exchange columns to restore the correct order: $$\begin{pmatrix} 4 & 5 & 3 & 1 & 2\\ 2 & 3 & 1 & 5 & 4 \end{pmatrix}.$$ Done!

For the first example we could have performed the exchanges $1\leftrightarrow4$, $4\leftrightarrow2$, $2\leftrightarrow3$, and $3\leftrightarrow1$ and then switched the columns accordingly, but there obviously was an easier way.
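For completeness, here is a sketch of the bookkeeping in numpy ($0$-based indices; the helper names are mine). It replays the three swaps of the first example directly on the matrix and checks that the result is the negated matrix:

```python
import numpy as np

def companion(A):
    """Column j -> (row of the -1, row of the 1) in column j of A."""
    neg = [int(np.where(A[:, j] == -1)[0][0]) for j in range(len(A))]
    pos = [int(np.where(A[:, j] == 1)[0][0]) for j in range(len(A))]
    return neg, pos

def swap_rows(A, j, k):   # exchanges all values j and k in the companion
    A[[j, k]] = A[[k, j]]

def swap_cols(A, j, k):   # exchanges columns j and k of the companion
    A[:, [j, k]] = A[:, [k, j]]

A = np.array([[-1,  0,  1,  0],
              [ 0, -1,  0,  1],
              [ 0,  1, -1,  0],
              [ 1,  0,  0, -1]])
assert companion(A) == ([0, 1, 2, 3], [3, 2, 0, 1])  # the companion above, 0-based

target = -A                 # the goal: the negated matrix
swap_cols(A, 0, 3)          # exchange the first and last columns
swap_rows(A, 0, 1)          # swap rows 1 and 2
swap_cols(A, 1, 2)          # switch columns 2 and 3
assert (A == target).all()
assert companion(A) == ([3, 2, 0, 1], [0, 1, 2, 3])  # companion rows swapped
```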

Michael Hoppe
  • 18,103