4

I was looking at Gilbert Strang's lectures on Linear Algebra and noticed that in lecture 2, Elimination with Matrices, around the 40th minute, he mentions that you can use the permutation matrix $$P= \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} $$ so that $AP$ is a permutation of $A$'s columns and $PA$ is a permutation of $A$'s rows.
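For concreteness, here is a minimal NumPy sketch of that behaviour (the particular $A$ is an arbitrary example):

```python
# AP permutes the columns of A; PA permutes its rows.
import numpy as np

P = np.array([[0, 1],
              [1, 0]])
A = np.array([[1, 2],
              [3, 4]])

print(A @ P)  # [[2 1], [4 3]] -- the columns of A are swapped
print(P @ A)  # [[3 4], [1 2]] -- the rows of A are swapped
```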

I was wondering if there exists a matrix $P'$ such that $P'A$ is a permutation of $A$'s columns and $AP'$ is a permutation of $A$'s rows.

  1. How can one prove that there is no $P'$ with $P'A=AP$ and $AP'=PA$ for all $n\times n$ matrices $A$?
  2. For which matrices $A$ does such a $P'$ exist?

I tried the $2\times 2$ case.

$L$ pre-multiplies $A$ and permutes its columns, $$ \begin{bmatrix} x_1 & x_2 \\ x_3 & x_4 \end{bmatrix} \begin{bmatrix} a & b \\ c & d \end{bmatrix}=\begin{bmatrix} b & a \\ d & c \end{bmatrix} \Rightarrow \begin{bmatrix} x_1 & x_2 \\ x_3 & x_4 \end{bmatrix}=\frac{1}{ad-bc}\begin{bmatrix} bd-ac & a^2-b^2 \\ d^2-c^2 & ac-bd \end{bmatrix}=L $$

$R$ post-multiplies $A$ and permutes its rows,

$$ \begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} x_1 & x_2 \\ x_3 & x_4 \end{bmatrix}= \begin{bmatrix} c & d \\ a & b \end{bmatrix} \Rightarrow \begin{bmatrix} x_1 & x_2 \\ x_3 & x_4 \end{bmatrix}=\frac{1}{ad-bc}\begin{bmatrix} cd-ab & d^2-b^2 \\ a^2-c^2 & ab-cd \end{bmatrix}=R $$

  • $L\neq R$ in the general case
  • the formulas above require $\det(A)\neq 0$ for $L$ and $R$ to exist
  • if $\det(A)\neq 0$ and $a=d=0$, then $P'=L=R$
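These observations can be checked symbolically; a minimal SymPy sketch of the formulas derived above:

```python
# Symbolic check of the 2x2 formulas for L and R.
import sympy as sp

a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix([[a, b], [c, d]])
det = a*d - b*c

L = sp.Matrix([[b*d - a*c, a**2 - b**2],
               [d**2 - c**2, a*c - b*d]]) / det
R = sp.Matrix([[c*d - a*b, d**2 - b**2],
               [a**2 - c**2, a*b - c*d]]) / det

# L pre-multiplies A and swaps its columns; R post-multiplies A and swaps its rows.
assert sp.simplify(L*A - sp.Matrix([[b, a], [d, c]])) == sp.zeros(2, 2)
assert sp.simplify(A*R - sp.Matrix([[c, d], [a, b]])) == sp.zeros(2, 2)

# With a = d = 0 the two coincide, as noted above.
assert sp.simplify((L - R).subs({a: 0, d: 0})) == sp.zeros(2, 2)
```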

Additional notes

I feel I have to clarify further. I was looking for a matrix $P'$ that behaves just like $P$ but from the opposite side, meaning $P'$ permutes columns when pre-multiplying $A$ and rows when post-multiplying $A$.

giannisl9
  • 123
  • Are you asking to find $P^\prime$ such that for any matrix $A$, $P^\prime A$ permutes columns and $AP^\prime$ permutes rows? Clearly such a $P^\prime$ exists for specific cases of $A$, but not for arbitrary $A$. – parsiad Feb 14 '19 at 03:44
  • I tried to find it for the general $2\times 2$ case, but it seems there is no such $P'$. So I am asking how to prove that for the general $n\times n$ case, and whether there is a formula that gives $P'$ such that $P'A$ permutes columns and $P''$ such that $AP''$ permutes rows. – giannisl9 Feb 14 '19 at 03:48
  • It is not necessary that $\det(A)\neq 0$ for $R$ and $L$ to exist. For instance, if $A=0$ then $LA=A=AR$ for any ($n\times n$) matrices $L$ and $R$. – Alex Ravsky Dec 15 '20 at 03:18

4 Answers

5

$\newcommand\bm\boldsymbol$

If such a $\bm P'$ worked for all matrices, then for all $\bm A$, $$ \bm {AP} = \bm {P'A}. $$ In particular this holds for the identity matrix $\bm I$, i.e. $$ \bm {IP} = \bm {P'I}, $$ so the only candidate for $\bm P'$ is $\bm P$ itself. But clearly $$ \bm {PA} = \bm {AP} $$ holds only for specific matrices $\bm A$.

Conclusion: for some particular $\bm A$ there may exist a $\bm P'$ such that $\bm {P'A}$ swaps two columns of $\bm A$, but there exists no universal matrix $\bm P'$.
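To make the last step concrete: $\bm P' = \bm P$ works exactly when $\bm A$ commutes with $\bm P$, which fails for a generic $\bm A$. A small NumPy check (both matrices below are arbitrary examples):

```python
# One A that commutes with P and one that does not.
import numpy as np

P = np.array([[0, 1],
              [1, 0]])

A1 = np.array([[1, 2],
               [2, 1]])   # commutes with P: P A1 = A1 P
A2 = np.array([[1, 2],
               [3, 4]])   # does not commute with P

print(np.array_equal(P @ A1, A1 @ P))  # True
print(np.array_equal(P @ A2, A2 @ P))  # False -- P' = P fails for A2
```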

xbh
  • 8,835
  • Thanks for the answer! Sure, $P'$ does not always exist, but what happens when it does? How can one prove that $AP'$ is not a permutation of $A$'s rows, as it would be with $P$ acting from the opposite side? – giannisl9 Feb 14 '19 at 05:14
  • I rephrased the question following my understanding of what is going on. Your answer covers part 1 perfectly! May you please include an answer to the second part as well? – giannisl9 Feb 16 '19 at 08:05
  • @giannisl9 Sorry, that may be out of my reach. However, for simple cases, say $2\times 2$ matrices, those symmetric $\boldsymbol A$'s with equal diagonal entries (i.e. those commuting with $\boldsymbol P$) would have such $\boldsymbol P' = \boldsymbol P$. For general cases I haven't thought that far, and it would be a pretty difficult problem. Sorry again. – xbh Feb 16 '19 at 08:25
1

If there is an $n\times n$ matrix $P'$ such that $P'A=AP$ and $AP'=PA$, then $PA^2=AP'A=A^2P$, that is, $A^2$ commutes with $P$. The latter holds if $A^2$ is a linear combination of powers of $P$. Moreover, by the equivalences (I) $\Leftrightarrow$ (IV) and (I) $\Leftrightarrow$ (V) of the Theorem from Will Jagy's answer, the set of matrices commuting with $P$ coincides with the set of linear combinations of powers of $P$ iff the permutation represented by $P$ is a cycle. Conversely, if $PA^2=A^2P$ and $\det A\ne 0$, then $P'=APA^{-1}$ works: $P'A=AP$ holds immediately, and $AP'=A^2PA^{-1}=PA^2A^{-1}=PA$.
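A minimal NumPy sketch of the converse direction, with an arbitrarily chosen $A$ satisfying $A^2=2I$ (so $PA^2=A^2P$ holds trivially) while $PA\ne AP$:

```python
# Verifying P' = A P A^{-1} when P commutes with A^2 and det(A) != 0.
import numpy as np

P = np.array([[0., 1.],
              [1., 0.]])
A = np.array([[0., 2.],
              [1., 0.]])           # A^2 = 2I, det(A) = -2 != 0

Pp = A @ P @ np.linalg.inv(A)     # P' = A P A^{-1}

assert np.allclose(Pp @ A, A @ P)  # P'A = AP: permutes A's columns
assert np.allclose(A @ Pp, P @ A)  # AP' = PA: permutes A's rows
print(Pp)                          # [[0. 2.], [0.5 0.]]
```

Note that this $A$ has $a=d=0$, consistent with the asker's $2\times 2$ observation, and the resulting $P'$ is not itself a permutation matrix.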

Alex Ravsky
  • 90,434
0

Let's start with the case of $PA$. Let $\pi$ be the permutation function which generates $P=(p_{ij})$. By this, I mean $p_{ij}=1$ if and only if $\pi(j)=i$. Next, note that $$ (PA)_{ij}=\sum_{k}p_{ik}a_{kj}=a_{\pi^{-1}(i),j}. $$ That is, $PA$ is exactly the matrix whose rows are permuted according to $\pi$. Therefore, in any dimension $n\geq2$, picking $A$ such that all of its entries are distinct yields the desired result (i.e., for such an $A$, there is no permutation matrix $P$ such that $PA$ is a nontrivial permutation of the columns of $A$).

The argument is symmetric for $AP$.
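Both index identities are easy to check numerically; a sketch with a random permutation (0-indexed):

```python
# Checking (PA)_{ij} = a_{pi^{-1}(i), j} and (AP)_{ij} = a_{i, pi(j)}.
import numpy as np

rng = np.random.default_rng(0)
n = 4
pi = rng.permutation(n)          # the permutation as an array: pi(j) = pi[j]

P = np.zeros((n, n))
P[pi, np.arange(n)] = 1          # p_{ij} = 1  iff  pi(j) = i

A = rng.random((n, n))
pi_inv = np.argsort(pi)          # inverse permutation

assert np.allclose(P @ A, A[pi_inv, :])  # rows permuted according to pi
assert np.allclose(A @ P, A[:, pi])      # columns permuted according to pi
```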

parsiad
  • 25,154
  • Didn't I find such a $P$ for the $2\times 2$ case? $P'$ permutes the columns of $A$. – giannisl9 Feb 14 '19 at 04:02
  • I think I just wrote the claim poorly. Edited. – parsiad Feb 14 '19 at 04:05
  • Thanks for your answer! Since $ad-bc$ must not equal $0$, it is clear even in the general $2\times 2$ case that conditions apply for $P'$ to exist. The thing is that when it does exist, so that $P'A$ is a permutation of $A$'s columns, $AP'$ is not a permutation of $A$'s rows. This is what I am trying to say. – giannisl9 Feb 14 '19 at 05:04
0

This is a sketch of the answer to question 2.

Let $\mathcal M$ be the space of $n \times n$ matrices. We may write a generic element $M \in \mathcal M$ as

$$ \newcommand \mb { \begin{bmatrix} } \newcommand \me { \end{bmatrix} } \newcommand \md { \vdots & \ddots & \vdots \\ } \newcommand \mr [1] { m_{#1,1} & \dots & m_{#1,n} \\ } M = \mb \mr 1 \md \mr n \me $$

Let $X = \{ 1 \dots n \}$, and let $\alpha, \beta : X \to X$ be any two permutations. We may represent them as linear transformations $\alpha, \beta : \mathcal M \to \mathcal M$, where $\alpha$ permutes the rows of its argument, and $\beta$ permutes the columns of its argument. That is,

$$ \newcommand \rp [1] { m_{\alpha(#1), 1} & \dots & m_{\alpha(#1), n} \\ } \newcommand \cp [1] { m_{#1, \beta(1)} & \dots & m_{#1, \beta(n)} \\ } \alpha(M) = \mb \rp 1 \md \rp n \me, \qquad \beta (M) = \mb \cp 1 \md \cp n \me $$

We want to find the matrices $M \in \ker {(\alpha - \beta)}$. Working elementwise, we want $m_{\alpha(i), j} = m_{i, \beta(j)}$ for every $i, j \in X$. This is a homogeneous linear system of $n^2$ equations in $n^2$ variables.

Note that the entries of $M$ are indexed by $X^2$. Define the equivalence relation $\sim_{\alpha, \beta}$ on $X^2$, where $(i,j) \sim_{\alpha, \beta} (k,l)$ if and only if the equation $m_{ij} = m_{kl}$ can be deduced from the system. Let $S = X^2 / \sim_{\alpha, \beta}$ be the set of equivalence classes of $\sim_{\alpha, \beta}$. Then, $M \in \ker {(\alpha - \beta)}$ if and only if there exist scalars $\{ w_s \}_ {s \in S}$ such that $m_{ij} = w_{\overline {(i,j)}}$ for every $i, j \in X$.

Actually computing $\sim_{\alpha, \beta}$ is a combinatorial problem, outside of my field of expertise.
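That said, for any concrete $\alpha$ and $\beta$ the classes can be computed mechanically by running union-find over the equations $m_{\alpha(i),j} = m_{i,\beta(j)}$. A minimal sketch (0-indexed; the example permutations are arbitrary):

```python
# Computing the equivalence classes of ~_{alpha,beta} by union-find.
def kernel_classes(alpha, beta, n):
    parent = list(range(n * n))          # one node per matrix entry (i, j)

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    # one equation m_{alpha(i), j} = m_{i, beta(j)} per pair (i, j)
    for i in range(n):
        for j in range(n):
            x, y = find(alpha[i] * n + j), find(i * n + beta[j])
            parent[x] = y

    classes = {}
    for i in range(n):
        for j in range(n):
            classes.setdefault(find(i * n + j), []).append((i, j))
    return list(classes.values())

# Example: alpha = beta = the transposition swapping the two indices, n = 2.
print(kernel_classes([1, 0], [1, 0], 2))
# [[(0, 0), (1, 1)], [(0, 1), (1, 0)]] -- two classes, so dim ker(alpha - beta) = 2
```

For this example the kernel consists of the matrices $\begin{bmatrix} w_1 & w_2 \\ w_2 & w_1 \end{bmatrix}$, i.e. exactly the matrices commuting with the $P$ from the question.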

isekaijin
  • 1,765