
I came across an interesting problem in a linear algebra problem book. It appears in the very first section of the book, which deals with the basic operations on matrices: multiplication by a scalar, addition, matrix multiplication, and transposition. I solved all the previous problems without much trouble, but I can't do anything with this one.

Problem. Let $X$ be a matrix of a general form. Is it always possible to find matrices $A$ and $B$ such that $AXB=X^T$?

I used two approaches:

  1. Using the general formula for matrix multiplication. This is a complete nightmare: double sums come out. A dead end.

  2. Using some specific matrices as $X$. Matrix units seem to be a good option; they are very easy to multiply. But for any matrix unit there will always be $A$ and $B$ such that $AXB=X^T$. Dead end again.
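For instance (a quick numerical illustration of this dead end, with sizes and indices chosen arbitrarily): for a matrix unit $X = e_ie_j^T$, even a non-square one, the choice $A = B = X^T$ already works, since $X^TXX^T = X^T$ for a matrix unit.

```python
import numpy as np

n, m, i, j = 3, 2, 1, 0        # arbitrary illustrative sizes and indices
X = np.zeros((n, m))
X[i, j] = 1.0                  # the matrix unit e_i e_j^T
A = B = X.T                    # for a matrix unit, A = B = X^T already works
assert np.allclose(A @ X @ B, X.T)
```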

I've seen the questions "Is it possible to transpose a square matrix by multiplication?" and "Can you transpose a matrix using matrix multiplication?", but I think the answers there are not really relevant to my question.

Nick Sm
  • If $X$ is a matrix over a number field, then $X$ is similar to $X^{T}$. Thus you can find a matrix $P$ such that $P^{-1}XP=X^{T}$. – fusheng Jun 18 '23 at 02:34
  • I wonder if you can exploit elementary matrices, where right multiplication corresponds to column operations and left multiplication corresponds to row operations? That being said, all you would need is to track the row/column operations that take you from $X$ to $X^T$. (I guess here I am assuming $X$ is square.) – An Isomorphic Teen Jun 18 '23 at 02:42
  • @fusheng, matrix similarity is defined only for square matrices. The problem is about all matrices. – Nick Sm Jun 18 '23 at 02:48
  • @AnIsomorphicTeen, no, $X$ doesn't have to be square. Actually, there is a problem later in the book, in the section about elementary matrices: Can the operation of transposing a general-form matrix be reduced to elementary transformations of its rows and columns? The answer is: Not always. Hint: see problem number 1.36.2 (the problem in my starting post). :D – Nick Sm Jun 18 '23 at 02:56
  • Are you asking for a specific $X$ or if the transposition operator is representable as above? – copper.hat Jun 18 '23 at 13:22
  • @copper.hat, I don't understand your question, to be honest. I suppose that the intended solution is to find a counterexample somehow, because there is nothing to grasp on to if one wants to draw deductions from the general formula of matrix multiplication. – Nick Sm Jun 18 '23 at 20:28
  • @NickSm Is $X$ a fixed matrix or are you looking for a formula that holds for all $X$ (the latter does not exist for $n>1$)? – copper.hat Jun 18 '23 at 20:31
  • @copper.hat, I'm trying to prove that this formula does not hold for all X. – Nick Sm Jun 18 '23 at 20:39
  • @NickSm Then I do not understand your comment to my answer. – copper.hat Jun 18 '23 at 20:40
  • @copper.hat, maybe ''matrix of a general form'' is confusing? I don't like this wording. The problem can be reformulated as follows. Is it true that for all real matrices $X$ there exist matrices $A$ and $B$ such that $AXB=X^T$. – Nick Sm Jun 18 '23 at 22:44
  • @NickSm I see, I thought you were looking for $A,B$ such that $AXB = X^T$ for all $X$. – copper.hat Jun 18 '23 at 23:53

3 Answers


The question is: given a real $X$, are there $A,B$ such that $AXB= X^T$?

If $X$ is square then https://math.stackexchange.com/a/94617/27978 shows that $X,X^T$ are similar. This is the tricky part; the rest is straightforward.

If $AXB = X^T$ then $B^T X^T A^T = X$, so the problems for $X$ and $X^T$ are equivalent; hence we may assume $n>m$ (more rows than columns), the square case having been dealt with above.

Using elementary row operations, we may write $PX = \begin{bmatrix} U \\ 0 \end{bmatrix}$ for some invertible $P$, where $U$ is square and upper triangular. Since $U$ is square, it is similar to its transpose: there is some invertible $V$ such that $V U V^{-1} = U^T$.

Then $\begin{bmatrix} V & 0 \end{bmatrix} PX \begin{bmatrix} V^{-1} & 0\end{bmatrix} = \begin{bmatrix} V & 0 \end{bmatrix} \begin{bmatrix} U \\ 0 \end{bmatrix} \begin{bmatrix} V^{-1} & 0\end{bmatrix} = \begin{bmatrix} U^T & 0 \end{bmatrix} = X^T P^T$.

Choosing $A=\begin{bmatrix} V & 0 \end{bmatrix} P$ and $B=\begin{bmatrix} V^{-1} & 0\end{bmatrix} P^{-T}$ gives the desired result.
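Since constructing the similarity $V$ is the fiddly part to code, here is a quick numerical sanity check of the existence claim using a different shortcut (not the argument above): over the reals one may take $A = X^T$ and $B = X^+$, the Moore–Penrose pseudoinverse, because $XX^+$ is the symmetric projector onto $\operatorname{col}(X)$, so $X^T(XX^+) = (XX^+X)^T = X^T$.

```python
import numpy as np

rng = np.random.default_rng(0)
for n, m in [(3, 3), (4, 2), (2, 5)]:
    X = rng.standard_normal((n, m))
    X[:, -1] = X[:, 0]                 # make X rank-deficient for good measure
    # A = X^T and B = X^+ always work over the reals: X @ pinv(X) is the
    # symmetric projector onto col(X), so X.T @ (X @ pinv(X)) equals X.T.
    A, B = X.T, np.linalg.pinv(X)
    assert np.allclose(A @ X @ B, X.T)
```

Note that $X^+$ is not invertible in general, which is consistent with the later problem in the book: transposition cannot always be achieved by elementary (invertible) row and column operations.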


Original answer (I thought the question was to determine if $A,B$ exist so that $AXB = X^T$ for all $X$.)

It is vacuously true for $1 \times 1$ matrices :-). So suppose we are dealing with $n \times m$ matrices where $m$ or $n$ is greater than one.

Choose $X= e_i e_j^T$. Computing the $(a,b)$ entry of each side of $Ae_i e_j^TB=e_j e_i^T$ gives $A_{ai} B_{jb} = \delta_{aj} \delta_{bi}$.

Choosing $a=j$, $b=i$ gives $A_{ji} B_{ji} = 1$; since $i,j$ were arbitrary, all elements of $A,B$ are non-zero. Now take $a \neq j$ or $b \neq i$ (at least one such choice exists because $m$ or $n$ exceeds one); then $A_{ai} B_{jb} = 0$, a contradiction.
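The entrywise argument above can also be checked mechanically. Here is a small sympy sketch (the symbol names are mine) confirming that, in the $2\times 2$ case, no single pair $(A,B)$ transposes every matrix unit simultaneously:

```python
import sympy as sp

# Unknown 2x2 matrices A and B (symbol names are mine).
a = sp.symbols('a0:4')
b = sp.symbols('b0:4')
A = sp.Matrix(2, 2, a)
B = sp.Matrix(2, 2, b)

def unit(i, j):
    """The 2x2 matrix unit e_i e_j^T."""
    U = sp.zeros(2, 2)
    U[i, j] = 1
    return U

# Collect the entrywise equations A_{ai} B_{jb} = delta_{aj} delta_{bi}
# by requiring A*X*B == X.T for every matrix unit X.
eqs = []
for i in range(2):
    for j in range(2):
        X = unit(i, j)
        eqs.extend(A * X * B - X.T)

solutions = sp.solve(eqs, list(a + b))
assert solutions == []        # no fixed pair (A, B) works for all X
```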

copper.hat
  • If X is a matrix unit, then it is always possible to find A and B (one switch of rows and one switch of columns). I don't think that ''$A_{ji}B_{ji}=1$ and so all elements of $A,B$ are non zero'' is true. – Nick Sm Jun 18 '23 at 08:21
  • Oh, I'm wrong big time. This problem messes up my head, sorry. This argument with switching rows and columns can only be used for square matrices. But still I don't understand how ''$A_{ji}B_{ji}=1$ and so all elements of $A,B$ are non zero'' can be true. – Nick Sm Jun 18 '23 at 22:48
  • @NickSm I updated my answer for a specific $X$. – copper.hat Jun 19 '23 at 00:53
  • copper hat, thank you very much for this answer, although it is too advanced for me (for the moment, I hope). I noted that this is the very beginning of the problem book, so the reader is not required to have knowledge beyond high school. I understand the idea of your answer, but the proof of some details is not yet available to me (for example, why the matrix $P$ will always be invertible). I guess the authors of the book screwed up and put this problem in the wrong place. – Nick Sm Jun 21 '23 at 12:05
  • @NickSm One can show that a matrix $A$ is similar to its transpose, that is, there exists an invertible $V$ such that $VA = A^TV$. However, it is not a result that typically appears in a linear algebra textbook. – copper.hat Jun 21 '23 at 16:27

If $X$ is a matrix over a field, then using elementary row and column operations, $X$ can be diagonalized. That is, there exist invertible square matrices $L$ and $R$ such that $$L X R = D.$$ Now proceed by showing that a diagonal matrix can be transposed as desired. Finally, reverse the row and column operations (switching the roles of rows and columns) to obtain the transpose of $X$ itself.
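The middle step (transposing a rectangular diagonal matrix $D$) can be done with rectangular identity matrices: if $D$ is $n\times m$ and $S$ is the $m\times n$ matrix with ones on its main diagonal, then $SDS = D^T$. A small numpy check with illustrative values:

```python
import numpy as np

n, m = 2, 4
D = np.zeros((n, m))
D[0, 0], D[1, 1] = 5.0, 7.0   # a rectangular "diagonal" matrix
S = np.eye(m, n)              # the m x n rectangular identity
# S @ D places the diagonal entries into an m x m diagonal matrix,
# and the second S trims it back to the m x n shape of D.T.
assert np.allclose(S @ D @ S, D.T)
```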

WimC
  • X in the problem does not have to be square. – Nick Sm Jun 18 '23 at 20:14
  • @NickSm That is also not assumed here. Diagonal means $D_{ij}=0$ if $i \neq j$. – WimC Jun 19 '23 at 04:18
  • WimC, sorry, I didn't know that. I checked three different books before writing my first comment, and in each of them only square matrices were called diagonal. – Nick Sm Jun 21 '23 at 12:08
  • @NickSm Yeah, terminology is not always consistent. At least wikipedia seems to agree with the general case. :-) – WimC Jun 21 '23 at 13:58

$\def\eqdef{\stackrel{\text{def}}{=}}$ Here are details of the procedure described by WimC in his answer. When he refers to $\ X\ $ being "diagonalized", I believe he was referring to obtaining a matrix of the form of the right hand side of the first equation below, not necessarily square.

Let $\ X\ $ be an $\ n\times m\ $ matrix of rank $\ r\,(\le\min(n,m))\ $. By row reduction we can find an invertible $\ n\times n\ $ matrix $\ E\ $ such that $\ EX\ $ is in reduced row echelon form. Permuting the pivot columns to the front and then clearing the remaining non-pivot columns with elementary column operations, we can further find an invertible $\ m\times m\ $ matrix $\ P\ $ (in general not merely a permutation, since the non-pivot columns of $\ EX\ $ need not be zero) such that
$$ EXP=\pmatrix{I_{r\times r}&0_{r\times(m-r)}\\ 0_{(n-r)\times r}&0_{(n-r)\times(m-r)}}\ . $$
Transposing $\ X=E^{-1}(EXP)P^{-1}\ $ gives
$$ X^T=\big(P^T\big)^{-1}\pmatrix{I_{r\times r}&0_{r\times(n-r)}\\ 0_{(m-r)\times r}&0_{(m-r)\times(n-r)}}\big(E^T\big)^{-1}\ .\hspace{1em}\tag{1}\label{eq1} $$
But, putting $\ \nu\eqdef n-r,\ \mu\eqdef m-r\ $ for brevity,
\begin{align} &\pmatrix{I_{r\times r}&0_{r\times\nu}\\ 0_{\mu\times r}&0_{\mu\times\nu}}\pmatrix{I_{r\times r}&0_{r\times\mu}\\ 0_{\nu\times r}&0_{\nu\times\mu}}\pmatrix{I_{r\times r}&0_{r\times \nu}\\0_{\mu\times r}&0_{\mu\times\nu}}\\ =&\pmatrix{I_{r\times r}&0_{r\times\mu}\\ 0_{\mu\times r}&0_{\mu\times \mu}}\pmatrix{I_{r\times r}&0_{r\times \nu}\\0_{\mu\times r}&0_{\mu\times\nu}}\\ =&\pmatrix{I_{r\times r}&0_{r\times \nu}\\0_{\mu\times r}&0_{\mu\times\nu}}\ , \end{align}
which is the matrix in the centre of the right hand side of equation \eqref{eq1}. Thus, now putting
\begin{align} N&\eqdef\pmatrix{I_{r\times r}&0_{r\times\nu}\\ 0_{\mu\times r}&0_{\mu\times\nu}}\\ M&\eqdef\pmatrix{I_{r\times r}&0_{r\times \nu}\\0_{\mu\times r}&0_{\mu\times\nu}} \end{align}
(both $\ m\times n\ $) and substituting the expression $\ N\pmatrix{I_{r\times r}&0_{r\times\mu}\\ 0_{\nu\times r}&0_{\nu\times\mu}}M\ $ for the matrix in the centre of equation \eqref{eq1}, we finally have
\begin{align} X^T&=\big(P^T\big)^{-1}N\pmatrix{I_{r\times r}&0_{r\times\mu}\\ 0_{\nu\times r}&0_{\nu\times\mu}}M\big(E^T\big)^{-1}\\ &=\big(P^T\big)^{-1}N(EXP)M\big(E^T\big)^{-1}\\ &=AXB\ , \end{align}
where $\ A\eqdef\big(P^T\big)^{-1}NE\ $ and $\ B\eqdef PM\big(E^T\big)^{-1}\ $.
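The whole procedure can be mechanized. Below is a sympy sketch (function and variable names are mine) that follows these steps: row-reduce to obtain $E$, permute the pivot columns of $EX$ to the front, clear the remaining columns so that the right-hand factor $P$ is invertible (though in general not a pure permutation), and assemble $A$ and $B$.

```python
import sympy as sp

def transpose_factors(X):
    """Return (A, B) with A*X*B == X.T via the rank-normal-form argument."""
    n, m = X.shape
    # Row-reduce [X | I_n]; the right block is an invertible E with E*X = rref(X).
    reduced, piv = X.row_join(sp.eye(n)).rref()
    E = reduced[:, m:]
    pivots = [p for p in piv if p < m]              # pivot columns of X itself
    r = len(pivots)
    # Column operations: permute the pivot columns to the front ...
    order = pivots + [c for c in range(m) if c not in pivots]
    Pperm = sp.eye(m).extract(list(range(m)), order)
    R = (E * X) * Pperm                              # = [[I_r, K], [0, 0]]
    K = R[:r, r:]
    C = sp.eye(m)
    C[:r, r:] = -K                                   # ... then clear K
    P = Pperm * C                                    # E*X*P is the n x m normal form
    # The m x n normal-form matrices N and M from the answer.
    N = sp.zeros(m, n)
    for i in range(r):
        N[i, i] = 1
    M = N.copy()
    A = (P.T).inv() * N * E
    B = P * M * (E.T).inv()
    return A, B

X = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])                           # rank-1 illustrative example
A, B = transpose_factors(X)
assert A * X * B == X.T
```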

lonza leggiera