
In a derivation I encountered the following problem: Let $\mathbf{U}$ be an orthogonal matrix and $\mathbf{D}$ be a diagonal matrix with pairwise different, strictly positive elements, both of dimension $n$. The orthogonal similarity transformation $\mathbf{U}^T \mathbf{D} \mathbf{U} = \mathbf{M}$ should turn $\mathbf{D}$ into a matrix $\mathbf{M}$ whose diagonal elements are all identical to each other ($\mathbf{M}$ is not necessarily diagonal; in fact there is no solution with diagonal $\mathbf{M}$).

How can I determine an orthogonal matrix $\mathbf{U}$ which fulfills this condition?

Is it always possible to find such a matrix $\mathbf{U}$, regardless of the dimension $n$ and regardless of the choice of diagonal elements in $\mathbf{D}$?

There is a related question, but it only concerns $2 \times 2$ matrices: Is there a similarity transformation rendering all diagonal elements of a matrix equal?
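For the $2 \times 2$ case from the related question, a rotation by $\pi/4$ is one explicit solution: it averages the two diagonal entries. A minimal numerical sketch (the diagonal entries of $\mathbf{D}$ are arbitrary example values):

```python
import numpy as np

# 2x2 case: a rotation by pi/4 equalizes the diagonal of R^T D R.
D = np.diag([1.0, 3.0])            # pairwise different, strictly positive
c = s = 1.0 / np.sqrt(2.0)
R = np.array([[c, -s],
              [s,  c]])            # orthogonal: R.T @ R = I
M = R.T @ D @ R

print(np.diag(M))                  # both entries equal trace(D)/2
```

Both diagonal entries of $\mathbf{M}$ come out as $(d_1 + d_2)/2$, while the off-diagonal entries are nonzero, so $\mathbf{M}$ is not diagonal.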

Any ideas on this one? Thank you!

Ralf
  • Are you allowing $U$ to be the identity matrix? What about if $U$ is a permutation matrix? – user759562 Mar 22 '20 at 10:33
  • If $\mathbf{U}$ were the identity matrix, $\mathbf{M}$ would be identical to $\mathbf{D}$, so its diagonal elements would not be identical. If $\mathbf{U}$ were a permutation matrix, $\mathbf{M}$ would be diagonal and contain the same elements as $\mathbf{D}$, but in permuted order, so again the diagonal elements would not be identical (see https://math.stackexchange.com/q/3362592/702757). By the way, $\mathbf{M}$ can't be diagonal, since a similarity transformation preserves the eigenvalues and therefore the diagonal elements of diagonal matrices. – Ralf Mar 22 '20 at 11:02
  • Ah, I see, there may be a misunderstanding: All diagonal elements of $\mathbf{M}$ should be identical to each other, not to the elements of $\mathbf{D}$. I'll edit the question accordingly. – Ralf Mar 22 '20 at 11:21
  • what's the field here, $\mathbb R$? You can do this over reals though it takes a little work and is indirect. As is often the case, if you instead work over $\mathbb C$ and instead of orthogonal use the term unitary $\mathbf U$, so $\mathbf U^* \mathbf D \mathbf U =\mathbf M$, there is an extremely short and nice proof. – user8675309 Mar 22 '20 at 21:00
  • @user8675309: The field is the real numbers, but I would be very much interested to see the proof for the complex case. Could you be so kind as to post it as an answer? Thanks a lot! If you even know how to do it over the reals, and find the time to post that as well, that would be even better. Looking forward to seeing your answer. – Ralf Mar 23 '20 at 08:56

1 Answer


In the complex/unitary case, consider setting
$\mathbf U := \mathbf F^*$

where $\mathbf F$ is the Discrete Fourier Transform matrix, normalized to be unitary. (Conventions vary -- some authors write $\frac{1}{\sqrt{n}}\mathbf F$ for the unitary version.) Then $\mathbf U^* \mathbf D \mathbf U = \mathbf {FDF}^*$ is a circulant matrix and hence has constant elements on the diagonal.
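This claim is easy to check numerically. A sketch using numpy (the size $n$ and the diagonal entries are arbitrary example values):

```python
import numpy as np

n = 5
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(2j * np.pi * j * k / n) / np.sqrt(n)   # unitary DFT matrix

D = np.diag([1.0, 2.0, 3.5, 4.0, 7.0])            # distinct positive entries
M = F @ D @ F.conj().T                            # = F D F*

print(np.diag(M).real)                            # constant: trace(D)/n
```

Each diagonal entry of $\mathbf{M}$ equals $\operatorname{tr}(\mathbf{D})/n$, as expected from the circulant structure.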

In particular, using associativity and the outer-product interpretation of matrix multiplication (with $\mathbf f_j$ the $j$-th column of $\mathbf F$), check that
$\mathbf {FDF}^* =\big(\mathbf {FD}\big)\mathbf F^* = \sum_{j=1}^n \lambda_j\cdot \mathbf f_j \mathbf f_j^* $

and that the diagonal of each $\mathbf f_j \mathbf f_j^*$ is constant. Written with the Hadamard product:
$\mathbf I\circ\big(\mathbf f_j \mathbf f_j^*\big)=\frac{1}{n}\mathbf I$

which gives the result.
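Both steps of the argument can be verified numerically as well (numpy sketch; the size and the values $\lambda_j$ are arbitrary example choices):

```python
import numpy as np

n = 4
k = np.arange(n)
F = np.exp(2j * np.pi * np.outer(k, k) / n) / np.sqrt(n)  # unitary DFT

lam = np.array([1.0, 2.0, 5.0, 9.0])                      # diagonal of D
D = np.diag(lam)

# Sum of rank-one outer products lambda_j * f_j f_j^* reproduces F D F*.
S = sum(lam[j] * np.outer(F[:, j], F[:, j].conj()) for j in range(n))
print(np.allclose(S, F @ D @ F.conj().T))                 # True

# Each outer product f_j f_j^* has constant diagonal 1/n.
for j in range(n):
    print(np.diag(np.outer(F[:, j], F[:, j].conj())).real)
```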

user8675309
  • Thanks a lot! I'll need some time to fully understand this. Just one question: The $\lambda_j$ is not meant to be an eigenvalue, but just a diagonal element of $\mathbf{D}$, right? – Ralf Mar 24 '20 at 06:52
  • In the case of a diagonal matrix, they are the same. I could have written $d_{j,j}$ instead. – user8675309 Mar 24 '20 at 08:35
  • I tested some products $\mathbf{f}_j\mathbf{f}_j^*$ from the examples given at https://en.wikipedia.org/wiki/DFT_matrix and it seems to work: the diagonal of the (complex) outer product has the same elements. How did you find out about this property? – Ralf Mar 24 '20 at 10:15
  • Just for others who might be interested: $\mathbf{F}_{jk} = \exp(2\pi i (j-1)(k-1)/n)$ for $j,k=1,\ldots,n$ and $i^2=-1$, so $\mathbf{F}_{jk} \mathbf{F}_{jk}^* = \exp(2\pi i (j-1)(k-1)/n) \exp(-2\pi i (j-1)(k-1)/n) = \exp(0) = 1$, thus $\operatorname{diag}(\mathbf{f}_j\mathbf{f}_j^*) = \mathbf{1}$. – Ralf Mar 24 '20 at 11:24
  • I was too slow for an edit: I omitted the factor $1/\sqrt{n}$ in the equations above. – Ralf Mar 24 '20 at 11:31
  • I just noticed that for the real case, Hadamard matrices (with a factor $1/n$) fulfill the desired property. However, they are not available for all matrix sizes, and apparently it is not even clear for which sizes they exist (https://en.wikipedia.org/wiki/Hadamard_matrix). I also wonder whether the condition $\operatorname{diag}(\mathbf{u}_j\mathbf{u}_j^T) = \mathbf{1}$ is always required to fulfill the desired property, or whether the property could be obtained only after the addition of all outer vector products. – Ralf Mar 24 '20 at 12:15
  • Right -- ignoring rescaling by a positive constant, the $2 \times 2$ Hadamard matrix $H_2$ is a DFT. Then for dimensions $2^m \times 2^m$ you can easily get the result over the reals via Kronecker products of $H_2$. The underlying idea is that the eigenvalues of a Hermitian matrix majorize its diagonal, then (assuming no periodicity issues) passing to a limit on the doubly stochastic matrix that relates the two, and then working backwards. None of that is needed per se -- the circulant matrix is a very common and useful matrix, so you could also get to this result by pattern recognition. – user8675309 Mar 24 '20 at 19:07
  • I get the part on the Hadamard matrices, but could I kindly ask you to elaborate on the rest of your comment ("The underlying idea ... pattern recognition.")? This remained opaque to me. Maybe add a separate answer? Thanks a lot! – Ralf Mar 25 '20 at 06:23
  • Just a correction of my own comment: I think with the Hadamard matrices, the factor should be $1/\sqrt{n}$. – Ralf May 04 '20 at 13:14
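The Kronecker-product construction from the comments can be sketched as follows (numpy; the exponent $m$ and the diagonal entries are arbitrary example values). Since all entries of $\mathbf{U}$ have magnitude $1/\sqrt{n}$, every diagonal entry of $\mathbf{U}^T \mathbf{D} \mathbf{U}$ equals $\operatorname{tr}(\mathbf{D})/n$:

```python
import numpy as np

# Real orthogonal U for n = 2^m via Kronecker powers of the
# 2x2 Hadamard matrix (normalized by 1/sqrt(2) to be orthogonal).
H2 = np.array([[1.0,  1.0],
               [1.0, -1.0]]) / np.sqrt(2.0)

m = 3
U = H2
for _ in range(m - 1):
    U = np.kron(U, H2)                   # orthogonal, size 2^m

n = 2 ** m
D = np.diag(np.arange(1.0, n + 1.0))     # distinct positive entries
M = U.T @ D @ U

print(np.diag(M))                        # every entry = trace(D)/n
```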