19

Let $M,N$ be $n \times n$ matrices over an algebraically closed field such that the traces of all corresponding powers of the two matrices coincide; more specifically, suppose that $\mathrm{Tr}(M^k) = \mathrm{Tr}(N^k)$ for all $1\leq k \leq n$. The following question about eigenvalues is then natural, and I was thinking it would be an application of Cayley-Hamilton, but I am having trouble writing out a proof.

How do we show that $M$ and $N$ have the same eigenvalues?

Added (because this question is now the target of many duplicates, it should state its hypotheses properly): assume that all the mentioned values of $k$ are nonzero in the field considered; in other words, either the field has characteristic $0$, or its prime characteristic $p$ satisfies $p>n$.
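For what it's worth, here is a quick numerical sanity check in Python/NumPy (not a proof); the helper `charpoly_from_traces` is just an ad-hoc implementation of Newton's identities, which recover the characteristic polynomial from the power traces:

```python
import numpy as np

def charpoly_from_traces(traces):
    """Recover [1, c_{n-1}, ..., c_0] of det(xI - M) from t_k = Tr(M^k), k = 1..n,
    using the recursion k*e_k = sum_{i=1}^{k} (-1)^(i-1) e_{k-i} t_i (Newton)."""
    n = len(traces)
    e = [1.0]                                   # e_0 = 1; e_k = k-th elementary symmetric function
    for k in range(1, n + 1):
        s = sum((-1) ** (i - 1) * e[k - i] * traces[i - 1] for i in range(1, k + 1))
        e.append(s / k)
    return [(-1) ** m * e[m] for m in range(n + 1)]   # coefficient of x^{n-m} is (-1)^m e_m

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
traces = [np.trace(np.linalg.matrix_power(M, k)) for k in range(1, n + 1)]

print(np.round(charpoly_from_traces(traces), 6))
print(np.round(np.poly(M), 6))   # same coefficients, hence the same eigenvalues
```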

PinkyWay
  • 4,565
user7980
  • 3,073
  • I am suspicious about this result... because I realize that it means it is also true for diagonal matrices with integer entries and thus gives a quite astonishing result for number theory. – Patrick Da Silva Dec 05 '11 at 02:03
  • 8
    The result is false over arbitrary fields: if the field has characteristic $p\gt 0$, then take $M$ to be the $p\times p$ zero matrix, and $N$ to be the $p\times p$ identity matrix. So presumably, you are working over $\mathbb{R}$ or $\mathbb{C}$ or some other restriction? – Arturo Magidin Dec 05 '11 at 02:03
  • Yes, sorry, I forgot to give conditions on the base field: we can assume the matrices are over an algebraically closed field. – user7980 Dec 05 '11 at 02:32
  • 3
    @user7980: The examples I give still hold over an algebraically closed field of positive characteristic. You need to assume more than that: as Splice notes, you need either $n$ to be smaller than the characteristic, or for the characteristic to be zero. – Arturo Magidin Dec 05 '11 at 02:58
  • @Patrick: What result is it you allude to? – Arturo Magidin Dec 05 '11 at 03:30
  • I am not suspicious anymore after reading Splice's answer; it is quite convincing. I meant that this result implies statements such as: $$ \text{if } \sum_{i=1}^n a_i^k = \sum_{i=1}^n b_i^k \text{ for all } 1 \le k \le n, \text{ then the } a_i \text{ coincide with the } b_i \text{ up to reordering}, $$ because the eigenvalues of the diagonal matrices with the $a_i$'s (respectively the $b_i$'s) on the diagonal are precisely the $a_i$'s (respectively the $b_i$'s), so this statement is a special case of this question. It seems non-trivial if $n$ is large and we restrict our attention to integers. – Patrick Da Silva Dec 05 '11 at 03:33
  • 4
    @PatrickDaSilva If you work over the integers/reals and all eigenvalues are real, there is a very simple argument to prove this result: if $|a_1| \leq |a_2| \leq \cdots \leq |a_n|$ then $\lim_k \frac{\sum_{i=1}^n a_i^{2k}}{a_n^{2k}}=\lim_k \frac{\sum_{i=1}^n b_i^{2k}}{a_n^{2k}}$. The left side counts the multiplicity of $a_n^2$ as an eigenvalue of $A^2$, while the right side is finite and nonzero if and only if $b_n^2=a_n^2$. Then you get that the multiplicity of $b_n^2$ is the same... You eliminate these and repeat... Then it is easy to take care of signs... (a small numeric illustration is sketched after these comments). – N. S. Dec 05 '11 at 07:16
  • Hm. This is what I wasn't able to understand from the question posted in user1551's answer, you explained it pretty well. Thanks a lot! – Patrick Da Silva Dec 05 '11 at 07:25
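A tiny Python sketch (my own numeric illustration of N. S.'s comment above, not a proof):

```python
# For real a_i, the ratio sum_i a_i^(2k) / max_i |a_i|^(2k) tends to the
# multiplicity of the largest |a_i|, so the power sums reveal the a_i step by step.
a = [3.0, -3.0, 2.0, 1.0, 1.0]          # largest |a_i| is 3, occurring twice
a_max = max(abs(x) for x in a)
for k in (5, 10, 20, 40):
    print(k, sum(x ** (2 * k) for x in a) / a_max ** (2 * k))   # approaches 2.0
```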

2 Answers

27

$M$ and $N$ have the same eigenvalues (counted with algebraic multiplicity) if and only if their characteristic polynomials are the same. Thus it suffices to show that the power sums $T_k = \sum _{i=1}^{n} \alpha^k_i$ for $k = 1$ to $n$ generate the ring of symmetric polynomials in the eigenvalues $\alpha_i$ (the coefficients of the characteristic polynomial are, up to sign, the elementary symmetric polynomials in the eigenvalues). This is a result due to Newton. For example, if $S_k$ is the usual $k$-th elementary symmetric polynomial (the sum of all products of $k$ distinct $\alpha_i$), then one has:

$$S_1 = \sum \alpha_i = T_1,$$

$$S_2 = \sum_{i > j} \alpha_i \alpha_j = \frac{1}{2} \left( \left(\sum \alpha_i\right)^2 - \sum \alpha^2_i \right) = \frac{1}{2}(T^2_1 - T_2)$$

More generally (with the convention $S_0 = 1$), one has:

$$\log \sum_{k=0}^{n} S_k X^k = \log \prod_{i=1}^{n} (1 + \alpha_i X) = \sum_{i=1}^{n} \log(1 + \alpha_i X)$$

which, expanding the logarithm, becomes:

$$ \sum_{i=1}^{n} \sum_{j=1}^{\infty} \frac{\alpha^j_i (-1)^{j-1} X^j}{j} = \sum_{j=1}^{\infty} \frac{T_j (-1)^{j-1} X^j}{j}$$

In particular, comparing coefficients of $X^j$ for $j = 1$ to $n$, from the $S_k$ one can determine all the $T_k$, and from the $T_k$ (for $k = 1$ to $n$) one can determine the $S_k$ (and hence all the $T_k$ as well).
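For what it is worth, here is a small SymPy sketch (my addition, not part of the argument) that checks the generating-function identity above for $n = 3$:

```python
import sympy as sp

n = 3
X = sp.symbols('X')
alphas = sp.symbols(f'alpha1:{n + 1}')   # alpha1, alpha2, alpha3

# Left side: log of prod_i (1 + alpha_i X), expanded as a power series in X up to X^n.
product = sp.Integer(1)
for a in alphas:
    product *= (1 + a * X)
lhs = sp.series(sp.log(product), X, 0, n + 1).removeO()

# Right side: sum_{j=1}^{n} (-1)^(j-1) T_j X^j / j, with T_j the power sums.
T = {j: sum(a**j for a in alphas) for j in range(1, n + 1)}
rhs = sum((-1)**(j - 1) * T[j] * X**j / sp.Integer(j) for j in range(1, n + 1))

# The difference vanishes identically (up to the X^{n+1} truncation).
print(sp.simplify(sp.expand(lhs - rhs)))   # prints 0
```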

The identity above even shows that the ring generated by the $T_k$ over any ring $R$ for $k = 1$ to $n$ is the same as the ring generated by the $S_k$ for $k = 1$ to $n$, as long as $n!$ is invertible in $R$. So the result also holds over any field of characteristic $p > n$. It is false if $p \le n$: for example, the $p \times p$ identity matrix and all its powers have trace $p = 0$, just like the zero matrix and its powers, even though the eigenvalues differ.
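A trivial check of this counterexample in plain Python (my own sketch), reducing traces mod $p$:

```python
# The p x p identity matrix I and the zero matrix Z satisfy I^k = I and Z^k = Z,
# so over F_p every power trace of I is p = 0, the same as every power trace of Z,
# even though their eigenvalues (1 versus 0) differ.
p = 5
I = [[int(i == j) for j in range(p)] for i in range(p)]
Z = [[0] * p for _ in range(p)]

def trace_mod_p(A):
    return sum(A[i][i] for i in range(len(A))) % p

print(trace_mod_p(I), trace_mod_p(Z))   # 0 0
```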

user1551
  • 139,064
Splice
  • 391
3

What about this one?

We know that if $\psi_A(x) = |xI_n - A| = x^n+c_{n-1}x^{n-1}+c_{n-2}x^{n-2}+\cdots+c_1x+c_0$ is the characteristic polynomial of $A$ (so that $c_0 = (-1)^n|A|$), then the coefficients are given by $$c_{n-m}=\frac{(-1)^m}{m!}\left| \begin{array}{ccccc} t_1 & m-1 & 0 & \cdots & 0 \\ t_2 & t_1 & m-2 & \cdots & 0 \\ \vdots & \vdots & & \ddots & \vdots \\ t_{m-1} & t_{m-2} & \cdots & t_1 & 1 \\ t_m & t_{m-1} & \cdots & t_2 & t_1 \\ \end{array} \right|$$ where $t_r:= \operatorname{tr}(A^r)$. (Note the $m!$ in the denominator; this is where the assumption on the characteristic is used.)

Assume that $\psi_{M}(x)=x^n+a_{n-1}x^{n-1}+a_{n-2}x^{n-2}+\cdots+a_1x+a_0$ and $\psi_N(x)=x^n+b_{n-1}x^{n-1}+b_{n-2}x^{n-2}+\cdots+b_1x+b_0$. Then $$a_r=b_r \quad \text{for all } r$$ (by using $\operatorname{tr}(M^k) = \operatorname{tr}(N^k)$ and the determinant formula above).

Hence $\psi_M(x)=\psi_N(x)$, which means $M$ and $N$ have the same eigenvalues.
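A quick numerical check of the determinant formula above (my own sketch, assuming NumPy; note the $m!$):

```python
import math
import numpy as np

def coeff_from_traces(traces, m):
    """c_{n-m} of det(xI - A) via the determinant formula: (-1)^m / m! times the
    determinant of the m x m matrix built from t_r = Tr(A^r) as displayed above."""
    B = np.zeros((m, m))
    for i in range(m):
        for j in range(i + 1):
            B[i, j] = traces[i - j]        # t_{i-j+1} on and below the diagonal
        if i + 1 < m:
            B[i, i + 1] = m - 1 - i        # superdiagonal m-1, m-2, ..., 1
    return (-1) ** m / math.factorial(m) * np.linalg.det(B)

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
traces = [np.trace(np.linalg.matrix_power(A, k)) for k in range(1, n + 1)]

print([round(coeff_from_traces(traces, m), 6) for m in range(1, n + 1)])
print(np.round(np.poly(A)[1:], 6))       # matches c_{n-1}, ..., c_0
```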

KON3
  • 4,111