9

In short: why is the determinant of a symmetric matrix an irreducible polynomial in the upper-triangular entries, viewed as variables?

The determinant of a square matrix $(x_{ij}) \in \Bbb F^{n \times n}$ is of course a polynomial $p$ of degree $n$ in the $n^2 $ variables $x_{ij}$. This polynomial is irreducible, a fact which is proved nicely here.

When restricting to symmetric matrices, the determinant is a different polynomial $f$ in the $\frac{n(n+1)}{2}$ variables $(x_{ij})_{i\leq j}$. Note that $f$ is quadratic in each off-diagonal variable $x_{ij}$ ($i<j$) but linear in each diagonal variable $x_{ii}$, whereas $p$ is linear in every variable.
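For instance, for $n=2$,
$$p = x_{11}x_{22}-x_{12}x_{21}, \qquad f = x_{11}x_{22}-x_{12}^2,$$
so $f$ is indeed quadratic in the off-diagonal variable $x_{12}$ and linear in the diagonal variables $x_{11},x_{22}$.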

How do we know that $f$ is an irreducible polynomial too? This post gives a seemingly-simple proof that I don't get:

Suppose by contradiction that $f=gh$. Then we claim that $f=q^2$ for some polynomial $q$ in the variables $(x_{ij})_{i\leq j}$. To justify this, we look at the identity $\det (AA^T)=\det(A)^2$, so we know that $p(x_{ij})^2$ is always equal in value to $f\big((y_{ij})_{i\leq j}\big)$, where $y_{ij}=\sum \limits_k x_{ik}x_{jk}$. Since the variables $y_{ij}$ and $x_{ij}$ are not the same, I don't understand how we can conclude from the identity $p(x_{ij})^2=f(y_{ij})$ that the required $q$ exists, and I also don't see where we use our assumption, made for contradiction, that $f$ is reducible.
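(For concreteness, here is a small sympy check of this identity for $n=3$; the code and variable names are purely illustrative.)

```python
import sympy as sp

n = 3
# a generic (not necessarily symmetric) n x n matrix A = (x_ij)
A = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"x{i}{j}"))

S = A * A.T            # symmetric, with entries y_ij = sum_k x_ik * x_jk
p_squared = (A.det())**2
f_at_y = S.det()       # f evaluated at the y_ij

print(sp.expand(f_at_y - p_squared) == 0)   # True: p(x)^2 == f(y)
```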

I arrived at the question in the book by Shafarevich, where he sadly just comments that it is "easy to see".

Emolga
  • 3,527

3 Answers

2

This can be done in a way similar to the argument described in the post linked in the question for general square matrices. Letting $f\in\mathbb F[x_{ij}]_{1\le i\le j\le n}$ be the determinant of the symmetric matrix, suppose that it decomposes as $f=gh$. We need to show that $g$ or $h$ is constant. This follows from the following properties of the determinant (when expressed as a linear combination of monomials in the $x_{ij}$):

  1. For each $i=1,\ldots,n$, $f$ is first order (degree $1$) in $x_{ii}$.
  2. For each $1\le i < j\le n$, $f$ contains monomials involving $x_{ij}$, but no monomial divisible by $x_{ii}x_{ij}$ or by $x_{jj}x_{ij}$.

I will make use of the following simple statements about factorisation in polynomial rings. (i) If a polynomial that is first order in $x$ factorises into a product of two polynomials, then one factor is first order in $x$ and the other is independent of $x$. (ii) If $f=gh$ where $g$ is first order in $x$, $h$ is independent of $x$, and $h$ depends on $y$, then $f$ contains monomials divisible by $xy$ (the coefficient of $x$ in $f$ is the product of the coefficient of $x$ in $g$ with $h$, which still depends on $y$).

Now for the argument that if $f=gh$ then either $g$ or $h$ is constant. By property (1) above, for each $i$, either $g$ is first order in $x_{ii}$ and $h$ is independent of $x_{ii}$, or vice versa.

Choose any $1\le i < j\le n$ such that $g$ is linear in $x_{ii}$. Then:

  • $h$ is independent of $x_{ii}$.
  • $h$ is also independent of $x_{ij}$, otherwise, by (ii), $gh$ would contain terms containing $x_{ii}x_{ij}$, contradicting (2).
  • $g$ depends on $x_{ij}$ otherwise $gh$ would be independent of $x_{ij}$, contradicting (2).
  • $g$ is linear in $x_{jj}$ otherwise $h$ would be linear in $x_{jj}$ and $gh$ would contain terms containing $x_{jj}x_{ij}$, contradicting (2).

Supposing (wlog) that $g$ is linear in $x_{11}$, applying the above argument with $i=1$ shows that $g$ is linear in every $x_{jj}$; it can then be applied to each pair $i < j$ to show that $h$ is independent of all the indeterminates $x_{ii}$, $x_{jj}$, $x_{ij}$ and, hence, is constant.
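As a sanity check, here is a small sympy sketch (for $n=3$, purely illustrative) confirming properties (1) and (2) and that the symmetric determinant has a single nonconstant irreducible factor over $\mathbb{Q}$:

```python
import sympy as sp

n = 3
# symmetric matrix built from the upper-triangular variables x_ij, i <= j
sym = {(i, j): sp.Symbol(f"x{i}{j}") for i in range(n) for j in range(i, n)}
X = sp.Matrix(n, n, lambda i, j: sym[(i, j)] if i <= j else sym[(j, i)])
f = X.det()

gens = list(sym.values())
poly = sp.Poly(f, *gens)
pos = {v: k for k, v in enumerate(gens)}

def occurs_together(u, v):
    # True if some monomial of f involves both u and v
    return any(m[pos[u]] > 0 and m[pos[v]] > 0 for m in poly.monoms())

# Property (1): f is first order (degree 1) in each diagonal variable
print([sp.degree(f, sym[(i, i)]) for i in range(n)])              # [1, 1, 1]

# Property (2), checked for i = 0, j = 1: f involves x_01,
# but no monomial is divisible by x_00*x_01 or x_11*x_01
print(sp.degree(f, sym[(0, 1)]) >= 1)                             # True
print(occurs_together(sym[(0, 0)], sym[(0, 1)]))                  # False
print(occurs_together(sym[(1, 1)], sym[(0, 1)]))                  # False

# A single nonconstant irreducible factor over Q
print(sp.factor_list(f))
```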

1

Denote by $\phi$ the $\mathbb F$-algebra homomorphism ${\mathbb F}[x_{ij}]_{i\leq j} \to {\mathbb F}[x_{ij}]_{1\le i,j\le n}$ uniquely defined by $\phi(x_{ij})=y_{ij}=\sum_k x_{ik}x_{jk}$ for $i\leq j$.

I claim that the kernel of $\phi$ is zero. To see why, let $h\in{\mathsf Ker}(\phi)$. We can view $h$ as a function on symmetric $n\times n$ matrices in the obvious way. Then the hypothesis says that $h$ is zero on all matrices of the form $AA^T$. Let $M=(m_{ij})$ be a symmetric, positive definite $(n-1)\times (n-1)$ matrix. Let $v\in{\mathbb R}^{n-1}$, $a\in{\mathbb R}$ and let $M'$ be the $n\times n$ symmetric matrix defined by

$$ M'= \left(\begin{array}{cc} M & v \\ v^T & a \end{array}\right) $$

There is a threshold $\alpha(M,v)$ such that $M'$ is still positive definite whenever $a>\alpha(M,v)$. For those values of $a$, $M'$ has a unique positive definite (symmetric) square root $B$, so $M'=BB^T$ and hence $h(M')=0$. Since $h$ is a polynomial and the set of such pairs $(v,a)$ has nonempty interior, we see that for this fixed $M$ the polynomial $(v,a)\mapsto h(M,v,a)$ vanishes identically. Iterating this argument in smaller dimensions, we eventually obtain $h=0$, as desired.
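(One can in fact take $\alpha(M,v)=v^TM^{-1}v$: by the Schur-complement criterion, $M'$ is positive definite iff $M$ is positive definite and $a-v^TM^{-1}v>0$. Below is a small numerical illustration with numpy; the random data is purely illustrative.)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

B = rng.standard_normal((n - 1, n - 1))
M = B @ B.T + np.eye(n - 1)          # a positive definite (n-1) x (n-1) block
v = rng.standard_normal((n - 1, 1))

# Schur-complement criterion: M' > 0  iff  M > 0 and a - v^T M^{-1} v > 0,
# so alpha(M, v) = v^T M^{-1} v is an explicit threshold.
alpha = (v.T @ np.linalg.solve(M, v)).item()

for a in (alpha - 0.5, alpha + 0.5):
    Mp = np.block([[M, v], [v.T, np.array([[a]])]])
    pos_def = np.all(np.linalg.eigvalsh(Mp) > 0)
    print(f"a = {a:.3f}  (a > alpha: {a > alpha})  M' positive definite: {pos_def}")
```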

So the kernel of $\phi$ is zero and $\phi$ is therefore injective.

Write $f$ as a product of irreducibles, $f=f_1f_2\cdots f_r$ (where some of the $f_i$ may coincide, but all the $f_i$ are nonconstant). We then have the identity $p^2=\phi(f_1)\phi(f_2)\cdots \phi(f_r)$. All of the $\phi(f_i)$ are nonconstant (if $\phi(f_i)$ were a constant $c$, injectivity would force $f_i=c$), and $p$ is irreducible, so by unique factorisation, under the assumption that $f$ is reducible (i.e. $r\ge 2$), the only possibility is $r=2$ and $\phi(f_1)=\phi(f_2)=p$ after rescaling by constant factors.

Since $\phi$ is injective, we deduce $f_1=f_2$ (up to a constant factor), and $q=f_1$ is the $q$ you're looking for.

Ewan Delanoy
  • 61,600
0

Clearly $\det$ on complex symmetric matrices has no square factors, since $\det$ restricted to diagonal matrices has none. Therefore, we have to show that the variety of complex symmetric matrices with determinant $0$ is irreducible. But it is the image of the map (reduction of a complex quadratic form to diagonal form) $$M_{n\times (n-1)}(\mathbb{C}) \to M_{n\times n}(\mathbb{C}),\qquad A \mapsto A\cdot A^t,$$ so, being the image of an irreducible variety, it is also irreducible.
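As a quick illustration of one inclusion (the image lands in the determinant-zero locus), here is a tiny sympy check for $n=3$; the converse inclusion uses the diagonalisation of complex quadratic forms mentioned above.

```python
import sympy as sp

# a generic 3 x 2 matrix A; A*A^t is then a symmetric 3 x 3 matrix of rank <= 2
A = sp.Matrix(3, 2, lambda i, j: sp.Symbol(f"a{i}{j}"))
S = A * A.T

print(S.is_symmetric())       # True
print(sp.expand(S.det()))     # 0: the image lies in the determinant-zero locus
```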

A similar approach works for the usual determinant. (However, it will not work for the determinant on skew-symmetric matrices: for even $n$ that determinant is the square of the Pfaffian, hence reducible.)

Note that $\det$ restricted to some classes of matrices (circulant matrices, or group determinants) is reducible. But it should be irreducible on Hankel, Toeplitz, and Hessenberg matrices.
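For instance, here is a quick sympy check of the circulant case for $n=3$ (over $\mathbb{Q}$ the factor $a+b+c$ already splits off):

```python
import sympy as sp

a, b, c = sp.symbols("a b c")
circulant = sp.Matrix([[a, b, c],
                       [c, a, b],
                       [b, c, a]])

# a^3 + b^3 + c^3 - 3*a*b*c = (a + b + c)*(a^2 - a*b - a*c + b^2 - b*c + c^2)
print(sp.factor(circulant.det()))
```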

orangeskid
  • 53,909