In short: why is the determinant of a symmetric matrix an irreducible polynomial in the upper-triangular entries, taken as variables?
The determinant of a square matrix $(x_{ij}) \in \Bbb F^{n \times n}$ is of course a polynomial $p$ of degree $n$ in the $n^2 $ variables $x_{ij}$. This polynomial is irreducible, a fact which is proved nicely here.
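For reference, and to make the degree claims below explicit, $p$ is given by the Leibniz expansion
$$p(x_{ij}) = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma)\prod_{i=1}^n x_{i\,\sigma(i)},$$
so each monomial picks exactly one entry from every row and every column; this is why $p$ has degree $n$ and is linear in each individual variable.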
When looking at symmetric matrices instead, the determinant is a different polynomial $f$ in $\frac{n(n+1)}{2}$ variables, namely $(x_{ij})_{i\leq j}$. Note that $f$ is quadratic in each off-diagonal variable (and linear in each diagonal variable), whereas $p$ is linear in every variable.
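For concreteness, in the $2\times 2$ case one has
$$f(x_{11},x_{12},x_{22}) = \det\begin{pmatrix} x_{11} & x_{12}\\ x_{12} & x_{22}\end{pmatrix} = x_{11}x_{22} - x_{12}^2,$$
which is linear in the diagonal variables $x_{11},x_{22}$ but quadratic in the off-diagonal variable $x_{12}$.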
How do we know that $f$ is an irreducible polynomial too? This post gives a seemingly simple proof that I don't get:
Suppose, for contradiction, that $f=gh$ is a nontrivial factorization. Then we claim that $f=q^2$ for some polynomial $q$ in the variables $(x_{ij})_{i\leq j}$. To justify this, we use the identity $\det(AA^T)=\det(A)^2$, which tells us that $p(x_{ij})^2$ is always equal in value to $f\big((y_{ij})_{i\leq j}\big)$, where $y_{ij}=\sum\limits_k x_{ik}x_{jk}$.

Since the variables $y_{ij}$ and $x_{ij}$ are not the same, I don't understand how the identity $p(x_{ij})^2=f(y_{ij})$ lets us claim that the required $q$ exists, and I also don't see where the assumption that $f$ is reducible is used.
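For what it's worth, here is the identity written out in the $2\times 2$ case (my own computation, just to make the substitution explicit): with $A=(x_{ij})$ and $Y=AA^T$ one gets $y_{11}=x_{11}^2+x_{12}^2$, $y_{12}=x_{11}x_{21}+x_{12}x_{22}$, $y_{22}=x_{21}^2+x_{22}^2$, and indeed
$$f(y_{11},y_{12},y_{22}) = y_{11}y_{22}-y_{12}^2 = (x_{11}x_{22}-x_{12}x_{21})^2 = p(x_{ij})^2,$$
so the identity itself checks out; my confusion is only about the step from this identity to the existence of $q$.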
I came across this question in the book by Shafarevich, where he unfortunately just comments that it is "easy to see":