12

Edit: According to the suggestions in comments, this only holds when f(x) is monic.

Let $p$ be a prime number and $f(x) = \sum_{i=0}^{p} a_{i}x^{i}$ be a monic polynomial with $a_{0}, a_{1}, \ldots, a_{p} \in\mathbb{Z}$. Let $\alpha_{1}, \alpha_{2}, \ldots, \alpha_{p}$ be its roots.

I am trying to show that the polynomial $$g(x) = \frac{x^{p} - \alpha_{1}^{p}}{x - \alpha_{1}} \cdot \frac{x^{p} - \alpha_{2}^{p}}{x - \alpha_{2}} \cdots \frac{x^{p} - \alpha_{p}^{p}}{x - \alpha_{p}}$$ has integer coefficients.

My attempt: I managed to show that the polynomial $$h(x) = (x^{p} - \alpha_{1}^{p})\cdot (x^{p} - \alpha_{2}^{p})\cdots (x^{p} - \alpha_{p}^{p})$$ is a polynomial with integer coefficients. This is because the coefficients of $f(x)$ are, up to sign, the values of the elementary symmetric polynomials in $p$ variables evaluated at the roots $\alpha_{1}, \alpha_{2}, \ldots, \alpha_{p}$, while the coefficients of $h(x)$ are the values of the same elementary symmetric polynomials evaluated at $\alpha_{1}^{p}, \ldots, \alpha_{p}^{p}$. Then I used the Fundamental Theorem of Symmetric Polynomials.
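To convince myself of this step, here is a quick numerical check (a Python sketch; the cubic $f(x) = x^{3} - 3x - 1$, whose roots are $2\cos(k\pi/9)$ for $k = 1, 7, 13$, is just a convenient example of a monic integer polynomial with irrational roots):

```python
import math

def polymul(a, b):
    """Multiply polynomials given as coefficient lists, lowest degree first."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

# Example: p = 3, f(x) = x^3 - 3x - 1, a standard trigonometric cubic
# whose roots are 2*cos(k*pi/9) for k = 1, 7, 13.
p = 3
roots = [2 * math.cos(k * math.pi / 9) for k in (1, 7, 13)]

# h(x) = prod_i (x^p - alpha_i^p), built factor by factor
h = [1.0]
for a in roots:
    h = polymul(h, [-a**p] + [0.0] * (p - 1) + [1.0])  # x^p - a^p

print([round(c, 6) for c in h])  # every coefficient is (numerically) an integer
```

The rounding succeeds because the exact coefficients are integers by the symmetric-function argument; the floating-point roots only introduce tiny errors.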

My strategy was to show that when I divide $h(x)$ by $(x-\alpha_{1})\cdots (x-\alpha_{p})$, I get a polynomial with integer coefficients as the quotient. I thought of doing this partially because it might help me find the coefficients of $g(x)$ too. But now, I am struggling to show that.

Is there any other strategy for this problem (preferably one that can help me isolate the coefficients of $g(x)$)?

Maulik
  • 396
  • 3
    You need $f(x)$ to be monic (or satisfy other constraints). Otherwise, it fails at $p$ as low as $2$ (counterexample: $f(x) = 2x^2+3x+2$) – achille hui Sep 20 '21 at 22:19
  • ...and if $a_p=1$ the conclusion holds true regardless of $p$ being a prime or not. – dxiv Sep 20 '21 at 23:17

3 Answers

9

For $n\ge3,$ let $f(x)\in \mathbb Z[x]$ be monic of degree $n.$ The roots of $f(x)$ are $\alpha_1, \alpha_2, \ldots, \alpha_n.$ Let $$g(x) = \frac{x^n - \alpha_1^n}{x - \alpha_1} \cdots \frac{x^n - \alpha_n^n}{x - \alpha_n}$$

$$g(x)=\prod_{k=1}^n (x^{n-1}+\alpha_kx^{n-2}+\alpha_k^2x^{n-3}+\cdots +\alpha_k^{n-1} )\tag1$$ By $(1)$, $g(x)$ is unchanged whenever the roots are permuted. Thus, each coefficient of $g(x)$ is a symmetric polynomial in the roots of $f(x)$ with coefficients in $\mathbb Z$. By the fundamental theorem of symmetric polynomials, each coefficient of $g(x)$ is therefore a polynomial with integer coefficients in the elementary symmetric functions of the roots of $f(x)$. Since the latter are integers (being, up to sign, the coefficients of $f$), $g(x)\in\mathbb Z[x].$
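As a quick numerical illustration of $(1)$ (a Python sketch; the cubic $x^3 - 3x - 1$ with roots $2\cos(k\pi/9)$, $k = 1, 7, 13$, is just an example), expanding the product factor by factor gives coefficients that round to integers:

```python
import math

def polymul(a, b):
    """Multiply polynomials given as coefficient lists, lowest degree first."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

# Example: n = 3, f(x) = x^3 - 3x - 1, with roots 2*cos(k*pi/9), k = 1, 7, 13.
n = 3
roots = [2 * math.cos(k * math.pi / 9) for k in (1, 7, 13)]

# g(x) = prod_k (x^{n-1} + a_k x^{n-2} + ... + a_k^{n-1}), as in (1)
g = [1.0]
for a in roots:
    g = polymul(g, [a ** (n - 1 - j) for j in range(n)])

print([round(c) for c in g])  # lowest degree first: [1, -3, 9, -2, 3, 0, 1]
```

So for this example $g(x) = x^6 + 3x^4 - 2x^3 + 9x^2 - 3x + 1 \in \mathbb Z[x]$, as the argument predicts.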

David
  • 91
  • 1
6

If $\,a_p \ne 1$ the proposition does not hold true in general, as noted in a comment. Otherwise if $\,a_p=1\,$, let $\,\displaystyle P_n(z) = \prod_{k=1}^p(z- \alpha_k^n)\,$, and $\,\displaystyle q(x)=\frac{P_n(x^n)}{P_1(x)}\,$.

$T_n(x)=P_n(x^n)$ is symmetric in the $\,\alpha_k\,$, so its coefficients are symmetric polynomials in the roots, hence (by the fundamental theorem of symmetric polynomials) polynomials with integer coefficients in the $\,a_i\,$, and therefore integers themselves. Moreover, $\,T_n(\alpha_i)=P_n(\alpha_i^n)=0\,$, so $\,P_1(x) \mid T_n(x)\,$, and $\,\displaystyle q(x)=\frac{T_n(x)}{P_1(x)}\,$ has integer coefficients as the quotient of two monic polynomials with integer coefficients.
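This division step can be checked numerically (a Python sketch under the example $P_1(x) = x^3 - 3x - 1$ with $n = 3$; the helper names are ad hoc): compute $T_n(x)$ from floating-point roots, round it to the integer polynomial it provably is, then divide exactly by the monic $P_1(x)$ and observe a zero remainder.

```python
import math

def polymul(a, b):
    """Multiply polynomials given as coefficient lists, lowest degree first."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def monic_divmod(num, den):
    """Long division of coefficient lists (lowest degree first); den is monic,
    so an integer dividend yields an integer quotient and remainder."""
    num = list(num)
    q = [0] * (len(num) - len(den) + 1)
    for i in reversed(range(len(q))):
        q[i] = num[i + len(den) - 1]
        for j, d in enumerate(den):
            num[i + j] -= q[i] * d
    return q, num[:len(den) - 1]

# Example: n = 3, P_1(x) = x^3 - 3x - 1, roots 2*cos(k*pi/9), k = 1, 7, 13.
n = 3
roots = [2 * math.cos(k * math.pi / 9) for k in (1, 7, 13)]
P1 = [-1, -3, 0, 1]

# T_n(x) = P_n(x^n) = prod_k (x^n - alpha_k^n); integral by symmetry, so round.
T = [1.0]
for a in roots:
    T = polymul(T, [-a**n] + [0.0] * (n - 1) + [1.0])
T = [round(c) for c in T]

q, rem = monic_divmod(T, P1)
print(q, rem)  # the remainder is all zeros, so P_1(x) | T_n(x) in Z[x]
```

The quotient `q` is exactly the integer polynomial $q(x)$ of the answer, obtained without ever leaving integer arithmetic once $T_n$ is rounded.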

$P_n(z)\,$ can be explicitly determined either by expanding the product and expressing the coefficients in terms of the elementary symmetric polynomials in $\,\alpha_k\,$, or by using polynomial resultants $\,P_n(z) = \text{res}\left(P_1(x), z-x^n, x\right)\,$.

The proposition in OP's question follows for $\,n=p\,$, and it holds true whether $\,p\,$ is a prime or not.

dxiv
  • 76,497
4

Under the assumption that $f(x)$ is a monic polynomial, here's a sketch of how one could go about determining the coefficients of this expansion. It should be evident that the expression whose expansion is requested can be written as a polynomial of degree $D=p^2-p$. We write

$$g(x;\vec{\alpha})=P(x;\alpha_1)...P(x;\alpha_p)=\sum_{n=0}^D\Gamma_n(\vec{\alpha})x^n$$

where $P(x;r)=\sum_{n=0}^{p-1}r^{p-1-n}x^n$

It is easy to see by the definition of $g$ that it is invariant under any permutation $\sigma$ of the roots $\{\alpha_1,..., \alpha_p\}$. We readily conclude that the coefficients have the same symmetry:

$$\Gamma_n(\sigma\vec{\alpha})=\Gamma_n(\vec{\alpha})$$

Also the coefficients can be shown to obey the very important scaling relation

$$\Gamma_n(\mu \vec{\alpha})=\mu^{D-n}\Gamma_n(\vec{\alpha})$$

Now, also by the definition of $g$ and simple power counting, one should be able to show that the coefficients are polynomials in $\alpha_1,...,\alpha_p$ (only nonnegative powers of the variables appear). This, along with the scaling relation and the symmetry requirement, suffices to show that the coefficients must be homogeneous symmetric polynomials of total degree $D-n$.

With this proven, one can now invoke the fundamental theorem of multivariable symmetric polynomials, which asserts that every symmetric polynomial in many variables, no matter its form, can be expressed as a polynomial (with integer coefficients) in the elementary symmetric polynomials. The elementary symmetric polynomials of $\alpha_1,...,\alpha_p$ are integers by the assumption that the coefficients of the polynomial with roots $\alpha_1,...,\alpha_p$ are integers. Therefore, we conclude that $\Gamma_n \in \mathbb{Z}$.

It takes quite a bit more work to find explicit expressions for the coefficients defined above. Define a basis of symmetric polynomials of total degree $N$ as follows

$$s_{N}^{q_1,..., q_p}(\alpha_1, \alpha_2,... \alpha_p)=\sum_{\text{sym}}\alpha_1^{q_1}... \alpha_p^{q_p}~~,~~ \sum_{i=1}^p q_i=N$$

I conjecture that the coefficients can be expressed as an equal weight linear combination of the polynomials in the basis defined above as follows:

$$\Gamma_n(\vec{\alpha})=\sum_{r\in R} s^{r}_{D-n}(\vec{\alpha})$$

where $r$ is a $p$-index in the set defined by

$$R=\{(r_1,..., r_p):\sum_{i=1}^p{r_i}=D-n,0\leq r_i\leq p-1 \}$$
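For readers without Mathematica, the conjecture can be spot-checked in Python (a naive sketch: $s^r$ is implemented as a sum over distinct rearrangements of the exponent tuple, and the example cubic $x^3 - 3x - 1$ stands in for $f$):

```python
import itertools
import math

def polymul(a, b):
    """Multiply polynomials given as coefficient lists, lowest degree first."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def monomial_sym(r, xs):
    """s^r(xs): sum of x_1^{q_1}...x_p^{q_p} over distinct rearrangements q of r."""
    return sum(math.prod(x ** e for x, e in zip(xs, q))
               for q in set(itertools.permutations(r)))

# Example: p = 3, f(x) = x^3 - 3x - 1, roots 2*cos(k*pi/9), k = 1, 7, 13.
p = 3
D = p * p - p
roots = [2 * math.cos(k * math.pi / 9) for k in (1, 7, 13)]

# Directly expand g(x) = prod_k (x^{p-1} + a_k x^{p-2} + ... + a_k^{p-1}).
g = [1.0]
for a in roots:
    g = polymul(g, [a ** (p - 1 - j) for j in range(p)])

# Conjecture: Gamma_n = sum of s^r over p-indices r with sum D-n, 0 <= r_i <= p-1.
for n in range(D + 1):
    R = [r for r in itertools.combinations_with_replacement(range(p), p)
         if sum(r) == D - n]
    conjectured = sum(monomial_sym(r, roots) for r in R)
    assert abs(conjectured - g[n]) < 1e-6
print("conjecture checked for p = 3")
```

Here `combinations_with_replacement(range(p), p)` enumerates the exponent multisets once each, matching the intent that each distinct monomial appears with unit weight.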

These sets of $p$-indices can be generated in Mathematica by invoking the function

r[p_, n_] := Select[IntegerPartitions[p^2 - n, {p}] - 1, Max[#] <= p - 1 &]

Interestingly enough, this symmetric polynomial basis is also good enough to solve the more general problem with

$$P(x;r)=\sum_{n=0}^{p-1}c_{p-1-n} x^n r^{p-1-n}$$

which admits a solution that is a simple generalization of the previous one

$$\Gamma_n(\vec{\alpha})=\sum_{r\in R} c_{r_1}...c_{r_p}s^{r_1,...,r_p}_{D-n}(\vec{\alpha})$$

Note that the coefficients will still be integral if the $c$'s are integral.

DinosaurEgg
  • 10,775
  • 1
    I absolutely adore this answer.. This is so beautiful.. I had figured out the second part of what you say, at least a little bit of it -- albeit, I like your notation too. It does not seem that anything stops me from making same argument for when the degree of $f(x)$ is not prime, right? Also, I am sorry to everyone that I did not say that $f(x)$ is monic.. I see clearly that monic-ness of $f(x)$ is a requirement, which I accept. – Maulik Sep 21 '21 at 02:07
  • I appreciate the compliment very much, thank you :) As mentioned in other answers, there is no restriction on the degree of the polynomial. This result is purely powered by the theory of symmetric polynomials, it seems. – DinosaurEgg Sep 21 '21 at 02:33
  • Also, I almost forgot to mention it in last comment -- the general formula for coefficients that I had worked out for when degree of $f(x)$ is prime... sometimes the summands in the general formula you have presented comes with a negative sign and sometimes with positive sign.. I haven't guessed a pattern as to when we get $+$ (and $-$ ) signs.. so the sum is a sum but upto a $\pm$ sign on each summand... It seems to be dependent on degree $p$ of $f(x)$ and then on components $r_{1}, r_{2}, \ldots, r_{p}$ of the summands, in some weird way. – Maulik Sep 21 '21 at 02:53
  • I surely cannot see any alternating signs in the formulas I present above, so I would appreciate a clarification. – DinosaurEgg Sep 21 '21 at 02:58
  • The only way alternating signs can appear is when one tries to express the symmetric polynomials defined above in terms of elementary symmetric polynomials. For example, the first couple terms for $D>3$ are $\Gamma_D(\vec{a})=1~,~ \Gamma_{D-1}(\vec{a})=e_1(\vec{a}) ~,~ \Gamma_{D-2}(\vec{a})=e_1^2(\vec{a})-e_2(\vec{a}) ...$ – DinosaurEgg Sep 21 '21 at 03:13
  • Let us see the case when $p = 3$. Say that $f(x) = x^{3} + \sum_{i=0}^{2} a_{i}x^{i}$ with $a_{i} \in \mathbb{Z}$. We will have that the coefficient of $x^{2}$ in $g(x) = f(xw) \times f(xw^{2})$ is given by $a_{0}a_{2}(w + w^{2}) + a_{1}^{2} w^{3} = a_{1}^{2} - a_{0}a_{2}$. I saw this as creating coefficient $a_{0}a_{2}$ corresponding to $0+2 = 2$ and $a_{1}^{2}$ corresponding to $1+1 = 2$. Now that I look at your formula, maybe I didn't understand it well enough. I would still appreciate it if you explain your formula for $p=3$ and the coefficient of $x^{2}$. If you decide not to, I understand.. – Maulik Sep 21 '21 at 03:24
  • My expressions are all in terms of symmetric polynomials of the roots, not the coefficients of your original polynomial $f(x)$. Apologies if the notation confused, but writing $\alpha$ instead of $a$ all the time has been confusing to my stupid brain :) – DinosaurEgg Sep 21 '21 at 03:29
  • ahhh.. I see.. actually, I should have guessed that .. I am so sorry.. your formula makes perfect sense now (modulo some other misunderstandings on my part.) – Maulik Sep 21 '21 at 03:34