I'm given a problem to factorize $$ P(x)=x^5+x+1 $$
I've done the following:
$$ P(x)=(x^5+x^4+x^3)-(x^4+x^3+x^2)+(x^2+x+1)= (x^2+x+1)(x^3-x^2+1)$$
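Multiplying the factors back out does recover the original polynomial:
$$ (x^2+x+1)(x^3-x^2+1) = x^5 - x^4 + x^2 + x^4 - x^3 + x + x^3 - x^2 + 1 = x^5 + x + 1. $$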
Is it possible to prove that this cannot be factorized any further?
It depends on what you are factoring over. Notice that if we use the quadratic formula, $x^2 + x + 1$ has roots $-\frac{1}{2} \pm \frac{i\sqrt{3}}{2}$, so over the complex numbers it can be factorized as
$$(x^2 + x + 1) = (x - (-\frac{1}{2} + \frac{i\sqrt{3}}{2}))(x - (-\frac{1}{2} - \frac{i\sqrt{3}}{2}))$$
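Those roots come straight from the quadratic formula with $a = b = c = 1$:
$$ x = \frac{-1 \pm \sqrt{1^2 - 4\cdot 1\cdot 1}}{2} = \frac{-1 \pm \sqrt{-3}}{2} = -\frac{1}{2} \pm \frac{i\sqrt{3}}{2}. $$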
If you are factoring over just the real numbers, the $x^2+x+1$ factor cannot be reduced further: its discriminant is $1 - 4 = -3 < 0$, so it has no real roots.
The $x^3-x^2+1$ factor does split further over the reals, since it has one real root (and a pair of complex-conjugate roots), but it is irreducible over the rationals: a cubic that factors over $\mathbb{Q}$ must have a rational root, and this one has none.
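Concretely, the rational root theorem says any rational root of $x^3 - x^2 + 1$ would have to be $\pm 1$, and
$$ 1^3 - 1^2 + 1 = 1 \neq 0, \qquad (-1)^3 - (-1)^2 + 1 = -1 \neq 0, $$
so there is no rational root, hence no linear factor over $\mathbb{Q}$, and therefore no factorization of the cubic over $\mathbb{Q}$ at all.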
So whether a polynomial can be factored further depends on the field over which you are factoring.
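If you just want to double-check the factorization and irreducibility over $\mathbb{Q}$ by machine (not a proof for other fields), a minimal SymPy sketch along these lines works; the name `p` is just illustrative:

```python
from sympy import symbols, factor, Poly

x = symbols('x')
p = x**5 + x + 1

# factor() factors over the rationals by default
print(factor(p))  # (x**2 + x + 1)*(x**3 - x**2 + 1)

# both factors are irreducible over Q
print(Poly(x**2 + x + 1, x).is_irreducible)     # True
print(Poly(x**3 - x**2 + 1, x).is_irreducible)  # True
```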