
How can I prove that $\frac{1+\sqrt{3}}{2}$ is not an algebraic integer, i.e. not integral over $\mathbb{Z}$?

I understand an algebraic integer in a commutative ring $R$ to be any element $r\in R$ which satisfies an equation $P(r)=0$, where $P$ is a nontrivial polynomial whose coefficients are multiples of $1_{R}$ and whose leading coefficient is $1_{R}$.

4 Answers


HINT. I assume you meant algebraic integer. Find the minimal polynomial for $\dfrac{1+\sqrt{3}}{2}$. If you are having trouble with this, follow the idea from this answer. Now examine that minimal polynomial. Why does this being the minimal polynomial imply that $\dfrac{1+\sqrt{3}}{2}$ is not an (algebraic) integer?
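As a concrete check on the hint, the minimal polynomial can be computed symbolically; a minimal sketch in Python, assuming `sympy` is available:

```python
# Sketch (assuming sympy): compute the minimal polynomial of
# (1 + sqrt(3))/2 over Q and inspect its leading coefficient.
from sympy import Poly, Symbol, minimal_polynomial, sqrt

x = Symbol('x')
alpha = (1 + sqrt(3)) / 2

# sympy returns a primitive integer-coefficient polynomial;
# it is monic exactly when alpha is an algebraic integer.
p = minimal_polynomial(alpha, x)
print(p)                 # 2*x**2 - 2*x - 1
print(Poly(p, x).LC())   # leading coefficient 2, not 1
```

Since the primitive minimal polynomial has leading coefficient $2$ rather than $1$, no monic integer polynomial of the same degree can vanish at $\frac{1+\sqrt{3}}{2}$.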

If you meant an actual integer, then this is much easier. The product of two integers is an integer, and so is their sum or difference. How can you use these facts to show that $\dfrac{1+\sqrt{3}}{2}$ is an integer if and only if $\sqrt{3}$ is an integer? Now show that $\sqrt{3}$ is not even rational (let alone an integer)!

  • Though it is easy to show $\sqrt{3}$ is not rational, it is even easier to show it is not an integer: $1^2<3<2^2$, therefore $1<\sqrt{3}<2$. – 79037662 Oct 23 '19 at 18:41

If $\alpha = \dfrac{1+\sqrt{3}}{2}$ is an algebraic integer then so too is $\alpha' = 1-\alpha = \dfrac{1-\sqrt{3}}{2}$, hence so too is $\alpha\alpha' = \dfrac{1-3}{4} = -1/2$, since the algebraic integers form a ring. But the only rational algebraic integers are the ordinary integers, so $-1/2$ gives a contradiction.

Remark $ $ More conceptually let's recall one motivation for the definition of algebraic integers. Suppose that we desire to consider as "integers" some subring $\:\mathbb I\:$ of the field of all algebraic numbers. To be a purely algebraic notion, it cannot distinguish between conjugate roots, so if $\rm\:\alpha,\alpha'$ are roots of the same polynomial irreducible over $\rm\:\mathbb Q,\:$ then $\rm\:\alpha\in\mathbb I\iff \alpha'\in\mathbb I.\:$ Also we desire $\rm\:\mathbb I\cap \mathbb Q = \mathbb Z\ $ so that our notion of algebraic integer is a faithful extension of the notion of a rational integer.

Now suppose that $\rm\:f(x)\:$ is the monic minimal polynomial over $\rm\:\mathbb Q\:$ of an algebraic "integer" $\rm\:\alpha\in \mathbb I.\:$ Then $\rm\:f(x) = (x-\alpha)\:(x-\alpha')\:(x-\alpha'')\:\cdots\:$ has coefficients in $\rm\:\mathbb I\cap \mathbb Q = \mathbb Z.\:$ Therefore the monic minimal polynomial of elements $\in\mathbb I\:$ must have coefficients $\in\mathbb Z.$

In particular a quadratic irrational $\,\alpha\in\Bbb I\iff (x\!-\!\alpha)(x\!-\!\alpha') = x^2\!-(\alpha\!+\!\alpha') x + \alpha\alpha'\in\Bbb Z[x],\,$ i.e. iff $\alpha$ has trace and norm $\in \Bbb Z,\,$ which fails in the OP since $\,\alpha\alpha' = -1/2$.

Conversely, one easily shows that the set of all such algebraic numbers contains $1$ and is closed under both difference and multiplication, so it forms a ring.

Hence a few natural hypotheses on the notion of an algebraic integer imply the standard criterion in terms of minimal polynomials.
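The trace and norm computation in the remark can be verified symbolically; a small sketch in Python, assuming `sympy` is available:

```python
# Sketch (assuming sympy): alpha = (1 + sqrt(3))/2 has integer trace
# but non-integer norm, so it fails the quadratic criterion above.
from sympy import Rational, expand, sqrt

alpha = (1 + sqrt(3)) / 2
alpha_conj = (1 - sqrt(3)) / 2       # the conjugate root, equal to 1 - alpha

trace = expand(alpha + alpha_conj)   # alpha + alpha' = 1, an integer
norm = expand(alpha * alpha_conj)    # alpha * alpha' = -1/2, not an integer

print(trace, norm)                   # 1 -1/2
```

The trace is in $\mathbb{Z}$ but the norm is not, which is exactly the failure identified above.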

Bill Dubuque

Hint:

Find a polynomial equation satisfied by this number, and apply the rational roots theorem.

Bernard

I would be inclined to prove it is not rational (and so cannot be an integer) using the "rational root theorem". The rational root theorem says that if $\frac{m}{n}$, in lowest terms, is a root of the polynomial equation $a_nx^n+ a_{n-1}x^{n-1}+ \cdots+ a_1x+ a_0= 0$, with all coefficients integers, then the denominator $n$ must divide the leading coefficient $a_n$, and the numerator $m$ must divide the constant term $a_0$.

If $x= \frac{1+ \sqrt{3}}{2}$ then it satisfies the equation $\left(x-\frac{1+ \sqrt{3}}{2}\right)\left(x- \frac{1- \sqrt{3}}{2}\right)=$$ \left(x-\frac{1}{2}- \frac{\sqrt{3}}{2}\right)\left(x-\frac{1}{2}+ \frac{\sqrt{3}}{2}\right)$$= \left(x- \frac{1}{2}\right)^2- \frac{3}{4}= x^2- x- \frac{1}{2}= 0$. Multiplying by 2, this is equivalent to $2x^2- 2x- 1= 0$.

So any rational root of that equation must be of the form $\frac{m}{n}$ where $m$ divides $-1$ (so can only be $1$ or $-1$) and $n$ divides $2$ (so can only be $1$, $-1$, $2$, or $-2$). That is, the only possible rational roots are $1$, $-1$, $\frac{1}{2}$, or $-\frac{1}{2}$. Since $\frac{1+ \sqrt{3}}{2}$ is not equal to any of these, it is not rational, and in particular not an integer.
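The candidate check above can be carried out with exact rational arithmetic; a minimal sketch in Python using only the standard library:

```python
# Sketch: test every rational-root-theorem candidate for 2x^2 - 2x - 1
# exactly, and confirm none is a root, so (1 + sqrt(3))/2 is irrational.
from fractions import Fraction

def f(x):
    return 2 * x * x - 2 * x - 1

# numerator m divides -1, denominator n divides 2 (taking n > 0 suffices)
candidates = [Fraction(m, n) for m in (1, -1) for n in (1, 2)]
rational_roots = [c for c in candidates if f(c) == 0]
print(rational_roots)                        # [] -- no rational roots at all

# (1 + sqrt(3))/2 really is a root, to floating-point accuracy
print(abs(f((1 + 3 ** 0.5) / 2)) < 1e-12)    # True
```

Since the polynomial has no rational roots whatsoever, its root $\frac{1+\sqrt{3}}{2}$ must be irrational.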

user247327