7

As a follow-up to this question, I noticed that the proof used the fact that $p$ and $q$ were "even". Clearly, when replacing factors of $2$ with factors of $3$ everything does not simply come down to being "even" or "odd", so how could I go about proving that $\sqrt{3}$ is irrational?

  • The square of a multiple of $3$ is a multiple of $3$. The square of something that is either $1$ more or $1$ less than a multiple of $3$ is $1$ more than a multiple of $3$. ${}\qquad{}$ – Michael Hardy Aug 10 '15 at 00:15
  • see http://math.stackexchange.com/questions/1310014/what-is-the-most-rigorous-proof-of-the-irrationality-of-the-square-root-of-3/1310234 – Mark Joshi Aug 10 '15 at 00:16

6 Answers

8

It's very simple, actually. Assume that $\sqrt{3}$ = $\frac{p}{q}$, with $p,q$ coprime integers.

Then, $p = \sqrt{3}\,q$ and $p^2 = 3q^2$. Since $3\mid p^2$ and $3$ is prime, $3\mid p$, so in fact $9\mid p^2$. But $p^2 = 3q^2$, so $9 \mid 3q^2$, which gives $3\mid q^2$ and hence $3\mid q$. Since $3$ divides both $p$ and $q$, the two numbers are not coprime. This is a contradiction, since we assumed that they $\textbf{were}$ coprime. Therefore, $\sqrt{3}$ cannot be written as a ratio of coprime integers and must be irrational.


$\textbf{NOTE:}$ The word "even" in the original proof was just a substitution for "divisible by $2$". The same idea of divisibility was used in this proof to show that $p$ and $q$ are divisible by $3$. It really is the same argument; there just isn't a nice concise word like "even" for a multiple of $3$.
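As a quick sanity check (not a substitute for the proof), a few lines of Python confirm the key fact used above: a square is congruent to $0$ or $1$ modulo $3$, so $3 \mid p^2$ forces $3 \mid p$.

```python
# Squares modulo 3 only ever take the values 0 and 1.
residues = {n * n % 3 for n in range(1000)}
print(residues)  # {0, 1}

# Hence 3 | p^2 implies 3 | p (checked here for p < 1000).
for p in range(1, 1000):
    if p * p % 3 == 0:
        assert p % 3 == 0
```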

  • But it seems like this argument only works for prime numbers, really. Like if we had $\sqrt 4$ we can't say $4|p^2\implies 4|p$. so do we just prove that the square root of every prime is irrational and that the square root of any other integer is the product of roots of primes? – pancini Aug 10 '15 at 00:20
  • 3
@ElliotG The proof can be modified to show that $\sqrt{n}$ is irrational unless $n=k^2$ is a perfect square. If $n$ is not a perfect square, then some prime $r$ appears in its factorization to an odd power $m$. So if $\sqrt{n} = \frac{p}{q}$ with $(p,q)=1$, then $q^2n = p^2$: the right-hand side is divisible by an even power of $r$, while the left-hand side is divisible by an odd power of $r$, a contradiction. – Winther Aug 10 '15 at 00:30
  • This argument works for all numbers that are not perfect squares. For example, $6$, which definitely is not prime could still work. That is why the proof works: it shows irrationality for the square roots of all numbers that are not perfect squares themselves. –  Aug 10 '15 at 00:31
  • wow that argument is almost simpler than the specific case $n=2$. also not sure why i momentarily thought $\sqrt 4$ was irrational. thanks. – pancini Aug 10 '15 at 00:39
  • @ElliotG hahaha happens to the best of us :P –  Aug 10 '15 at 00:40
  • 1
    How about 'threeven'? =P – Vandermonde Aug 10 '15 at 00:57
  • @Vandermonde ahaha yes. "Let T be the set of threeven integers..." –  Aug 10 '15 at 01:58
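The criterion discussed in the comments — $\sqrt{n}$ is rational exactly when $n$ is a perfect square — can be spot-checked by brute force. The search bounds below are an assumption that happens to suffice for $n \le 50$, since any rational $p/q$ with $(p/q)^2 = n \le 50$ satisfies $p < 8q$.

```python
import math

def is_perfect_square(n):
    r = math.isqrt(n)
    return r * r == n

# For each n, search for integers p, q with p^2 = n * q^2,
# i.e. sqrt(n) = p/q. Such a pair exists iff n is a perfect square.
for n in range(1, 51):
    found = any(p * p == n * q * q
                for q in range(1, 50)
                for p in range(1, 8 * q))
    assert found == is_perfect_square(n)
```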
5

Alternatively, a contradiction can be derived as follows. Suppose $\sqrt3 = \frac ab$ with $\gcd(a,b)=1$. Then $$\begin{align} a^2&=3b^2 \\ a^2+b^2 &= 4b^2=(2b)^2 \\ \end{align}$$so $(a,b,2b)$ is a primitive right triangle with an even hypotenuse $2b$. This is a contradiction, because the integer hypotenuse of a primitive right triangle is always odd.
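The fact this answer relies on — the hypotenuse of a primitive right triangle is odd — can be verified numerically for small hypotenuses; this sketch is only a spot check, not a proof.

```python
from math import gcd, isqrt

# Collect hypotenuses of primitive Pythagorean triples (a, b, c)
# with c < 200, where primitivity means gcd(a, b) = 1.
hypotenuses = []
for c in range(2, 200):
    for a in range(1, c):
        b2 = c * c - a * a
        b = isqrt(b2)
        if a <= b and b * b == b2 and gcd(a, b) == 1:
            hypotenuses.append(c)

# Every primitive hypotenuse found is odd.
assert all(c % 2 == 1 for c in hypotenuses)
```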

4

Here is a proof that if $n$ is a positive integer that is not a square of an integer, then $\sqrt{n}$ is irrational. This proof does not use any divisibility properties.

Let $k$ be the integer with $k^2 < n < (k+1)^2$. Suppose $\sqrt{n}$ is rational. Then there is a smallest positive integer $q$ such that $\sqrt{n} = p/q$ for some positive integer $p$.

Then $\sqrt{n} = \sqrt{n}\frac{\sqrt{n}-k}{\sqrt{n}-k} = \frac{n-k\sqrt{n}}{\sqrt{n}-k} = \frac{n-kp/q}{p/q-k} = \frac{nq-kp}{p-kq} $.

Since $k < \sqrt{n} < k+1$, we have $k < p/q < k+1$, i.e. $kq < p < (k+1)q$, so $0 < p-kq < q$. We have thus found a representation of $\sqrt{n}$ as a ratio of integers with a smaller positive denominator, which contradicts the minimality of $q$.

Note: This is certainly not original - but I had fun working it out based on the proof I know that $\sqrt{2}$ is irrational.
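The descent step in this answer can be checked numerically. The sketch below (my own illustration, not part of the answer) verifies that whenever $kq < p < (k+1)q$, the new denominator $p - kq$ is a strictly smaller positive integer.

```python
from math import isqrt

def descent(n, p, q):
    """One step of the descent: from sqrt(n) = p/q produce the
    fraction (n*q - k*p) / (p - k*q) with k = floor(sqrt(n))."""
    k = isqrt(n)
    return n * q - k * p, p - k * q

# For non-squares n, any candidate p/q for sqrt(n) has kq < p < (k+1)q,
# and then the new denominator lies strictly between 0 and q.
for n in (2, 3, 5, 7):
    k = isqrt(n)
    for q in range(1, 100):
        for p in range(k * q + 1, (k + 1) * q):
            _, new_q = descent(n, p, q)
            assert 0 < new_q < q
```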

marty cohen
  • 107,799
2

Here's another way: suppose $\sqrt3 = \frac pq$ for some coprime $p,q$, so $$3q^2=p^2.$$ Now reduce this modulo $4$, noting that the quadratic residues modulo $4$ are $0$ and $1$. The only solution is $p^2 \equiv q^2 \equiv 0 \pmod 4$, but then $p$ and $q$ are both even, a contradiction.
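The residue computation here is easy to check mechanically; this small sketch (not part of the original answer) enumerates all residue pairs modulo $4$ satisfying $3q^2 \equiv p^2 \pmod 4$.

```python
# Squares modulo 4 only take the values 0 and 1.
assert {x * x % 4 for x in range(100)} == {0, 1}

# Enumerate the parities of all (p, q) mod 4 with 3*q^2 == p^2 (mod 4):
# only the pair where p and q are both even survives.
solutions = {(p % 2, q % 2)
             for p in range(4) for q in range(4)
             if (3 * q * q - p * p) % 4 == 0}
assert solutions == {(0, 0)}
```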

Théophile
  • 24,627
0

Proof by contradiction

Assume that $\sqrt{3}$ is rational. Then $\sqrt{3} = \frac{p}{q}$, where $p$ and $q \neq 0$ are integers and the fraction is in lowest terms. That is to say, $\gcd(p,q) = 1$.

Now square both sides to get $3 = \frac{p^2}{q^2} \implies 3q^2 = p^2$

Now clearly $3 \mid p^2$, and since $3$ is prime, $3 \mid p$.

This follows from Euclid's lemma, which states that if $p$ is prime and $p$ divides $ab$, then $p$ divides $a$ or $p$ divides $b$. Here we have $3 \mid p \cdot p = p^2$, and so $3 \mid p$.

Now $p = 3k$ for some integer $k$, so $3q^2 = (3k)^2 = 9k^2$, and hence $q^2 = 3k^2$. Thus $3 \mid q^2$, and since $3$ is prime, $3 \mid q$. So $3$ divides both $p$ and $q$, which contradicts the fact that $\gcd(p,q)=1$. Our assumption that $\sqrt{3}$ is rational is therefore false, and hence $\sqrt{3}$ must be irrational.
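Euclid's lemma, which this answer invokes for the prime $3$, can be spot-checked by brute force; this sketch is a sanity check of the lemma, not a proof.

```python
# Euclid's lemma for the prime 3: whenever 3 divides a*b,
# 3 must divide a or b. Collect any counterexamples found.
violations = [(a, b)
              for a in range(1, 60) for b in range(1, 60)
              if (a * b) % 3 == 0 and a % 3 != 0 and b % 3 != 0]
assert violations == []
```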

alkabary
  • 6,214
0

The general case: if $a,b,c,d$ are positive integers with $\gcd(a,b)=1=\gcd(c,d)$, then $(a/b)^{c/d}$ is irrational unless $a$ and $b$ are both $d$-th powers of positive integers.

PROOF: Suppose $(a/b)^{c/d}=e/f$ where $e,f$ are positive integers and $\gcd(e,f)=1$. Then $a^cf^d=b^ce^d$. We now apply a result (which one book called the "fundamental theorem of arithmetic"): if $x,y,z$ are non-zero integers where $x$ divides $yz$ and $\gcd(x,y)=1$, then $x$ divides $z$. Firstly, this implies that $\gcd(a^c,b^c)=1=\gcd(e^d,f^d)$. Secondly, applying the result with $x=a^c$, $y=b^c$, $z=e^d$, we get that $a^c$ divides $e^d$; applying it with $x=e^d$, $y=f^d$, $z=a^c$, we get that $e^d$ divides $a^c$. So $a^c=e^d$. Now since $a^cf^d=b^ce^d$ and $a^c=e^d$, we also have $f^d=b^c$.

Finally, another consequence of the above result is that if $x$ is both a $c$-th power and a $d$-th power with $\gcd(c,d)=1$, then $x$ is a $cd$-th power. Applying this with $x=a^c$, and also with $x=b^c$, there are positive integers $u,v$ where $a^c=u^{cd}$ and $b^c=v^{cd}$. So $a=u^d$ and $b=v^d$. QED.
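A concrete instance of this general statement can be checked numerically. With $c=3$, $d=2$ (so $\gcd(c,d)=1$), the theorem says $a^{3/2}$ is rational iff $a$ is a perfect square, equivalently iff $a^3$ is a perfect square; the sketch below verifies that equivalence for small $a$.

```python
from math import isqrt

def is_square(n):
    r = isqrt(n)
    return r * r == n

# Instance of the theorem with c = 3, d = 2:
# a^3 is a perfect square exactly when a itself is.
for a in range(1, 200):
    assert is_square(a ** 3) == is_square(a)
```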