15

Well, "Solving" is the wrong term since I am speaking about irrational numbers. I just don't know which word is the correct word... So that can be part $1$ of my question... what is the correct word since you obviously can't "solve" an irrational number because it goes forever.

Part $2$ (my real question) are there algorithms for figuring out the answer to a problem like the square root of $2$ other than guess-and-checking your way to infinity? Again, I'm obviously not asking for an algorithm to give me the never ending answer because that's crazy... but for example if I wanted to know what the $15^{th}$ decimal place of the square root of $2$ was, is there an algorithm for that?

Thank you! (I'm new here and know nothing about how to format math questions so any help or links would be appreciated as well, thanks!)

5 Answers

20

You can use Newton's method to compute the digits of $\sqrt{2}$:
Let: $$ f(x) = x^2 -2 $$ Define the iteration: $$ x_0 = 1\\ x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} $$ This will converge to $\sqrt{2}$ quadratically.

If you want to compute other square roots:

Consider:
$$g(x) = x^2 - a$$


which has the iteration: $$ x_{n+1}=\frac{1}{2}\left(x_n+\frac{a}{x_n}\right) $$ as noted in the comments.

There's also the continued fraction expansion of an algebraic number; truncating it to a finite continued fraction gives good rational approximations.


As an example: $$ x_0 = 1 \\ x_1 = \frac{1}{2}\left(x_0 + \frac{2}{x_0}\right) =\frac{1}{2}\left( \large \textbf{1} + \frac{2}{ \large \mathbf{1}}\right) = \frac{3}{2}\\ x_2 = \frac{1}{2}\left(x_1 + \frac{2}{x_1}\right) = \frac{1}{2}\left( \large \mathbf{\frac{3}{2}} + \frac{2}{ \large \mathbf{\frac{3}{2}}}\right), \text{ etc. } $$
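The iteration above fits in a few lines of Python. This is a sketch, not the answerer's code; the function name and the stopping tolerance are my own choices:

```python
def newton_sqrt(a, x0=1.0, tol=1e-15):
    """Approximate sqrt(a) by Newton's method on f(x) = x^2 - a."""
    x = x0
    while True:
        # x_{n+1} = x_n - f(x_n)/f'(x_n), which simplifies to (x_n + a/x_n)/2
        x_next = 0.5 * (x + a / x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next

print(newton_sqrt(2))  # ~1.41421356237309..., reached in a handful of steps
```

Because convergence is quadratic, only five or six iterations are needed to exhaust double precision.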

Added

Since we are using Newton's method, you may wonder why it converges to a root of $f(x)$.

Note the following:
$\textbf{Theorem}$: Suppose that the function $f$ has a zero at $\alpha$, i.e., $f(\alpha) = 0$.

If $f$ is continuously differentiable and its derivative is nonzero at $\alpha$, then there exists a neighborhood of $\alpha$ such that for all starting values $x_0$ in that neighborhood, the sequence $\{x_n\}$ converges to $\alpha$.

So if we choose our starting guess appropriately, Newton's method converges to the root of the equation whenever $f$ has these properties.

Rustyn
  • 8,407
  • In the case of the square root of $a$, this simplifies to $x_{n+1}=\frac{1}{2}\left(x_n+\frac{a}{x_n}\right)$. – André Nicolas Jan 31 '13 at 06:23
  • Are you saying 1+(1-(1^2-2)/(2*1))+((1-(1^2-2)/(2*1))-(1^3-2)/(3*1))+(((1-(1^2-2)/(2*1))-(1-(1^3-2)/(3*1)))-(4^2-2)/(4*1))...

    Sorry it's been a LONG time since I've worked with iterations and what-not I just want to make sure I take the sum of each n value in your defined iteration. (I'm not familiar with Newton's Method)

    – Albert Renshaw Jan 31 '13 at 06:34
  • @AlbertRenshaw No, you don't sum them up. – Rustyn Jan 31 '13 at 06:36
  • What do I do with them? Or are you saying when I reach infinity that alone will be the answer? – Albert Renshaw Jan 31 '13 at 06:36
  • @AlbertRenshaw For example, if $x_0=1$, $x_1 = \frac{1}{2}(1+\frac{2}{1}) = \frac{3}{2}$, $x_2 = \frac{1}{2}(\frac{3}{2}+\frac{2}{1.5})$, etc. You see that if you keep doing this process indefinitely, you will obtain $\sqrt{2}$. – Rustyn Jan 31 '13 at 06:36
  • so x_∞=sqrt(2)? – Albert Renshaw Jan 31 '13 at 06:44
  • Can you offer any intuition as to why this happens? Or has no-one logically understood it yet? – Albert Renshaw Jan 31 '13 at 06:45
  • @AlbertRenshaw Yes, you see Newton's method always converges to the root of $f(x) = x^2 - 2$, which happens to be $\sqrt{2}$, for an appropriate starting guess. There is a proof for the convergence of Newton's method out there somewhere; give me a second. – Rustyn Jan 31 '13 at 06:51
  • Very often, the Newton Method is awesomely efficient. Roughly speaking, the number of correct digits doubles with each iteration. Try it for $\sqrt{2}$ on your calculator, starting say at $1.5$. In three iterations, you will get $\sqrt{2}$ to calculator accuracy. – André Nicolas Jan 31 '13 at 08:22
  • 1
    @AlbertRenshaw have a look at the wikipedia page (http://en.wikipedia.org/wiki/Newton's_method). There is an animation that should give you a strong intuition for how it works. – in_mathematica_we_trust Jan 31 '13 at 08:33
  • 1
    I posted a really neat alternative to Newton's method for computing square-roots efficiently at the bottom of this page: http://math.stackexchange.com/questions/287213/fast-square-roots/287453#287453. – Peder Jan 31 '13 at 15:16
  • If we start with the initial value $x_0=-1$ then it will converge to $-\sqrt{2}$. So we may have to use the bisection method for one or two iterations, then switch over to Newton's method. – Kumar Feb 01 '13 at 06:44
  • @AnjaliVijaya See the theorem, we choose an appropriate neighborhood. – Rustyn Feb 01 '13 at 07:00
13

A related problem. Another way to go is the Taylor series. Derive the Taylor series of the function $\sqrt{x}$ at the point $x=1$:

$$ \sqrt{x} = 1+{\frac {1}{2}} \left( x-1 \right) -{\frac {1}{8}} \left( x-1 \right) ^{2}+{\frac {1}{16}} \left( x-1 \right)^{3}-{\frac{5}{128} } \left( x-1 \right)^{4}+O\left( \left( x-1 \right) ^{5} \right). $$

If you plug in $x=2$, you get the approximate value $\sqrt{2}\approx 1.3984375$. Increasing the number of terms in the series improves the approximation.

Added: We can write the Taylor series of $\sqrt{x}$ about a general point $a$ explicitly, using the $n$th derivative of $\sqrt{x}$:

$$ \sqrt{x} = \sum _{n=0}^{\infty }\frac{\sqrt {\pi }}{2}\,{\frac {{a}^{\frac{1}{2}-n} \left( x-a\right)^{n}}{\Gamma\left( \frac{3}{2}-n \right)n! }}.$$

Substituting $a=1$ in the above formula gives the Taylor series at the point $a=1$:

$$\sqrt{x} = \sum _{n=0}^{\infty }\frac{\sqrt {\pi }}{2}\,{\frac { \left( x-1\right)^{n}}{\Gamma\left( \frac{3}{2} - n \right)n! }}.$$

Putting $x=2$ in the above equation, we have:

$$\sqrt{2} = \sum _{n=0}^{\infty }\,{\frac {\sqrt{\pi}}{2\,\Gamma\left( \frac{3}{2} - n \right)n! }}. $$
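As a numerical sanity check, this series can be summed in Python. Evaluating $\Gamma(\tfrac32-n)$ and $n!$ separately overflows for large $n$, so the sketch below (names are mine) builds the identical coefficients $\binom{1/2}{n}$ with a recurrence instead:

```python
def sqrt2_taylor(terms):
    """Partial sum of the Taylor series of sqrt(x) at x=1, evaluated at x=2.

    The coefficient of (x-1)^n is binom(1/2, n), which equals the
    Gamma-function expression sqrt(pi) / (2 * Gamma(3/2 - n) * n!).
    """
    total, coeff = 0.0, 1.0  # coeff = binom(1/2, 0) = 1
    for n in range(terms):
        total += coeff                 # (x-1)^n = 1 when x = 2
        coeff *= (0.5 - n) / (n + 1)   # binom(1/2, n+1) from binom(1/2, n)
    return total

print(sqrt2_taylor(5))     # 1.3984375, the five-term value quoted above
print(sqrt2_taylor(5000))  # slowly approaches 1.41421356...
```

Note that $x=2$ sits on the boundary of the series' disk of convergence, so convergence is slow (the $n$th term decays only like $n^{-3/2}$); this series is a poor algorithm compared to Newton's method, but it does converge.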

11

You can also compute square roots using continued fractions. For example for $\sqrt{2}$ you have $$ \sqrt{2}=1+(\sqrt{2}-1)=1+\frac{(\sqrt{2}-1)(\sqrt{2}+1)}{\sqrt{2}+1}=1+\frac{1}{\sqrt{2}+1} $$ where $1$ is the integer part of $\sqrt{2}$. Then repeat the process for $\sqrt{2}+1$ whose integer part is $2$: $$ \sqrt{2}+1=2+(\sqrt{2}-1)=2+\frac{(\sqrt{2}-1)(\sqrt{2}+1)}{\sqrt{2}+1}=2+\frac{1}{\sqrt{2}+1} $$ therefore by repeating the process we have $$ \sqrt{2}=1+\frac{1}{2+\frac{1}{\sqrt{2}+1}}=1+\frac{1}{2+\frac{1}{2+\frac{1}{2+\frac{1}{2+\cdots}}}} $$
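Truncating this continued fraction at successive depths gives the convergents $1, \frac{3}{2}, \frac{7}{5}, \frac{17}{12}, \frac{41}{29}, \dots$ A small sketch with exact rational arithmetic (the function name and depth parameter are my own):

```python
from fractions import Fraction

def sqrt2_convergent(depth):
    """Evaluate 1 + 1/(2 + 1/(2 + ...)) truncated after `depth` twos."""
    x = Fraction(2)
    for _ in range(depth - 1):
        x = 2 + 1 / x  # build the nested fraction from the bottom up
    return 1 + 1 / x

for d in range(1, 5):
    print(sqrt2_convergent(d))  # 3/2, 7/5, 17/12, 41/29
```

Each convergent $p/q$ is within $1/q^2$ of $\sqrt{2}$, so a modest depth already gives many correct digits.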

Peder
  • 2,182
  • 1
    I like this peder, thank you for elaborating on this for him. I mentioned this in my answer but your motivating example deserves a +1. – Rustyn Jan 31 '13 at 19:43
4

Following Rustyn's answer: it is nice to write down the decimals to see how good the convergence of Newton's method is:

1.000000000000000000000000000000000000000000000000000000000000000000000  
1.500000000000000000000000000000000000000000000000000000000000000000000  
1.416666666666666666666666666666666666666666666666666666666666666666666  
1.414215686274509803921568627450980392156862745098039215686274509803921   
1.414213562374689910626295578890134910116559622115744044584905019200054  
1.414213562373095048801689623502530243614981925776197428498289498623195  
1.414213562373095048801688724209698078569671875377234001561013133113265  
1.414213562373095048801688724209698078569671875376948073176679737990732  
1.414213562373095048801688724209698078569671875376948073176679737990732 
1.414213562373095048801688724209698078569671875376948073176679737990732  
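A table like this can be reproduced with Python's `decimal` module; working at 70 significant digits is an arbitrary choice matching the listing above:

```python
from decimal import Decimal, getcontext

getcontext().prec = 70  # carry 70 significant digits

x = Decimal(1)
for _ in range(10):
    x = (x + 2 / x) / 2  # Newton step for f(x) = x^2 - 2
    print(x)
```

The number of correct digits roughly doubles each step until the working precision is exhausted, after which the printed value stops changing.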
GEdgar
  • 111,679
3

It's best to evaluate $\sqrt{2} = 2\sqrt{1/2}$ with the initial guess $3/4$ for $\sqrt{1/2}$: it yields $38$ exact decimal places in $5$ iterations! $$ x_{n + 1} = \frac{1}{2}\left(x_{n} + \frac{1}{2x_{n}}\right)\quad\mbox{with}\ n \geq 0\,,\quad x_{0} = \frac{3}{4}\ \mbox{and}\ \sqrt{2} = 2\lim_{n \to \infty}x_{n} $$

1.500000000000000000000000000000000000000 -> 2.250000000000000000000000000000000000000

1.416666666666666666666666666666666666667 -> 2.006944444444444444444444444444444444444

1.414215686274509803921568627450980392157 -> 2.000006007304882737408688965782391387928

1.414213562374689910626295578890134910117 -> 2.000000000004510950444942772099280764361

1.414213562373095048801689623502530243615 -> 2.000000000000000000000002543584239585437

1.414213562373095048801688724209698078570 -> 2.000000000000000000000000000000000000000

1.414213562373095048801688724209698078570 -> 2.000000000000000000000000000000000000000

1.414213562373095048801688724209698078570 -> 2.000000000000000000000000000000000000000

1.414213562373095048801688724209698078570 -> 2.000000000000000000000000000000000000000

1.414213562373095048801688724209698078570 -> 2.000000000000000000000000000000000000000
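The claim can be checked with the `decimal` module. This sketch (60-digit precision is my choice) runs the five iterations from $x_0 = 3/4$ and prints $2x_5$:

```python
from decimal import Decimal, getcontext

getcontext().prec = 60  # carry 60 significant digits

x = Decimal(3) / 4  # initial guess for sqrt(1/2)
for _ in range(5):
    x = (x + 1 / (2 * x)) / 2  # Newton step for f(x) = x^2 - 1/2

print(2 * x)  # agrees with sqrt(2) to more than 38 decimal places
```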
Felix Marin
  • 89,464