3

By "iterative solution" I mean specifically the following type of iteration: given a problem whose solution is $x$, first you compute some approximate solution $x_n$, and then make use of $x_n$ to find $x_{n+1}$; and furthermore the sequence $\{x_n\}$ converges to $x$.

If $x_{n+1} = x_{n} + f(n)$ for a known $f$, then clearly $x_{n+1} = x_0 + \sum_{k=0}^{n} f(k)$, and thus the computation of $x_{n+1}$ requires merely a knowledge of $f$ and would not be "iterative" according to my definition.
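
For instance, taking $f(n) = 2^{-(n+1)}$ and $x_0 = 0$ gives $x_{n+1} = \sum_{k=0}^{n} 2^{-(k+1)} = 1 - 2^{-(n+1)}$, which can be evaluated directly for any $n$ without computing any of the earlier terms.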

Consider as an example the equation $x^2 - 2 = 0$. Newton's method applied to this equation is "iterative", but there also exists an infinite series for $\sqrt{2}$.
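
To make the contrast concrete, here is a minimal sketch (Python; the starting point and step/term counts are arbitrary choices of mine) comparing Newton's iteration $x_{n+1} = \tfrac{1}{2}\left(x_n + \tfrac{2}{x_n}\right)$, where each step needs the previous one, with the central binomial series $\sqrt{2} = \sum_{n \ge 0} \binom{2n}{n}/8^n$, whose $n$-th term depends only on $n$:

```python
from math import comb  # Python 3.8+

def newton_sqrt2(steps):
    """Newton's method for x^2 - 2 = 0: each x_{n+1} is built from x_n."""
    x = 1.0
    for _ in range(steps):
        x = 0.5 * (x + 2.0 / x)
    return x

def series_sqrt2(terms):
    """Partial sums of sqrt(2) = sum_{n>=0} C(2n,n)/8^n:
    the n-th term depends only on n, not on any earlier approximation."""
    return sum(comb(2 * n, n) / 8.0 ** n for n in range(terms))

print(newton_sqrt2(5))   # ~1.4142135623730951 after very few steps
print(series_sqrt2(40))  # also converges, but needs far more terms
```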

Question: Is there a problem for which it can be proven rigorously that the only possible solution is "iterative"?

(Apologies for not defining "iterative" more rigorously; any suggestions for a better definition would be appreciated.)

Matt Calhoun
  • 4,404

2 Answers

2

Ultimately, I don't think "iterative" is a well-defined concept. As you note, if we have $x_{n+1}=x_n+f(n)$, then the limit is $x_0+\sum_{i=0}^{\infty}f(i)$, which is certainly a well-defined infinite series, even if computing $f(i)$ would be much easier given $x_i$ than not. And distinguishing between an infinite sum that yields a given value and the limit of its sequence of partial sums is not meaningful, since the former is typically defined to be the latter.

To be sure, one typically defines every irrational number as the limit of some sequence, and it cannot be obtained any other way: the rationals are easy to define, and the irrationals are then exactly the limits of Cauchy sequences of rationals (i.e. the reals are the completion of the rationals). In that sense, every irrational number is the result of an iterative process.

It's also somewhat unclear what it means to be a "solution". We can talk about the algebraic properties of $x=\sqrt{2}$ without knowing where on the number line it falls, because it is an algebraic number: we know that $x^2=2$ and $x^{-1}=\frac{x}2$ and so on, even without knowing that $\sqrt{2}$ lies somewhere in $(1,2)$. Entire branches of mathematics, like Galois theory, work by treating $\sqrt{2}$ formally, much as one extends the reals by adjoining $i$. Even without any idea of where $x=\cos(x)$ is solved, one can say that, for that $x$, it must be that $\sin(x)^2=1-x^2$; here we just consider the answer as a hypothetical number satisfying a given property and discuss what is true of it.

Finally, it is often possible to answer the question "Is $x$ in the interval $[a,b]$?" without invoking limits: if $a^2<2$ and $b^2>2$, then $\sqrt{2}$ is definitely somewhere in between. Unlike an infinite series, where we do not intrinsically know how close a partial sum is to the answer, this gives a tad more certainty about the value of $x$ than any technique involving infinitely many terms.
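
To illustrate that interval test, a minimal sketch (Python; the function name and sample endpoints are mine) certifying $\sqrt{2} \in [a,b]$ with exact rational arithmetic, so no limits or floating-point estimates are involved:

```python
from fractions import Fraction

def sqrt2_in_interval(a, b):
    """Certify that sqrt(2) lies in [a, b] by checking a^2 <= 2 <= b^2.

    Exact rational arithmetic: the answer involves no limits and
    no rounding error, just finitely many multiplications."""
    a, b = Fraction(a), Fraction(b)
    return a * a <= 2 <= b * b

print(sqrt2_in_interval("1.414213", "1.414214"))  # True
print(sqrt2_in_interval(1, "1.4"))                # False: 1.4^2 = 1.96 < 2
```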

Milo Brandt
  • 60,888
  • To clarify what is meant to be "a solution", consider as an example the problem "Compute the solution to the equation $y=\cos(y)$ that is accurate to 6 decimal places of its actual value $x$"; then clearly the word "solution" has its normal meaning, namely: $x$ solves the equation. Suppose you wanted to increase the accuracy of this computation, could you do so without invoking an "iterative" process that depends on knowing the previous values of the sequence? – Matt Calhoun Oct 04 '14 at 17:34
  • As I note in my last paragraph, no iterative method is necessary for that problem (and likely never for such a problem, unless we have some definitely convergent series for which we can prove nothing about speed of convergence); it's equivalent to: Find an integer $a$ such that $x$ is in $[10^{-6}a,10^{-6}(a+1)]$ and using the intermediate value theorem, it is trivial to find such an $a$ without the use of any limits. (i.e. just look until $x-\cos(x)$ changes sign in some interval) – Milo Brandt Oct 04 '14 at 17:37
  • When you say no iterative method is necessary, do you mean a method other than the Bisection method as described here? – Matt Calhoun Oct 04 '14 at 17:42
  • More or less; I mean that, using bisection, you can, in a finite number of steps and subject to certain "sanity" conditions on $f$, always answer the question: "Is there a root $x$ of $f$ in $[a,b]$?" It seems like this is the only meaningful question when it comes to placing an irrational number on the number line, but it requires no infinite methods like fixed point iteration or Newton's method; to some degree, the whole proof is: $f$ is continuous and $\operatorname{sgn}(f(a))=-\operatorname{sgn}(f(b))$, therefore there is a root in between, where we're just using bisection to initially find $a$ and $b$ (a sketch of this follows the thread). – Milo Brandt Oct 04 '14 at 17:50
  • Sorry for the confusion. I have been studying numerical methods, where the problem is to construct a sequence of intervals which converges on the solution of an equation such as $x=\cos(x)$. I consider the limit of this sequence to be "the solution". It's possible to write down such a sequence for $\sqrt{2}$ in a "closed form", whereas apparently the solution of $x=\cos(x)$ has "no closed form solution". The problem of defining closed form solution was raised here, and motivates my question. – Matt Calhoun Oct 04 '14 at 22:00
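
Following up on the bisection idea in the comments above, a minimal sketch (Python; the bracket $[0,1]$ and the tolerance are my choices) that pins the root of $x=\cos(x)$ down to within $10^{-6}$ in finitely many sign checks:

```python
from math import cos

def bisect_cos(lo=0.0, hi=1.0, tol=1e-6):
    """Bisection for f(x) = x - cos(x) on [lo, hi].

    f(0) = -1 < 0 and f(1) = 1 - cos(1) > 0, so by the intermediate
    value theorem a root stays bracketed at every step; the bracket
    halves each iteration, so we always know how close we are."""
    f = lambda x: x - cos(x)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return lo, hi

print(bisect_cos())  # interval of width < 1e-6 around the root ~0.739085
```
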
1

How about the solution of $x=\cos(x)$? It is easy to prove that a fixed point (the root) exists, but the only way I can think of getting to it is by iteration.
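
A minimal sketch of that iteration (Python; the starting value and step count are arbitrary choices of mine): since $|\cos'(x)| = |\sin(x)| < 1$ near the root, repeatedly applying $\cos$ contracts toward the fixed point.

```python
from math import cos

# Fixed-point iteration x_{n+1} = cos(x_n); the fixed point is
# attractive because |d/dx cos(x)| = |sin(x)| < 1 near the root.
x = 1.0  # any real starting value eventually converges
for _ in range(50):
    x = cos(x)
print(x)  # ~0.7390851, the solution of x = cos(x)
```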

user_of_math
  • 4,192
  • Me too! My question is specifically: is it possible to prove the statement "the only way to get it is by iteration" rigorously? – Matt Calhoun Oct 04 '14 at 17:00
  • @MattCalhoun Unlike $\sqrt{2}$, the root of $x=\cos(x)$ doesn't quite look like an algebraic number, does it? I don't know for sure, but I suspect the root of $x=\cos(x)$ is transcendental. – user_of_math Oct 04 '14 at 17:10
  • Just iterating $\cos$ does it, since it's an attractive fixed point, but "iteration" could also mean Newton's method, which I would expect to converge much faster (compare the sketch at the end of this thread). – Michael Hardy Oct 04 '14 at 18:06
  • Isn't the evaluation of a power series at an argument $x$ also inherently an iterative process, by increasing the powers of $x$? – Gottfried Helms Oct 05 '14 at 10:06
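
A minimal sketch of the comparison suggested above (Python; starting point and step counts are mine): Newton's method applied to $f(x) = x - \cos(x)$ gives $x_{n+1} = x_n - \frac{x_n - \cos(x_n)}{1 + \sin(x_n)}$, which roughly doubles the number of correct digits per step, while iterating $\cos$ only shrinks the error by a factor of about $|\sin(0.739\ldots)| \approx 0.67$ per step:

```python
from math import cos, sin

def cos_iteration(x, steps):
    """Fixed-point iteration: linear convergence, error shrinks
    by a factor of ~|sin(root)| ~ 0.67 per step."""
    for _ in range(steps):
        x = cos(x)
    return x

def newton_cos(x, steps):
    """Newton's method for f(x) = x - cos(x), f'(x) = 1 + sin(x):
    quadratic convergence near the root."""
    for _ in range(steps):
        x -= (x - cos(x)) / (1 + sin(x))
    return x

print(cos_iteration(1.0, 5))  # ~0.70, still settling
print(newton_cos(1.0, 5))     # ~0.7390851332151607, essentially exact
```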