I came across an answer to this question (the highest voted, and also awarded a bounty worth 50 reputation). To quote, this is what the answer said:

"Suppose this was not the case, i.e. $0.9999... \neq 1$. Then $0.9999... < 1$ (I hope we agree on that). But between two distinct real numbers, there's always another one (say $x$) in between, hence $0.9999... < x < 1$.

The decimal representation of $x$ must have a digit somewhere that is not $9$ (otherwise $x = 0.9999...$). But that means it's actually smaller, $x < 0.9999...$, contradicting the definition of $x$.

Thus, the assumption that there's a number between $0.9999...$ and $1$ is false, hence they're equal."

My problem with this: the argument only shows that there exists no $x$ such that $$0.999\ldots < x < 1,$$ and from that it concludes that $$0.999\ldots = 1.$$ What am I missing?
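(For reference, the usual geometric-series computation behind the claim, independent of the quoted density argument:)

$$0.999\ldots \;=\; \sum_{n=1}^{\infty} \frac{9}{10^{n}} \;=\; 9 \cdot \frac{1/10}{1 - 1/10} \;=\; 1.$$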

— Bill Dubuque

1 Answer

There is a problem with this proof, but it's not where you're looking for it. Its intended logic is: assume the two numbers are different; then there must be a number strictly between them ("...between two distinct real numbers, there's always another one..."), and that leads to a contradiction. The way the contradiction is reached is faulty, though: the step "The decimal representation of $x$ must have ..." implicitly assumes that $x$ has a *unique* decimal representation, while the statement being proved, $0.9999\ldots = 1.0000\ldots$, says precisely that decimal representations are not always unique. A proof proving its own invalidity is a rare specimen.
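The circularity above concerns the proof, not the fact itself. A quick sketch with exact rational arithmetic (Python's `fractions` module, used here purely for illustration) confirms the underlying behavior: the truncation with $n$ nines falls short of $1$ by exactly $10^{-n}$, so the gap shrinks to zero.

```python
from fractions import Fraction

def nines(n):
    """Exact value of 0.9...9 (n nines) as a rational number."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

for n in (1, 5, 10):
    gap = 1 - nines(n)
    # The shortfall after n nines is exactly 10**(-n).
    assert gap == Fraction(1, 10**n)
    print(n, gap)
```

This is not a proof, of course; it only illustrates that the partial sums approach $1$, which is what the limit $0.999\ldots = 1$ formalizes.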

— NoNames