This just feels wrong. My professor is claiming that .999... = 1 with the following proof:
$x = .999\dots$
$10x = 9.999\dots$
$9x = 10x - x = 9.999\dots = 9$
$x = 1$
Is this actually correct and I'm just going crazy?
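For concreteness, here is a quick numerical sketch of what the manipulation does (illustrative only: it uses finite truncations with $n$ nines, so it can only show the gap to $1$ shrinking, not the equality itself):

```python
from fractions import Fraction

# Finite truncations of 0.999... with n nines: x_n = 1 - 10**(-n), kept exact with Fraction
# so the shrinking gaps are not floating-point artifacts.
for n in (1, 5, 10, 20):
    x = Fraction(10**n - 1, 10**n)
    # gap between the truncation and 1, and gap between 10x - x and 9;
    # both shrink toward 0 like 10**(-n)
    print(n, float(1 - x), float(9 - (10 * x - x)))
```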
Two real numbers are equal if their difference is $0$. The difference between $a$ and $b$ is $0$ if $|a-b|<\epsilon$ for every $\epsilon>0$. This has to do with how the real numbers are constructed. There are several ways; one is, for example, to define a real number as a Cauchy sequence of rational numbers. Two real numbers so defined, given by sequences $(a_n)$ and $(b_n)$, are then equal if the sequence $c_n = a_n - b_n$ has limit $0$. Decimal expansions are just one way to represent a real number, and as your example suggests, some numbers have more than one decimal representation.
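As a sketch of how that definition plays out here, represent $1$ by the constant sequence $a_n = 1$ and $0.999\dots$ by the sequence of truncations $b_n = \underbrace{0.9\ldots9}_{n\text{ nines}} = 1 - 10^{-n}$. Then
$$c_n = a_n - b_n = 10^{-n} \to 0,$$
so for any $\epsilon > 0$ we eventually have $|c_n| < \epsilon$, and the two sequences define the same real number.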
Edit: The proof looks good. The second-last line should read $9x = 10x - x = 9.999\dots - 0.999\dots = 9$.
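Equivalently, one can bypass the shift-and-subtract step entirely and evaluate the decimal directly as a geometric series (a standard identity, independent of the proof above):
$$0.999\dots = \sum_{k=1}^{\infty} \frac{9}{10^{k}} = 9 \cdot \frac{1/10}{1 - 1/10} = 9 \cdot \frac{1}{9} = 1.$$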
By itself, 0.99999… is only a string of symbols; it needs an interpretation.
The standard interpretation is that 0.99999… is the limit of the sequence $0.9,\ 0.99,\ 0.999,\ 0.9999,\ \ldots$, and that limit is $1$.
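Concretely, the $n$-th term of that sequence is $1 - 10^{-n}$, so
$$\lim_{n\to\infty}\left(1 - 10^{-n}\right) = 1 - 0 = 1.$$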
It is a so-called "actual infinity" of decimal nines following a leading 0. That notion needs to be defined in turn: it is not by definition the same as the number $1$ (though most people believe it is). A further problem relates to the implications for the general concept of actual infinity.