Since $0.999\ldots = 1$, I expect both to behave the same way when the same algorithm/operation is applied to them, but:

- If we define $>$, $<$, $=$ as comparing two numbers digit by digit, then the very first check gives $0 < 1$, so the algorithm reports $0.999\ldots < 1$ (see the first sketch below).
- If we truncate both numbers at any point (say, at 3 decimal digits), we get $0.999$ vs. $1$, and again $0.999 < 1$ (see the second sketch below).
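To make the first point concrete, here is a minimal Python sketch of the comparison I have in mind (the names `nines`, `one`, and `digit_compare` are just my own illustration, not any standard routine):

```python
from itertools import islice

def nines():
    yield 0          # integer part of 0.999...
    while True:
        yield 9      # then infinitely many 9s

def one():
    yield 1          # integer part of 1.000...
    while True:
        yield 0      # then infinitely many 0s

def digit_compare(a, b, max_digits=100):
    # The "school" rule: lexicographic, digit-by-digit comparison,
    # stopping at the first difference.
    for x, y in islice(zip(a, b), max_digits):
        if x != y:
            return "<" if x < y else ">"
    return "="       # all inspected digits agree

print(digit_compare(nines(), one()))   # prints "<" after the very first digit
```

The very first digits already differ ($0$ vs. $1$), so this rule decides $0.999\ldots < 1$ without ever looking at the rest of the stream.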
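And for the second point, a sketch of the truncation comparison, using Python's `fractions` module for exact arithmetic (again just my own illustration):

```python
from fractions import Fraction

# Every truncation of 0.999... to n digits is strictly below 1,
# but the gap is exactly 10**-n and shrinks as n grows.
for n in range(1, 8):
    truncated = Fraction(10**n - 1, 10**n)   # 0.9, 0.99, 0.999, ...
    gap = Fraction(1) - truncated            # exactly 1/10**n
    print(f"n={n}: 1 - 0.{'9' * n} = {gap}")
```

Every finite truncation really is strictly less than $1$; the gap $10^{-n}$ shrinks toward $0$ but is never $0$ at any finite $n$.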
Why am I wrong? Can someone help me clarify this?
Thank you in advance from this junior amateur noob! :)
Please note that I'm aware of Is it true that $0.999999999\dots=1$?, but I wanted to know why truncation and digit-by-digit comparison, as taught in school, go wrong when dealing with $0.999\ldots$