
In modern Western culture we typically think of numbers in base 10. By "we" I mean the vast majority of the public. For some reason this feels easy, and I can't pin its success down to a concrete reason. The only things I can think of are that we have 10 fingers to count on, and that 10 divides pretty "easily", whatever that vague term means.

Yet with this system we can think of extremely large numbers down to a very fine level of precision. For instance, you can think of the number $171,129,412$, which pins down the units digit "two" of a quantity in the hundreds of millions. It's as if you can visualize every element of a set that size in your head; you really can't, but you still get the sense that you understand how big it is.

You start to lose track if the number gets too big, though: $12,918,274,182,183,182,874,371,173$. Now it's pretty much impossible to grasp how big this is all the way down to the $173$ at the end. I have no idea why; it just seems to be this way.

Still, you can get into the millions and billions, down to the scale of 1, using numbers written in the base 10 system.

We can also use the base 10 system differently, via scientific notation, to work in orders of magnitude, which lets us think about huge numbers at a broader scale / coarser level of detail. $10^{30}$ is pretty large, but you can still get a sense of its size relative to $10^2$.
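To make that concrete with the numbers above: $171,129,412 \approx 1.7 \times 10^{8}$, and comparing $10^{30}$ to $10^{2}$ is really just comparing exponents, since $10^{30} / 10^{2} = 10^{28}$.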

The base 10 system also works well for decimals: we can go 9 or 10 decimal places in and still feel we understand the value's precision.

Given all this, I'm wondering whether there are other systems that work equally well, close to as well, or even better than the base 10 system.

Some common base-$x$ systems are base 2 (binary), base 16 (hex), and base 60, which seems to have been popular in the past. However, I personally can't look at a number in one of these bases and get its value down to an accuracy of 1 right away; to be honest, I don't get any sense of its value at all. I'm not sure whether I could with practice, or whether there is something inherent about 10 that makes it simple and intuitive.
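Just to have something concrete to look at, here is a small sketch (my own illustrative code; `to_base` is not a standard library function) that writes the same number from above in a few different bases:

```python
# Sketch: render one integer in several bases, to compare how it "reads".
# Digits beyond 9 use letters, hex-style.
DIGITS = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def to_base(n: int, base: int) -> str:
    """Convert a non-negative integer to a string of base-`base` digits."""
    if n == 0:
        return "0"
    out = []
    while n > 0:
        n, r = divmod(n, base)   # peel off the least-significant digit
        out.append(DIGITS[r])
    return "".join(reversed(out))

for b in (2, 8, 10, 16):
    print(b, to_base(171_129_412, b))
# 2 1010001100110011101001000100
# 8 1214635104
# 10 171129412
# 16 A333A44
```

The same quantity gets shorter to write as the base grows, but (for me at least) only the base 10 line carries any immediate sense of size.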

My question is whether there is anything other than base 10 that lends itself to easily understanding large and small numbers at any scale, in the same or a similar way to the base 10 system (as described above). In particular, I'm wondering about base 2, base 3, base 4, base 5, ... up to base 16 or so; any arbitrary base beyond that which lends itself to understanding would also be interesting. But mainly I'm wondering whether any of these smaller bases have good properties for understanding, like base 10 does.

One system that is interesting to consider is Roman numerals. Instead of writing $1, 2, 3, 4, \cdots$, where you have 1-digit numbers, followed by 2-digit numbers, followed by 3-digit numbers, and so on, you have $I, II, III, IV, V, \cdots$, whose symbol counts go $1, 2, 3, 2, 1, \cdots$. It's an interesting way to represent numbers while still essentially being base 10 (or whatever base). This fits into the category of things I'm asking about, in some way.
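As a rough sketch of that symbol-count pattern (again my own illustrative code, using the standard subtractive notation):

```python
# Sketch: greedy conversion to standard (subtractive) Roman numerals,
# to see how the symbol count goes 1, 2, 3, 2, 1, ... instead of growing
# monotonically the way positional digit counts do.
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n: int) -> str:
    """Convert 1 <= n <= 3999 to a Roman numeral string."""
    out = []
    for value, symbol in ROMAN:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

for n in range(1, 11):
    print(n, to_roman(n), len(to_roman(n)))
# symbol counts: 1, 2, 3, 2, 1, 2, 3, 4, 2, 1
```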

Lance
  • Some common base x systems are base 2 for binary, base 16 for hex, base 60 in the past seems like it was popular.

    This is just because you're used to base 10. There is some sort of an optimal base for human memory (base 1,000,000 is too big, because nobody can remember that many symbols, and base 2 is too small, because nobody has trouble remembering more than 2 symbols, so it must be somewhere in between), but it's very unlikely to be 10. In particular, 16 is just as easy to remember, and scales better, so the optimum is at least 16 (sticking to constant-base systems).

    – user3482749 Jan 13 '19 at 16:32
  • Which of these number systems were you taught in childhood and have used every day of your life since then? That should answer your question about any inherent advantage of decimal. –  Jan 13 '19 at 16:36
  • Base 10 is actually a terrible system in that it makes division by anything but 2 and 5 difficult. A base 12 or 18 would be better, although base 16 would probably make divisions by powers of two really easy up to 64 (we can estimate the rest). For large numbers we need some form of exponentially expandable system, but it doesn't necessarily have to be a digit representation. Roman numerals are good for checkmark counting (which digits aren't) but terrible for any arithmetic and awful for large numbers. Digits would be terrible for counting if humans weren't so excellent at memory. – fleablood Jan 13 '19 at 16:57
  • "However, I personally can't easily look at one of these numbers and get its value down to an accuracy of 1 right away." You could if you had been doing it all your life. There is utterly no significance difference between bases 5 to 25 that a lifetime of use wouldn't obliterate. Mechanically they are identical. And recognizing sacks of 10, 100, 1000, 10000 as opposed to sacks of 7, 49, 343, 2401 is only conditioning. – fleablood Jan 13 '19 at 17:01
  • To add to @fleablood’s point, back in my previous life as a programmer, many of the people I worked with fluently interpreted and worked with numbers in hex, _not_ mentally translating them to decimal. – amd Jan 14 '19 at 01:47
  • What's fluent anyway? Is 312 in hex (768+16+2) any less intuitively accurate than 786 in base 10? I mean, do you really visualize the 8 tens and the 6 ones? If you asked for 786 beans and I gave you 807 instead, would you notice? Is it any better to block them into 7 groups of 100, then 8 groups of 10, and have 6 left, than to block them into 3 groups of 256, then one group of 16, and have 2 left? They're completely arbitrary, and having a sense of what 100 is [or 96 or 103] is no more natural than having an idea of what 256 is. – fleablood Jan 14 '19 at 03:30

0 Answers