I tried to convert 23 to binary and came up with 100111, using the calculation inspired by this answer:
1) Find the least significant bit:
$$ 23 = \underbrace{(e_n\times 2^n + \dots + e_1\times 2^1)}_{22} + 1\times 2^0 $$
Here it is 1. Continue by shifting the number to the right, i.e. dividing by 2:
2) 22/2 = 11 = 10 + 1 // next bit is 1
3) 10/2 = 5 = 4 + 1 // next bit is 1
4) 4/2 = 2 = 2 + 0 // next bit is 0
So I'm left with 2 in decimal, which is 10 in binary. Now I write down the number:
10 followed by the bits from steps 4, 3, 2, 1 gives me 100111; however, the correct answer is 10111. Where is my mistake?
step 5) 2/2 = 1 = 0 + 1 // next bit is 1 – achille hui May 03 '16 at 15:19
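As a sanity check, here is a minimal Python sketch of the repeated-division method (the helper name `to_binary` is mine, not from the question). Note that the loop keeps halving until the quotient reaches 0, which is exactly the step 5 the comment points out; stopping one division early and prepending the leftover's binary form is what produced the extra bit in 100111.

```python
def to_binary(n: int) -> str:
    """Convert a non-negative integer to a binary string by
    repeatedly dividing by 2 and collecting the remainders."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder = next least significant bit
        n //= 2                  # halving the number shifts it right one bit
    return "".join(reversed(bits))  # remainders come out LSB first

print(to_binary(23))  # prints 10111, not 100111
```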