
I am having trouble figuring out how long it takes to decrypt a 64-bit key, given that a computer can do 1 billion trials per second. I know that there are $2^{64} = 1.844 \times 10^{19}$ possible keys for this key size, but then how do I go about figuring out the rest? If there is a formula for this, I would be very curious to know what it is.

kelalaka
    You know the number of possible keys, and you know how many keys per second the computer can attempt. Can you not figure out the average/worst case times? Hint: you know K and K/s, and you need to figure out s – Marc Sep 09 '20 at 20:48
  • @Marc I think maybe I'm really overthinking this and I just need to divide the number of keys by 1 billion? – BedfordFalls1947 Sep 09 '20 at 21:30
  • For the worst case, yes. I'll let you work out the average case, don't overthink it. – Marc Sep 09 '20 at 21:37
  • @Marc I really have no idea how to work out the average case. I've never done this before. Is there a general rule as to how that is supposed to be done? Like, if it could do 1 trillion, then the worst case would be dividing the number by 1 trillion. How would I figure out the average case in that situation? – BedfordFalls1947 Sep 09 '20 at 21:42
  • The number of keys per second does not change. In the best case, the first key you try is the correct one. In the worst case, it's the last one. Now what is the average time? – Marc Sep 09 '20 at 21:50
  • @Marc You get it after trying half of them? – BedfordFalls1947 Sep 09 '20 at 22:07
  • Math tip: this is much easier if you stick to powers of two and use the laws of exponents. A billion is $2^{30}$ plus change. So trying $2^{64}$ keys at $2^{30}$ keys/second is just a division. And since we've expressed everything as powers of two, we can do it by subtracting exponents: $2^{64} ÷ 2^{30} = 2^{64 - 30} = 2^{34}$ seconds, which is $2^{4 + 30} = 2^4 \times 2^{30} =$ about 16 billion seconds and change in the worst case. Average is half that. – Luis Casillas Sep 09 '20 at 22:28 (worked out in the sketch after these comments)
  • @LuisCasillas thank you. That is very helpful! – BedfordFalls1947 Sep 09 '20 at 22:34
  • "If there are $30$ apples and each person has $3$ apples, how many people are there?" turns into "if I have $2^{64}$ keys and I test $1,000,000$ keys per second, how many seconds do I need?" It's what, third grade math? Then the average is half. – Serpent27 Sep 09 '20 at 23:17
  • The average is half because if I have $2^{64}$ numbers, $50\%$ of the numbers are less than $\frac{2^{64}}{2}$ and $50\%$ are more than $\frac{2^{64}}{2}$. That means there's a $50\%$ chance I'll get it in fewer tries and a $50\%$ chance I'll get it in more tries. – Serpent27 Sep 09 '20 at 23:34
  • I'm surprised no one pointed out that we don't decrypt keys; we decrypt a ciphertext by trying all possible keys. The method is known as brute force. – tum_ Sep 10 '20 at 06:59
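Here is a minimal Python sketch of the arithmetic the comments describe, assuming the question's parameters (a 64-bit keyspace and $10^9$ trials per second): divide the number of keys by the rate for the worst case, and halve that for the average.

```python
# Minimal sketch of the brute-force timing arithmetic from the comments above.
# Assumptions: 64-bit keyspace, 10**9 trials per second (as stated in the question).

key_bits = 64
keys = 2 ** key_bits             # 2^64 possible keys
rate = 10 ** 9                   # one billion trials per second

worst_seconds = keys / rate      # worst case: the correct key is the last one tried
avg_seconds = worst_seconds / 2  # on average the correct key turns up halfway through

seconds_per_year = 60 * 60 * 24 * 365.25
print(f"worst case: {worst_seconds:.3e} s (~{worst_seconds / seconds_per_year:.0f} years)")
print(f"average:    {avg_seconds:.3e} s (~{avg_seconds / seconds_per_year:.0f} years)")
```

At $10^9$ keys per second this comes out to roughly $1.84 \times 10^{19} / 10^9 \approx 1.84 \times 10^{10}$ seconds, about 585 years in the worst case and half that on average, consistent with the $2^{34}$-second estimate above.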

1 Answer


If you measure the number of keys you can test per second (as suggested in the comments), you can calculate the time from there. As a frame of reference, if you can test 1,000,000,000,000 (1 trillion) keys per second, the time to break a 64-bit key would be an average of 15.25 weeks (106.75 days), with the longest possible time to break being 30.5 weeks (213.5 days).

You need to benchmark the number of keys per second on your own system to know how long it would take for you. If your system can test only half as many keys per second, it takes twice as long. If it can test 10 trillion keys/sec, it would take only $\frac{1}{10}$ of the time (10.675 days average, 21.35 days max). I think you get the picture.
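As a sketch of that scaling, here is a small Python helper (the name `brute_force_days` is illustrative, not an established API) that reproduces the figures above for the 1 trillion and 10 trillion keys-per-second rates; plug in your own benchmarked rate to get your numbers.

```python
# Illustrative helper: exhaustive-search time for a key_bits-bit key at a given trial rate.

def brute_force_days(key_bits, keys_per_second):
    """Return (average_days, worst_case_days) for an exhaustive key search."""
    worst_seconds = 2 ** key_bits / keys_per_second
    seconds_per_day = 86_400
    return worst_seconds / 2 / seconds_per_day, worst_seconds / seconds_per_day

# The rates used above: 1 trillion and 10 trillion keys per second.
for rate in (1e12, 1e13):
    avg, worst = brute_force_days(64, rate)
    print(f"{rate:.0e} keys/s -> average {avg:.2f} days, worst case {worst:.2f} days")
```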

Serpent27