
I am a bit confused about the expected running times of brute force attacks on different cryptosystems.

So let's assume a key size of $n$ bits, i.e. a key space of $2^n$ possible keys.

  1. Symmetric key cryptography:

    $E[\text{brute}] \approx 2^{n-1}$ comparisons

    So, if I understood correctly, I need to generate all $2^n$ possible keys to find the "right" key for sure. Now, taking into account the definition of the expected value of a random variable $X$ that takes the value $x_i$ with probability $p_i$, namely $E[X] = \sum_i p_i x_i$: in our case $k = 2^n$, each guess is correct with probability $1/k$, and the right key may turn up on any of the $k$ guesses, so $E[X] = \frac{1}{k}\sum_{i=1}^{k} i = \frac{k(k+1)}{2}\cdot \frac{1}{k} = \frac{k+1}{2} \approx 2^{n-1}$.

  2. Cryptographic hashes

    Following the birthday-paradox argument and approximating the required effort, we end up with $E[X] \approx \sqrt{2^n} = 2^{n/2}$ comparisons (a small simulation of both estimates follows after this list).
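As a quick sanity check of both estimates, here is a minimal Python simulation over a toy key space of $k = 2^{16}$ values (an illustration, not a proof): the first part measures the average number of guesses in an exhaustive key search, the second the average number of random samples before a birthday collision.

```python
# Toy sanity check of both estimates (illustrative simulation, not a proof).
import math
import random

k = 2**16          # toy key space, i.e. "n = 16 bits"
trials = 1000

# 1. Exhaustive key search: guess keys 0, 1, 2, ... against a random secret.
#    The number of guesses needed is (secret + 1), uniform on {1, ..., k}.
total_guesses = sum(random.randrange(k) + 1 for _ in range(trials))
print("average guesses:", total_guesses / trials,
      " expected (k+1)/2 =", (k + 1) / 2)

# 2. Birthday setting: draw random k-valued "hashes" until one repeats.
total_samples = 0
for _ in range(trials):
    seen = set()
    while True:
        value = random.randrange(k)
        if value in seen:
            break
        seen.add(value)
    total_samples += len(seen) + 1
print("average samples until a collision:", total_samples / trials,
      " expected ~ sqrt(pi*k/2) =", round(math.sqrt(math.pi * k / 2), 1))
```

The first average comes out near $(k+1)/2 \approx 2^{n-1}$, and the second near $\sqrt{\pi k/2} \approx 2^{n/2}$ up to a small constant, matching the two formulas above.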

My confusion comes from public-key cryptography, and from ECC in particular: in this post ECC is nicely described, but it says that on average you must guess $2^{n/2}$ keys. I don't really understand where this number comes from, since you are not searching for a collision as in the case of hashes, but for an exact match.

Pio
  • You're confused. Brute force on a 128-bit key takes $2^{128}$ time. Anything else is no longer brute force. – orlp Nov 21 '13 at 15:58
  • I was talking about the expected time of brute force. – Pio Nov 28 '13 at 23:19

1 Answer


Well, ECC takes about $2^{n/2}$ time to break because there are smarter ways to attack it than literally trying each possible key separately.

With AES, the best known attack is to try a key and see if it works. If it doesn't, all you've learned is that that specific key wasn't it; only $2^{n}-1$ more to go...
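To make "try a key and see if it works" concrete, here is a minimal sketch of key exhaustion by trial encryption. It assumes the PyCryptodome package is available, and the key is deliberately weakened so that only its last two bytes are unknown (a $2^{16}$ search; a real $2^{128}$ search is of course infeasible).

```python
# Key exhaustion by trial encryption (toy: only 2 of the 16 key bytes are unknown).
import os
from itertools import product
from Crypto.Cipher import AES   # PyCryptodome

known_prefix = b'\x00' * 14               # the attacker knows 14 of the 16 key bytes
secret_key = known_prefix + os.urandom(2)

plaintext = b'known plaintext!'           # a 16-byte known-plaintext block
ciphertext = AES.new(secret_key, AES.MODE_ECB).encrypt(plaintext)

# Try every candidate key; the only feedback per guess is "right" or "wrong".
for candidate in product(range(256), repeat=2):
    key = known_prefix + bytes(candidate)
    if AES.new(key, AES.MODE_ECB).encrypt(plaintext) == ciphertext:
        print("recovered key:", key.hex())
        break
```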

However, with ECC, there are other methods. For ECC, in general, a public key gives us two elliptic curve points $G$ and $P$; to break it, we need to find an integer $k$ such that $kG = P$ (where $kG$ is point multiplication).

Now, there are $q$ possible values of $k$ (where $q \approx 2^{n}$ is a large prime that depends on the curve); however, here is a smarter way to attack it:

  • Compute a value $r \ge \sqrt{q}$ (say, $r = \lceil \sqrt{q}\, \rceil$)

  • Generate the $r$ values $P-0G, P-1G, P-2G, P-3G, ..., P-(r-1)G$. This takes $O(r) = O(\sqrt{q}) = O(2^{n/2})$ time.

  • Generate the $r$ values $0rG, 1rG, 2rG, 3rG, ..., (r-1)rG$. This also takes $O(r) = O(\sqrt{q}) = O(2^{n/2})$ time.

  • Scan through the two lists for a value in common; if we see that $P-iG = jrG$ (for two integers $i$, $j$), then we have $P = (jr+i)G$, and that solves it. This always succeeds if $r \ge \sqrt{q}$, because any $k < q \le r^2$ can be written as $k = jr + i$ with $0 \le i, j < r$; with an appropriate hash table the scan takes $O(2^{n/2})$ time.

Total time taken: $O(2^{n/2})$.

This isn't the only way to solve the problem this quickly (there's also Pollard's Rho, which doesn't involve huge tables), however this is the easiest to explain.
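For concreteness, here is a minimal baby-step/giant-step sketch of the list-matching idea above. The parameters are made-up toy values (the curve $y^2 = x^3 + 7$ over $\mathbb{F}_{17}$ with base point $G = (1, 5)$ is chosen purely so the whole search fits in a few lines); this is not a real curve or a production implementation.

```python
# Baby-step/giant-step for the EC discrete log kG = P (toy parameters only).
import math
import random

p, a, b = 17, 0, 7          # tiny illustrative curve: y^2 = x^3 + 7 over F_17
G = (1, 5)                  # a point on the curve (5^2 = 8 = 1^3 + 7 mod 17)

def ec_add(P, Q):
    """Add two affine points; None represents the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    x1, y1 = P
    x2, y2 = Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                   # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, P):
    """Double-and-add scalar multiplication kP."""
    result, addend = None, P
    while k:
        if k & 1:
            result = ec_add(result, addend)
        addend = ec_add(addend, addend)
        k >>= 1
    return result

# Order q of G, found by naive repeated addition (fine for a toy group).
q, R = 1, G
while R is not None:
    R = ec_add(R, G)
    q += 1

secret = random.randrange(1, q)
P_target = ec_mul(secret, G)            # the "public key" we want to break

# Baby steps: store P, P-G, P-2G, ..., P-(r-1)G in a hash table.
r = math.isqrt(q) + 1                   # r >= sqrt(q)
neg_G = (G[0], -G[1] % p)
table, point = {}, P_target
for i in range(r):
    table[point] = i
    point = ec_add(point, neg_G)

# Giant steps: walk 0(rG), 1(rG), 2(rG), ... until we hit the table.
rG, point, k = ec_mul(r, G), None, None
for j in range(r):
    if point in table:
        k = (j * r + table[point]) % q
        break
    point = ec_add(point, rG)

print("secret k:", secret, " recovered k:", k, " check:", ec_mul(k, G) == P_target)
```

The table holds about $\sqrt{q} \approx 2^{n/2}$ entries, which is exactly the memory cost that Pollard's rho avoids.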

poncho
  • "With AES, the best known-attack is to try a key, and see if it works." - If I've understood this correctly, I think Dmitry Khovratovich would disagree – Cryptographeur Nov 21 '13 at 15:59
  • I also read that the order of $G$ must be prime. Why is that? – Pio Nov 21 '13 at 16:04
  • @Pio: it doesn't have to be prime, but if it's not, it makes life easier for the attacker. If the order of $G$ ($q$ in the above text) is $rs$, then the attacker can solve $k'rG = rP$ and $k''sG = sP$ (a total of $O(\sqrt{\max(r,s)})$ time), and then recombine $k'$ and $k''$ into the original $k$, solving the problem faster than he could if $q$ were prime. – poncho Nov 21 '13 at 16:11
  • Our results just show how to try each key faster, but it is still $O(2^n)$; only the constant inside the big-O decreases. @poncho is correct here. – Dmitry Khovratovich Nov 21 '13 at 19:04