4

Consider a block cipher $F$ with an $N$-bit block size and an $M$-bit key size. That is, if $k$ is an $M$-bit key, $p$ is an $N$-bit block of plaintext and $c$ is an $N$-bit block of ciphertext, then: $$ F\left(k,p\right) = c $$

For a fixed key, the $N$-bit plaintext blocks are in one-to-one correspondence with the $N$-bit ciphertext blocks; that is, $F\left(k,\cdot\right)$ is a permutation of the $N$-bit blocks.

My question is: is there a theoretical maximum number of distinct keys such that no two keys $k_1$ and $k_2$ in the set satisfy $F\left( k_1, p \right) = F\left( k_2,p\right)$ for all values of $p$?

For instance, if all the cipher could do was permute the bits and exclusive-OR them with masks derived from the key, then $M \le N + \lfloor\log_2 N!\rfloor$ could be such a maximum. Given steps like 'MixColumns' and 'SubBytes' in AES, though, many more distinct keys are possible. But is there a theoretical upper limit?
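As a sanity check of that counting, here is a toy Python sketch for $N = 3$ (the names are just for illustration): each key is a bit permutation plus an XOR mask, and the script counts how many distinct block permutations such keys can produce, matching the $N!\cdot2^N$ figure behind the bound above.

```python
import math
from itertools import permutations

N = 3  # toy block size in bits

def apply_key(bit_perm, mask, p):
    """Permute the bits of p according to bit_perm, then XOR with mask."""
    q = 0
    for dst, src in enumerate(bit_perm):
        q |= ((p >> src) & 1) << dst
    return q ^ mask

# Each "key" is a (bit permutation, XOR mask) pair.
distinct = set()
for bit_perm in permutations(range(N)):
    for mask in range(2 ** N):
        table = tuple(apply_key(bit_perm, mask, p) for p in range(2 ** N))
        distinct.add(table)

print(len(distinct))               # 48 distinct permutations for N = 3
print(math.factorial(N) * 2 ** N)  # N! * 2^N = 48, so no two keys collide
```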

CodesInChaos
Avijit

2 Answers

4

If you are asking about a truly theoretical limit for a theoretical block cipher (as opposed to a practically usable one, like AES), you could calculate the number of possible keys like this:

A block cipher, together with a key $k$ (where $|k| = M$), describes one of many possible permutations: for every plaintext block, there is exactly one ciphertext block, and vice versa. The number of possible blocks is simply $2^N$, where $N$ is the length of a block in bits.
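As a toy illustration of "a key selects one permutation" (not a real cipher; the key is simply used to seed a pseudo-random shuffle of the block space):

```python
import random

N = 4   # toy block size in bits

def permutation_for_key(key):
    """Derive one permutation of the 2^N blocks from an integer key.

    A real block cipher computes F(k, p) directly instead of building
    the whole table, but conceptually it selects a permutation like this.
    """
    blocks = list(range(2 ** N))
    random.Random(key).shuffle(blocks)  # the key acts as the seed
    return blocks                       # blocks[p] is the ciphertext of p

table = permutation_for_key(0x1234)
assert sorted(table) == list(range(2 ** N))  # it really is a permutation
print(table)
```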

The number of permutations of $n$ different objects is $n!$, so for our (very impractical) block cipher, there can be at most $(2^N)!$ keys. Many of those will be unusable in practice (for some, plaintext and ciphertext differ in only a couple of blocks), but they all satisfy your condition: each differs from every other permutation in at least one plaintext-ciphertext pair.
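For a 2-bit block this count can be checked directly (a tiny Python sketch; the names are illustrative):

```python
from itertools import permutations
from math import factorial

N = 2                       # toy block size: 2-bit blocks
blocks = range(2 ** N)      # the 4 possible blocks

# Every "ideal" key corresponds to exactly one permutation of the blocks.
all_ciphers = list(permutations(blocks))

print(len(all_ciphers))     # 24
print(factorial(2 ** N))    # (2^N)! = 4! = 24
```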

Now, specifying an arbitrary one of these permutations without any optimization would take an insane number of bits to transfer: written out naively as its table of $2^N$ plaintext-ciphertext pairs, a single permutation needs about $N \cdot 2^N$ bits.

To be able to actually transmit the key with far fewer bits than any conceivable message, we limit our keyspace to a mere $M$ bits, or $2^M$ possible permutations.

If we insert the actual numbers, we get $2^{256}$ possible keys for AES-256, which seems like a lot (and for practical purposes, it is), but the theoretical limit for 128-bit blocks is much, much higher: $(2^{128})!$ possible keys.
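To put a rough number on "much, much higher", here is a back-of-the-envelope estimate (using the log-gamma function, which amounts to Stirling's approximation) of how many bits a key would need just to index all $(2^{128})!$ permutations:

```python
import math

N = 128                   # AES block size in bits
num_blocks = 2.0 ** N     # 2^128 as a float is precise enough for an estimate

# log2((2^N)!) = lgamma(2^N + 1) / ln(2)
index_bits = math.lgamma(num_blocks + 1) / math.log(2)

print(f"{index_bits:.3e}")  # about 4.3e40 bits, i.e. roughly a 2^135-bit key
```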

Update: Avijit has pointed out in the comments that there is an easy optimization which allows enumerating all possible permutations with "only" $\lceil \log_2 (2^N)! \rceil$ bits (which still allows for all $(2^N)!$ keys). This is the most compact encoding possible, but still much larger than any practical key size.
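A sketch of the enumeration Avijit describes, for a toy 2-bit block (the helper name is mine): an integer key is converted into the corresponding permutation via the factorial number system (Lehmer code), so $\lceil\log_2 (2^N)!\rceil$ bits select any of the $(2^N)!$ permutations.

```python
from math import factorial

def nth_permutation(index, items):
    """Return the index-th permutation of `items` in lexicographic order,
    by writing `index` in the factorial number system (Lehmer code)."""
    items = list(items)
    result = []
    for i in range(len(items) - 1, -1, -1):
        digit, index = divmod(index, factorial(i))
        result.append(items.pop(digit))
    return result

N = 2                                    # toy block size: 2^N = 4 blocks
num_keys = factorial(2 ** N)             # (2^N)! = 24 possible permutations
key_bits = (num_keys - 1).bit_length()   # ceil(log2(24)) = 5 bits suffice

key = 5                                  # any integer in [0, 24)
table = nth_permutation(key, range(2 ** N))  # table[p] is the ciphertext of p
print(key_bits, table)                   # 5 [0, 3, 2, 1]
```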

lxgr
  • Hi @lxgr -- I'm with you as far as understanding that there are $(2^N)!$ distinct permutations, and therefore $(2^N)!$ keys as well. But when it comes to representing those keys, it seems to me that $\lceil \log_2 (2^N)! \rceil$ bits should be sufficient -- permutations can be lexicographically ordered, and the $n^{th}$ permutation can be deduced quickly using the Factorial Number System. Of course, this is still an insanely large number. – Avijit Sep 11 '13 at 05:22
  • @Avijit Very interesting suggestion and link, thanks! I've updated my answer accordingly. – lxgr Sep 11 '13 at 21:57
-4

Since there are only $2^N$ unique output blocks, an $N$-bit key is the maximum useful size, assuming the block cipher is a perfect 1-to-1 permutation with all distinct keys giving a distinct ciphertext block for a given plaintext block.

Since that is not proven, the maximum useful keyspace in practice is $2^{2N}$ keys (a $2N$-bit key). This is because of the possibility of equivalent keys when the key size matches the block size; it is generally accepted (I think) that a well-made cipher gives about $2^{N-1}$ unique outputs for $2^N$ keys.

Note that this is only for one plaintext block: two keys that give the same ciphertext for one block may not behave the same for any other plaintext block. For this reason a maximum keyspace of $2^N \cdot 2^N = 2^{2N}$ keys seems logical.

In some ciphers (including AES) the round count is also determined by the key size, making other attacks more difficult so that they match or exceed the workload of a brute-force attack on the key space. Different ciphers use dissimilar building blocks, but in all instances a maximum key size of double the block size makes the most sense mathematically, with a minimum key size equal to the block size.

Try to break it down and think of a cipher with 2-bit blocks and keys, then extend that to 4-bit keys.
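As one way to run that experiment (a toy construction of my own, not a claim about any real cipher): a 2-bit-block cipher with a 4-bit key $(k_1, k_2)$, built from a fixed S-box with one XOR before and one after. Counting the distinct permutations the 16 keys produce shows how equivalent keys appear, and that they cover only a fraction of the $4! = 24$ possible permutations.

```python
from itertools import product

S = [0, 3, 1, 2]   # a fixed permutation of {0, 1, 2, 3} used as the S-box

def encrypt(k1, k2, p):
    """Toy 2-bit block cipher: XOR, substitute, XOR."""
    return S[p ^ k1] ^ k2

tables = set()
for k1, k2 in product(range(4), repeat=2):   # all 16 possible 4-bit keys
    tables.add(tuple(encrypt(k1, k2, p) for p in range(4)))

print(len(tables))   # 4 distinct permutations from 16 keys (many equivalent
                     # keys), out of 4! = 24 possible permutations of the blocks
```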

Richie Frame
    This is wrong; the submitter is well aware that if keys are greater than $N$ bits, then there must be plaintext $P$ and keys $k_1$ and $k_2$ with $E_{k_1}(P) = E_{k_2}(P)$. The submitter went on to specify that $E_{k_1}(P) = E_{k_2}(P)$ for all values of $P$; in general, this may not be possible for two distinct keys of size only $2^{2N}$. – poncho Sep 10 '13 at 19:42
  • Read the questions I linked. For example there are 26! keys for an alphabetic substitution cipher, but only 26 different letters. – CodesInChaos Sep 10 '13 at 20:07
  • I am taking "maximum useful" to mean something different from "maximum functional": from a security perspective, the maximum functional key size would not be useful. I am designing a toy cipher to give a better explanation – Richie Frame Sep 11 '13 at 06:19
  • Why do you think "all distinct keys giving distinct ciphertext block for a given plaintext block" is a useful criterion for limiting the key size? Because that makes brute-forcing possible with only one plaintext-ciphertext pair? For 2-bit blocks there are 4! = 24 different possible permutations. This number increases quite quickly ... for 5-bit blocks, there are already $32! > 2^{117}$ different permutations. Why do you deem only $2^{10}$ of them as useful? – Paŭlo Ebermann Sep 11 '13 at 19:42
  • I would not really consider a 5-bit block cipher useful, unlike a 64-bit block cipher, which would have a maximum functional key size of a million terabytes... which is NOT useful. We are also limited by the number of subkeys in the key schedule, and more subkeys = more rounds, which would mean the cipher would spend hours/days/years encrypting a single block, which is also NOT useful. Somehow I skipped over the word "theoretical" in the original question. – Richie Frame Sep 12 '13 at 02:48
  • Made a mistake in that last comment: it's actually around a billion terabits for a 64-bit block, which is about 131 million TiB. And yes, I do concede my original answer was dumb; I would delete it, but a historical record of my stupidity and how badly I read the question is a good idea. I also found a few interesting things during testing... – Richie Frame Sep 13 '13 at 06:25