
I've been experimenting with Mbed TLS on an ARM Cortex-M4 (though this question is not platform-specific), and I'm confused by the relative execution times of mbedtls_rsa_pkcs1_sign() versus mbedtls_rsa_pkcs1_verify(): apparently _sign is much more computationally expensive than _verify. I don't understand this and would have expected the opposite.

Observations (RSA-2048 / SHA-256):

  • mbedtls_rsa_pkcs1_decrypt() or mbedtls_rsa_pkcs1_sign(): < 3 secs
  • mbedtls_rsa_pkcs1_encrypt() or mbedtls_rsa_pkcs1_verify(): < 100 millisecs

I would have assumed that verify should be similar in cost to decrypt (because it requires decrypting the signature to recover the hash) and that sign should be similar in cost to encrypt (because it requires encrypting the hash). I understand that the signature operations reverse the roles of the private and public keys, but I'm not sure how or why that affects the timing.

kelalaka

1 Answer


Apparently the _sign is much more computationally complex than the _verify.

For RSA, this is typically the case.

For the signature verification operation, the computationally expensive part is the computation of $M^e \bmod N$ (for some $M$), where $e$ is a public value that is typically fairly small (65537 is the most common choice).

For the signature generation operation, the computationally expensive part is the computation of $M^d \bmod N$ (for some $M$), where $d$ is a private value that's close to the size of $N$.

The computation of $M^x \bmod N$ takes time roughly proportional to $\log_2 x$; so we're comparing a $\log_2$ of about 16 (for signature verification with $e = 65537$) against a $\log_2$ of 2048 (for signature generation), and by this simple-minded analysis we'd expect verification to be about 100 times faster.

Now, during signature generation, an implementation can use the "Chinese Remainder Theorem" to cut this time by maybe a factor of 4: the single full-size exponentiation is replaced by two exponentiations with half-size exponents and half-size moduli, each costing about one-eighth as much. (This optimization cannot be used during signature verification, as it requires knowledge of the factorization of $N$.) This reduces the expected ratio to about 25, which is roughly what you see.

For RSA, $d$ must be large (small values of $d$ are known to cause weaknesses); on the other hand, we don't mind if the attacker can guess $e$ (we publish it in the public key, so he doesn't have to guess), and so there is no drawback in making it small. In fact, $e=3$ can be secure; however, some people don't like it, as it can be fragile if you don't get the padding right.

poncho