31

I am struggling to understand what is meant by "standard cryptographic assumption".

The Wikipedia article on the Goldwasser–Micali system (GM) reads "GM has the distinction of being the first probabilistic public-key encryption scheme which is provably secure under standard cryptographic assumptions."

But what are standard cryptographic assumptions? I suppose that

  • FACTORING: Given a positive integer $n$, find its prime factorisation.
  • Quadratic Residuosity Problem (QRP): Given an odd composite integer $n$ and $a$ with Jacobi Symbol $(a|n) = +1$ decide whether $a$ is a quadratic residue or a pseudosquare modulo $n$.

are commonly assumed to be difficult and are thus "standard cryptographic assumptions". Is that why GM is provably secure? But what about the RSA Problem (RSAP)?

Given are a positive integer $n$ that is a product of two distinct odd primes $p$ and $q$, a positive integer $e$ with $\gcd(e, (p-1)(q-1)) = 1$ and an integer $c$. Find an integer $m$ such that $m^e \equiv c \pmod n$.
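To make RSAP concrete, here is a toy instance (parameters of my own choosing, far too small for any real security; real moduli are 2048+ bits):

```python
# Toy RSA problem instance (illustrative only).
p, q = 61, 53            # two distinct odd primes (secret)
n = p * q                # public modulus, n = 3233
e = 17                   # public exponent, gcd(e, (p-1)(q-1)) = 1
m = 65                   # the unknown the attacker wants
c = pow(m, e, n)         # the given value c = m^e mod n

# The RSA problem: given only (n, e, c), find m.
# Knowing the factorisation makes it easy:
d = pow(e, -1, (p - 1) * (q - 1))   # modular inverse (Python 3.8+)
assert pow(c, d, n) == m
```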

Is RSAP also commonly supposed to be hard? Is RSA thus provably secure?

Squeamish Ossifrage
3nondatur
    I believe when talking about standard cryptographic assumptions we look at a cryptographic system from the standpoint of the cryptographic standard model. In the standard model, an adversary (whose aim is to break a cryptographic system) is limited only by time & computational power. – AleksanderCH Nov 07 '19 at 11:50
  • 8
    Well, a "non-standard" assumption is generally whatever assumption the author doesn't particularly like. – Maeher Nov 07 '19 at 11:57
  • 4
    I think the point the author tried to make is more this: The GM system is the first to be provably secure (with sensible assumptions, e.g. not the "my system is secure assumption") in the sense that it was the first probabilistic encryption scheme (RSA is deterministic and therefore not provably CPA / semantically secure) and it was the first because it was literally described in the paper that introduced the security notion and nobody happened to construct a CPA-secure one before "by accident". – SEJPM Nov 07 '19 at 12:41
  • 5
    FYI, the so-called ‘standard model’ is usually a way for academic cryptographers to cast shade on everyone in the real world who uses hash functions (example), by reserving the term ‘standard model’ which sounds modest and uncontroversial for systems that are completely impractical and never used, and then characterizing what everyone actually does with the scary-sounding term ‘random oracle model’ (more details). – Squeamish Ossifrage Nov 07 '19 at 14:25
  • 1
    @SqueamishOssifrage aren't there schemes that are secure in the ROM, but insecure under any instantiation of the oracle? That sounds like an actual issue which is more than just "casting shade". – Mark Schultz-Wu Nov 07 '19 at 21:11
  • 2
    @Mark Only because they're specially crafted to work that way, by using a complexity-theoretic trick to let certain inputs deliberately leak the private key if you instantiate them with a particular function admitting a polynomial-time algorithm to compute it. Nobody would ever use these schemes except as a prank in a paper to cast shade. See the link I gave for more background. What this means is not that there is any ‘actual issue’ but rather that there is something funny with our attempts at formalization! – Squeamish Ossifrage Nov 07 '19 at 21:21
  • 1
    @SqueamishOssifrage Yes, but this funny behavior in our formalism seems noteworthy. Not in the sense that "All ROM schemes are worthless", but in the sense that more care should be given to instantiations of them than of other primitives (which do not have these "funny formalization" issues). – Mark Schultz-Wu Nov 07 '19 at 21:27
  • 1
    I agree that it is noteworthy, which is why I wrote a note about it that I invite anyone in this thread to read! It's not the only issue with cryptography formalisms: collision resistance, precomputation, and cost models also pose issues of formalization, for example. What I wanted to highlight to @AleksanderRas is that ‘standard model’ carries passive-aggressive connotations from academic cryptographers whose public-key cryptosystems never see deployment in the real world which he may not have intended—since the original question here does not seem to be about anything related to the ROM. – Squeamish Ossifrage Nov 07 '19 at 21:34
  • 1
    @AleksanderRas please post answers as answers and not as comments. – Captain Man Nov 07 '19 at 22:01

2 Answers

34

I am struggling to understand what is meant by "standard cryptographic assumption".

‘Standard assumption’ broadly means an assumption that has withstood the scrutiny of many smart cryptanalysts for a long time. Examples:

  • We think that, for uniform random 1024-bit primes $p$ and $q$, solving $y = x^3 \bmod pq$ for uniform random $x$ is hard given $pq$ and $y$.

    Why? The best way anyone has been able to figure out how to do it is by factoring $n$ to recover $p$ and $q$, and the best known methods—ECM and NFS—cost more than anyone has the budget for.

  • We think that solving $y = \operatorname{AES}_k(x)$ for uniform random $k$ is hard given $x$ and $y$.

    Why? The best way anyone has been able to figure out how to do it is essentially by a generic search, and the best known methods—parallel rainbow tables, parallel distinguished points—cost more than anyone has the budget for.
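The first example can be made concrete: once you can factor, computing cube roots modulo $pq$ is easy. A toy sketch with tiny primes of my own choosing (real instances use ~1024-bit primes):

```python
# Toy demo: cube roots mod pq are easy given the factorisation.
from math import gcd

p, q = 1019, 2003                     # secret primes (toy-sized)
n = p * q
x = 123456
y = pow(x, 3, n)                      # the public value y = x^3 mod n

# With p and q in hand, invert e = 3 modulo lcm(p-1, q-1):
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)
d = pow(3, -1, lam)                   # requires gcd(3, lam) = 1
assert pow(y, d, n) == x              # the cube root is recovered
```

Without the factorisation, no method substantially better than factoring $n$ first is known.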

Half-examples:

  • Pairing-based cryptography. Pairings are highly popular in academic cryptography at conferences but see very little use in the real world, outside exotic cryptocurrency applications. The security story for pairing-friendly elliptic curves is nowhere near as stable as the security story for elliptic curves for more traditional DLOG applications like Diffie–Hellman key agreement and Schnorr signatures.
  • SVP, LWE, and other lattice problems. Lattices are a promising area for post-quantum cryptography, but there is a large design space with a complicated security story that has not stabilized yet—no matter how many completely incomprehensible arguments Chris Peikert and Dan Bernstein have about average-case/worst-case reductions on Twitter.

Non-examples:

  • Conjugacy search in a braid group. Braid cryptography is not widely studied by cryptographers and most existing proposals turn out to be broken.
  • SIMON and SPECK. These block ciphers are popular among spooks trying to muscle their way into international standardization, but it turns out standards bodies are no longer happy to accept the word of spooks without technical justification these days, and also block sizes smaller than 128 bits are foolish.

The Wikipedia article on the Goldwasser–Micali system (GM) reads "GM has the distinction of being the first probabilistic public-key encryption scheme which is provably secure under standard cryptographic assumptions."

This is an example of academic obfuscation, and evidence that Wikipedia is a terrible resource for learning about cryptography. What it means is:

  • We proved a theorem,1 and the theorem involves pretty number theory2 and asymptotic growth curves!3

    1 This is what provable security means. It does not mean that the theorem has any real-world consequences whatsoever.
    2 This is what ‘standard assumptions’ means here: ugly problems like inverting DES are not allowed, but pretty problems like factoring a product of uniform random primes are allowed.
    3 Not formally stated: We are only interested in asymptotic growth curves of attack costs for families of problems, so DES is doubly disqualified both because it's ugly and because there's only one DES, whereas you can consider the problem of factoring as a function of the size of the factors. (‘Provable security’ treatment with concrete parameter choices, sometimes called ‘exact security’, didn't come until a decade later with the help of Bellare and Rogaway.)

To be fair, the GM paper was helpful for setting down some formalizations like semantic security for public-key encryption, which is essentially equivalent to the modern standard of ciphertext indistinguishability which we use today. But the GM encryption scheme? Completely useless, even if there is a theorem.
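For reference, the GM scheme itself fits in a few lines. A toy sketch (parameters are mine and far too small for any security; illustrative only, not a usable implementation):

```python
# Goldwasser–Micali bit encryption, toy sketch.
import random
from math import gcd

p, q = 499, 547                  # secret primes (toy-sized)
n = p * q                        # public modulus

def legendre(a, pr):             # Euler's criterion: 1 if QR mod pr, pr-1 if not
    return pow(a, (pr - 1) // 2, pr)

# Public x: a non-residue mod both primes, so its Jacobi symbol (x|n) = +1.
x = next(a for a in range(2, n)
         if legendre(a, p) == p - 1 and legendre(a, q) == q - 1)

def encrypt(bit):                # uses only the public key (n, x)
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (r * r * pow(x, bit, n)) % n

def decrypt(c):                  # uses the secret factor p
    return 0 if legendre(c, p) == 1 else 1

assert all(decrypt(encrypt(b)) == b for b in (0, 1, 1, 0))
```

Note the cost: one multi-hundred-bit ciphertext per plaintext *bit*, which is part of why the scheme sees no practical use.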

This kind of obfuscation—and fetishization of number-theoretic prettiness—leads people down regrettable paths like adopting Dual_EC_DRBG, which admits proofs of ‘number-theoretic-based security’, as major national cryptography standards simply by virtue of being number-theoretic, propelled by prestidigitators like Dan Brown whose principal area of expertise is obfuscating gaping back doors among reams of analytic prose that nobody can get through.

Is RSAP also commonly supposed to be hard? Is thus RSA provably secure?

The RSA problem—computing $e^{\mathit{th}}$ roots modulo a product of large secret primes—is commonly supposed to be hard. There are public-key encryption schemes and public-key signature schemes built out of RSA that have theorems—e.g., RSAES-OAEP, RSA-KEM, RSASSA-PSS, RSA-FDH—which are even somewhat useful in relating the difficulty of breaking the security of the cryptosystem (distinguishing ciphertexts, forging signatures) to the difficulty of computing $e^{\mathit{th}}$ roots modulo a product of large secret primes.

But I recommend that you stay away from the dishonest term of art ‘provable security’ because it obfuscates what you're actually saying—especially in the form ‘X is provably secure’.

Whether a scheme has ‘provable security’ or not is irrelevant to users or engineers making decisions about building systems—the conclusions of cryptanalysts about what cryptography will withstand attack are what's relevant to them. ‘Provable security’ is really about guiding cryptanalysts to focus their effort. There's no point in analyzing AES-CTR separately from AES, because any way to break AES-CTR either (a) depends on misuse of CTR, (b) exploits AES's nature as a permutation rather than an arbitrary function, or (c) implies an attack on AES. That's because there is a useful theorem relating any attack on AES-CTR to an attack on AES itself.
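The reason the reduction works is visible in the construction: CTR mode is defined purely in terms of the underlying block function, so any non-misuse attack must go through that function. A toy sketch (my own construction, with SHA-256 standing in for AES):

```python
# CTR mode built generically on a block function (toy stand-in for AES).
import hashlib

def toy_block(key: bytes, block: bytes) -> bytes:
    # A 16-byte "block cipher" output derived from SHA-256; stand-in only.
    return hashlib.sha256(key + block).digest()[:16]

def ctr_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Encrypt/decrypt by XORing with the counter-mode keystream.
    out = bytearray()
    for i in range(0, len(data), 16):
        keystream = toy_block(key, nonce + i.to_bytes(8, 'big'))
        chunk = data[i:i + 16]
        out += bytes(a ^ b for a, b in zip(chunk, keystream))
    return bytes(out)

key, nonce = b'k' * 16, b'n' * 8
msg = b'attack at dawn!!'
ct = ctr_xor(key, nonce, msg)
assert ctr_xor(key, nonce, ct) == msg   # CTR is its own inverse
```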

Similarly, there's not much point trying to forge RSA-FDH signatures without trying to compute $e^{\mathit{th}}$ roots, because—thanks to a useful theorem—we know that any work done on one problem translates immediately to the other, as long as the hash function doesn't interact in any interesting way with RSA (which would be quite astonishing). So cryptanalysts can ignore the details of RSA-FDH and focus on computing $e^{\mathit{th}}$ roots to give us confidence in the security of RSA-FDH.
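The RSA-FDH shape is similarly small. A toy sketch (tiny modulus and a hash truncated by reduction mod $n$, both my own simplifications; a real FDH hashes onto the full range of a 2048+-bit modulus):

```python
# Toy RSA-FDH: sign by taking an e-th root of the hash of the message mod n.
import hashlib

p, q = 1019, 2003                  # secret primes (toy-sized)
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+)

def H(msg: bytes) -> int:          # toy "full-domain" hash into Z_n
    return int.from_bytes(hashlib.sha256(msg).digest(), 'big') % n

def sign(msg: bytes) -> int:
    return pow(H(msg), d, n)       # s = H(m)^d mod n

def verify(msg: bytes, sig: int) -> bool:
    return pow(sig, e, n) == H(msg)

msg = b'hello'
assert verify(msg, sign(msg))
```

Forging a signature on a new message means finding an $e^{\mathit{th}}$ root of its hash, which is exactly the RSA problem when the hash behaves like a random element of $\mathbb{Z}_n$.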

Squeamish Ossifrage
  • 2
    Why are you speaking about "asymptotic security"? Even though it's very common to use a security parameter in provable security, it's not the key point, and you can prove security without using any "asymptotic arguments"... – Ievgeni Nov 07 '19 at 15:04
  • 1
    Actually I think proofs can give engineers useful guidance, e.g. the GCM proof told us that we probably shouldn't encrypt more than a couple of GB under a given key/iv pair. – SEJPM Nov 07 '19 at 15:11
  • 2
    @Ievgeni I'm speaking about it because it's what the GM paper is all about. Yes, there's concrete provable security literature too, but it didn't come along until later. – Squeamish Ossifrage Nov 07 '19 at 15:14
  • 4
    @SEJPM I'm not saying the theorems are useless. I'm saying the term ‘provable security’ is not helpful—and especially using it as an advertisement for a completely useless asymptotic family of cryptosystems like GM. The security contract for a concrete cryptosystem is what's really important, and that includes safe usage limits. Whether the safe usage limits come from assessments of ECM/NFS costs or from oracle reduction theorems is immaterial to the engineers! – Squeamish Ossifrage Nov 07 '19 at 15:16
  • @Ievgeni All I meant is that the statement in the Wikipedia article (and by reference, the security discussion in the GM paper) is about asymptotics, not about concrete security. I added a note about the subsequent development of concrete security theorems a decade later in the literature. Better? – Squeamish Ossifrage Nov 07 '19 at 15:19
  • 3
    @SqueamishOssifrage Ok... Your sentence "‘Provable security’ is not relevant to users or engineers making decisions about building systems." is controversial, and if you consider "NIST" in your category " engineers making decisions about building systems", it's very controversial... – Ievgeni Nov 07 '19 at 15:21
  • 2
    My statement is not controversial at all. If you think it is controversial, that's because you think ‘provable security’ means something it doesn't. Which is exactly the problem with the term! It really only means there is a theorem; it does not mean that the theorem has any useful consequences whatsoever. – Squeamish Ossifrage Nov 07 '19 at 15:22
  • 1
    I understand your point. You are saying that the theorem "alone" is not relevant? But if the theorem is combined with two non-mathematical facts—the model of security is relevant, and the hardness assumption in the theorem really is hard for concrete parameters—it becomes useful even for an engineer... – Ievgeni Nov 07 '19 at 15:25
  • 3
    @Ievgeni Not just that, but also ‘provable security’ covers useless theorems too. I can ‘prove security’—heck, information-theoretic ‘security’—for a 1-bit universal hashing authenticator, but a forgery probability bound of 1/2 is not useful as ‘security’ in any meaningful sense outside the academic notion of whether there is a theorem or not. – Squeamish Ossifrage Nov 07 '19 at 15:28
  • 1
    @Ievgeni See https://crypto.stackexchange.com/a/70522 for more examples of why merely saying ‘provable security’ is misleading. – Squeamish Ossifrage Nov 07 '19 at 15:31
  • 1
    I understand better now what you mean... Thank you for clarification! – Ievgeni Nov 08 '19 at 09:41
7

There is no formal definition of standard assumption, but we usually say that an assumption is standard if it has already been used in several cryptographic schemes and if it is well-accepted in the crypto community.

It usually also implies that several researchers tried to solve the problem and were not able to find efficient ways of doing so, therefore, if we set the parameters correctly, there is no known fast way of solving the problem.

For instance, assuming that the RSA problem is hard is a standard assumption.

But GM is provably secure because it has a proof of security, not because the problem it is based on is a standard problem. Being provably secure (in this case) just means that, assuming the underlying problem is hard to solve, one can prove that the scheme satisfies some security notion (e.g., CPA security). Even if the underlying problem is not standard, the scheme is still provably secure, just under a non-standard assumption.