I understand that the complexity of a problem can be measured by its best-case, average-case, or worst-case complexity. Am I correct in thinking that, for cryptographic purposes, each of these is of interest?
Best-case: If we can find a problem whose best-case complexity is high, then every instance of that problem is hard, so such a problem would be an excellent basis for a cryptosystem.
Worst-case: If a problem has high worst-case complexity, we could look for a large set of instances on which the problem attains that worst-case hardness, and select keys at random from this set.
Average-case: According to Wikipedia, average-case complexity analysis can be used to generate hard instances of a problem.
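To make sure I have the three measures straight, here is a toy (non-cryptographic) illustration using linear search, where the best, worst, and average cases visibly differ; the function and variable names are my own for illustration:

```python
import random

def comparisons(items, target):
    """Count how many comparisons linear search makes before finding target."""
    for i, x in enumerate(items):
        if x == target:
            return i + 1
    return len(items)  # target absent: every element was compared

n = 1000
items = list(range(n))

best = comparisons(items, 0)        # best case: target is first, 1 comparison
worst = comparisons(items, n - 1)   # worst case: target is last, n comparisons

# average case over uniformly random targets: roughly n/2 comparisons
trials = 10_000
avg = sum(comparisons(items, random.randrange(n)) for _ in range(trials)) / trials

print(best, worst, round(avg))
```

The gap between `best` (1) and `worst` (`n`) is why, as I understand it, a high best-case bound is so much stronger a guarantee than a high worst-case bound.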
There do seem to be conflicting opinions about which measure of complexity is best. For example, I have read that NP-complete problems (a worst-case notion) are desirable for the reason above, but I have also read that they are undesirable because we want the problem to be hard on random inputs.