
Status of this question: The notion I'm after is vague in my mind. I hope someone can clarify whether there is an exact version of the question I'm asking.

It seems to me that one of the most important reasons why a mathematician would be interested in trying to find "necessary and sufficient conditions" for some notion $A$ (please tell me if you disagree) is to obtain computable or "easily recognizable" conditions for checking $A$. In this case, there are two notions:

  • A notion $A$ that we are interested in, but which is not "directly" easily checkable (e.g. to "directly" check whether $x$ is a local extremum we would need to check uncountably many points)

  • A notion $B$ that is "directly" easily checkable (e.g. checking the first derivative of $f$ at $x$ is easy, and allows us to indirectly check $A$; see the sketch after this list).
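To make the contrast concrete, here is a minimal sketch (my own illustration with a made-up example function, not part of the original question) of how the indirect check $B$ replaces the "uncountable" direct check of $A$ with finitely many evaluations:

```python
# Hypothetical example (mine): check notion A ("x0 is a strict local minimum
# of f") via notion B ("f'(x0) = 0 and f''(x0) > 0"), which needs only two
# symbolic evaluations instead of inspecting a whole neighbourhood of x0.
import sympy as sp

x = sp.symbols('x')
f = x**4 - 3*x**2 + 1            # an arbitrary twice-differentiable example
x0 = sp.sqrt(sp.Rational(3, 2))  # a candidate point

first = sp.simplify(sp.diff(f, x).subs(x, x0))      # f'(x0)
second = sp.simplify(sp.diff(f, x, 2).subs(x, x0))  # f''(x0)

# B: a sufficient condition for A when f is twice differentiable.
print(first == 0 and second > 0)  # True
```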

But the notion of "local extremum" does not seem to admit generally applicable, satisfying necessary and sufficient conditions (there are such conditions for differentiable functions, but not for arbitrary functions).

So it seems that it is not always possible to find necessary and sufficient conditions for some notion.

My questions are:

  • Is there some kind of metamathematical analysis of when it is possible to formulate necessary and sufficient conditions for some mathematical notion? (My understanding of my own question is too vague to say definitively, but I presume this has to do with computability, and maybe Kolmogorov complexity?)

  • The main motivation behind the previous question: When you're doing research and would like to have necessary and sufficient conditions for some notion $X$, is it possible to make good educated guesses about whether such conditions are even theoretically possible?

user56834

1 Answer


As an interesting bit of trivia, there are proofs of arbitrarily large minimum complexity, hence there are arbitrarily large minimal necessary and sufficient conditions. You're effectively asking the meta question "Are there necessary and sufficient conditions for determining whether there are necessary and sufficient conditions for a given problem?", and no such thing could exist without solving the Halting Problem (at least, I'm pretty sure of that).

As to your actual main motivation, it is absolutely possible to make good educated guesses about whether such conditions are even theoretically possible. To work with an analogy that hopefully fits into your math/programming background somewhere, take the chromatic number of a graph as an example. Computing it exactly is NP-hard (deciding whether it is at most $k$ is NP-complete), but we have many excellent heuristics both for approximating it and for bounding it.
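To illustrate the kind of cheap bounds the answer has in mind, here is a minimal sketch (the function names and example graph are my own, not from the answer): a greedy coloring gives an upper bound on the chromatic number, a greedily grown clique gives a lower bound, and both run in polynomial time even though the exact value is hard to compute.

```python
# Hypothetical illustration (mine): cheap bounds on the chromatic number.
def greedy_coloring_bound(adj):
    """Upper bound: color vertices greedily with the smallest free color."""
    color = {}
    for v in adj:
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return max(color.values()) + 1

def clique_lower_bound(adj):
    """Lower bound: size of a greedily grown clique (a clique of size k forces at least k colors)."""
    best = 0
    for start in adj:
        clique = {start}
        for v in adj:
            if v not in clique and all(v in adj[u] for u in clique):
                clique.add(v)
        best = max(best, len(clique))
    return best

# 5-cycle: its chromatic number is 3, and the bounds bracket it.
c5 = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 0}}
print(clique_lower_bound(c5), greedy_coloring_bound(c5))  # 2 3
```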

Many non-trivial facts have a similar property. It takes an obscene amount of work to prove them from a cold start, but with enough background they are so trivial you no longer need to even go through the mechanics of a proof. The more you know, the more approaches and facts you'll be able to trivially rule out (with the approximations, bounds, and heuristics you develop while learning about mathematics) without ever "wasting" time actually attempting them.

Tangential to that, in addition to heuristics which rule out research areas, we also have heuristics which allow one to home in on other research areas. Over time, I've found that I tend to have success improving seemingly discrete algorithms by finding a way to work a derivative into the picture. I have a rough idea of what a successful proof in that style looks like, and given a new problem/question I can gauge whether or not I think incorporating derivatives is worth my time.

As to the metamathematical analysis you are looking for, I'm really not certain if any necessary conditions or any sufficient conditions are readily available for such use. The general vein of study would be proof complexity, and I think you're on the right track, but I don't know enough to answer or help any further.

  • "no such thing could exist without solving the Halting Problem (at least, I'm pretty sure of that)." Could you explain why you believe this to be true? – user56834 Oct 04 '18 at 14:33
  • Take $P(y)$ to be a proposition with $y\in Y$ and an NSC to be any $f:Y\to \{0,1\}$ with the properties that $f$ is "easily computable", that $f(y)=1\iff P(y)=1$, and that proving $f=P$ is "easily computable". I believe this agrees with your notion of NSC; correct me if I'm wrong (note that all terminating parametrized propositions can be written thusly for some $Y$). Knowing such an $f$ exists implies that an algorithm for computing $P$ exists which halts. The converse isn't necessarily true and depends heavily on what we mean by "easily computable". If we take $P(y)$ to be "does $y$ halt?", – Hans Musgrave Oct 05 '18 at 03:44
  • then the equivalence is established. That's a rough justification anyway. My original statement may very well be false, but when I look at your question I see undertones of the Halting Problem running through it. – Hans Musgrave Oct 05 '18 at 03:46