4

Has there been any attempt at a general theory to describe how an algorithm can be "deformed" to solve the problem more efficiently?

For example, suppose we have an algorithm (say, for sorting a list of numbers) that solves the problem in $O(n^2)$ time. Can we deform this algorithm (in the "space of algorithms") into one that solves the problem in $O(n \log n)$ time?

My motivation for asking this question comes from analysis, where, if we want to solve the equation $f(x) = 0$, one technique is to first guess an $x_0$ such that $f(x_0)$ is small, and then search the neighborhood of $x_0$ for $x$.
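To make the analogy concrete, here is a minimal sketch of that "guess, then refine locally" idea, using Newton's method as the local refinement step (the choice of Newton's method is my own illustration, not something fixed by the question; $f$ and its derivative are assumed known):

```python
def newton(f, df, x0, tol=1e-12, max_iter=100):
    """Refine an initial guess x0 toward a root of f by local steps."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        # take a step within a neighborhood of the current guess
        x = x - fx / df(x)
    return x

# Example: solve x^2 - 2 = 0 starting from the rough guess x0 = 1.5.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.5)
```

The question asks whether an analogous local-improvement scheme could exist in the "space of algorithms" rather than in $\mathbb{R}$.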

Of course, I'm guessing the answer is no, but I'm sure a question similar to the above must have been posed somewhere in the literature.

Raphael
  • 72,336
  • 29
  • 179
  • 389
tom
  • 59
  • 2
  • 4
    Define what "deform" means and what makes one algorithm "better" than another, and off you go. I don't see a conceptual problem. – Raphael Oct 12 '17 at 17:49
  • Sure, it's easy to deform an algorithm, but is there a way to do that such that you're still solving the same problem? – tom Oct 12 '17 at 18:14
  • That's easy, too, but transformations that obviously don't change the computed function are probably useless. That said, you just have to make correctness part of the fitness function! – Raphael Oct 12 '17 at 18:21
  • Are you trying to do some sort of gradient descent on algorithms to improve it? For some specific class of "algorithms", we have this paper: Learning to learn without gradient descent by gradient descent – justhalf Oct 12 '17 at 19:30
  • "Are you trying to do some sort of gradient descent on algorithms to improve it?" Yes, that was my initial motivation for asking this question. – tom Oct 12 '17 at 19:32
  • Isn't this similar to the ideas behind homotopy type theory? – Per Alexandersson Oct 12 '17 at 20:41
  • There is active research into training a "differentiable computer" using gradient descent. I don't have time to make an answer right now, but here is a paper: https://arxiv.org/pdf/1410.5401.pdf – QuadmasterXLII Oct 13 '17 at 01:15

1 Answer

11

There is no general way to do this. The "space of algorithms" is not a nice one: it has no natural metric or other useful structure, unlike, e.g., the real numbers. Note that even in the case of trying to solve $f(x)=0$, where your search space is $\mathbb{R}$, most algorithms work only under assumptions on $f$, e.g. continuity (there is no algorithm which can solve/approximate $f(x)=0$ for an arbitrary $f:\mathbb{R}\rightarrow\mathbb{R}$).
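As a concrete illustration of that dependence on assumptions, here is a minimal bisection sketch (my own example, not part of the original answer): its correctness rests entirely on $f$ being continuous, via the intermediate value theorem; for a discontinuous $f$ a sign change need not bracket a root.

```python
def bisect(f, lo, hi, tol=1e-12):
    """Find a root of f on [lo, hi], assuming f is continuous
    and f(lo), f(hi) have opposite signs."""
    # the sign change guarantees a root only because f is continuous
    assert f(lo) * f(hi) < 0, "need a sign change on [lo, hi]"
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid   # root lies in the left half
        else:
            lo = mid   # root lies in the right half
    return (lo + hi) / 2

# Example: the unique real root of x^3 - x - 2 on [1, 2].
r = bisect(lambda x: x**3 - x - 2, 1, 2)
```

Drop the continuity assumption and the loop invariant ("a root lies in [lo, hi]") simply fails; this is the sense in which even the $\mathbb{R}$ case only works under structural assumptions.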

See the answers here, and also here for simple impossibility results regarding a general approach for optimizing the running time of an algorithm.

Ariel
  • 13,369
  • 1
  • 21
  • 38
  • 3
    You could make a deformation metric over expressions of algorithms (which is different from algorithms: defining a metric that's invariant if you make tweaks to how the algorithm is defined would be a lot harder). But if you try to deform an algorithm, you'd almost always get one that does something different, so looking for equivalent algorithms that way would be very difficult and inefficient. – Gilles 'SO- stop being evil' Oct 12 '17 at 17:11
  • I'm not sure I agree in this generality. For a given problem $P$, there are natural ways to define fitness functions for algorithms: number of correctly solved instances among a test set; number of machine steps taken on some test instances; etc. It also seems feasible to define small mutations of algorithms, if they are given in a formal language. In summary, we can certainly perform randomized searches in the space of all algorithms (of a certain class, maybe). Whether that's effective (let alone efficient) is highly doubtful, though. – Raphael Oct 12 '17 at 17:47
  • There are (academic) tools, for instance, that find functions and correctness proofs for data types (in a functional language), basically by enumerating all programs and proofs. – Raphael Oct 12 '17 at 17:48
  • @Raphael I have no knowledge of useful mutations (I suppose the choice of language doesn't matter much) in the sense that they preserve semantics (or at least, syntactic mutations which allow you to make semantic statements on the outcome). I know next to nothing about automated proving, but I guess that positive/constructive results apply to very specific cases (this of course does not mean they are not useful in practice). The general task however, as stated in the question, is hopeless. – Ariel Oct 12 '17 at 18:41
  • 3
    @Ariel Certainly, it's uncomputable! – Raphael Oct 12 '17 at 18:43
  • 1
    The "space of algorithms" quote made me want to ask if there's some category of complexity, but appears that is answered in the negative. – BurnsBA Oct 13 '17 at 00:34
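The randomized search Raphael describes in the comments can be sketched in miniature (this toy is entirely hypothetical and not from the thread): "programs" are tiny arithmetic expressions, correctness on test instances is made part of the fitness, and search proceeds by small random mutations. Even in this tiny language, most mutations change the computed function, which is the difficulty the answer points at.

```python
import random

OPS = ['+', '-', '*']

def evaluate(expr, x):
    """Expressions are triples like ('x', '+', 1), meaning x + 1."""
    a, op, b = expr
    a = x if a == 'x' else a
    b = x if b == 'x' else b
    return a + b if op == '+' else a - b if op == '-' else a * b

def fitness(expr, target, tests):
    # correctness is part of the fitness: count matching test instances
    return sum(evaluate(expr, t) == target(t) for t in tests)

def mutate(expr):
    """Apply one small random change to the expression."""
    a, op, b = expr
    i = random.randrange(3)
    if i == 1:
        return (a, random.choice(OPS), b)
    slot = random.choice(['x', random.randrange(-3, 4)])
    return (slot, op, b) if i == 0 else (a, op, slot)

def search(target, tests, steps=5000, seed=0):
    """Hill-climb by random mutation, never accepting a worse candidate."""
    random.seed(seed)
    best = ('x', '+', 0)
    for _ in range(steps):
        cand = mutate(best)
        if fitness(cand, target, tests) >= fitness(best, target, tests):
            best = cand
    return best

# Try to "discover" an expression computing f(x) = 2*x.
tests = range(-5, 6)
found = search(lambda x: 2 * x, tests)
```

Nothing here contradicts the answer: the search is well defined, but there is no guarantee it finds anything, and no metric under which a "nearby" expression computes a nearby function.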