Assume I am interested in solving $$(\underset{k \text{ times}}{\underbrace{g\circ \cdots \circ g}})(x) = g^{\circ k}(x) = f(x)$$
That is, $g$ is, in some sense, a $k$-th root of the function $f$ under composition: applying $g$ $k$ times, starting with the number $x$, does the same thing as applying $f$ once.
I suspect that without extra constraints on the behaviour of $g$ there are bound to be a huge number of candidates for $g$. Say we consider functions $f\in \mathbf{C}^2$, i.e. twice continuously differentiable. Is there some way to quantify or classify the solutions $g$ depending on which space they live in?
What constraints can we put on $g$ to narrow down or make nicer the possible solutions?
Own work. Some (rather trivial) examples I have found are:
$$\text{for any } k: \quad f(x) = x^{2^k}, \quad g(x) = x^2$$
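A quick numerical sanity check of this family (a sketch, with an `iterate` helper I introduce here just for illustration): squaring $k$ times should agree exactly with $x^{2^k}$.

```python
def iterate(g, k, x):
    """Apply g to x, k times: computes g^(∘k)(x)."""
    for _ in range(k):
        x = g(x)
    return x

g = lambda x: x ** 2   # candidate k-th root of f(x) = x^(2^k)
k = 3

for x in [0.5, 1.5, 2.0]:
    # iterating the square k times equals raising to the power 2^k
    assert iterate(g, k, x) == x ** (2 ** k)
```

Here the comparison is exact because these particular inputs are dyadic rationals, so repeated squaring incurs no rounding.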
And of course, more generally, all polynomials of the form:
$$\text{for } k=2: \quad g(x) = \sum_{m} a_m x^m \quad\text{gives}\quad f(x) = g(g(x)) = \sum_{l} a_l \left(\sum_{m} a_m x^m\right)^l$$
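The $k=2$ polynomial case can be checked symbolically on coefficient lists. Below is a minimal sketch (the helper names `poly_mul`, `poly_compose`, `poly_eval` and the sample coefficients are my own, purely illustrative choices): compose $g$ with itself to get the coefficients of $f$, then verify $f(x) = g(g(x))$ pointwise.

```python
def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists (lowest degree first)."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def poly_compose(p, q):
    """Coefficients of p(q(x)), via Horner's rule on the coefficients of p."""
    out = [0.0]
    for c in reversed(p):
        out = poly_mul(out, q)
        out[0] += c
    return out

def poly_eval(p, x):
    """Evaluate a coefficient-list polynomial at x (Horner's rule)."""
    acc = 0.0
    for c in reversed(p):
        acc = acc * x + c
    return acc

g = [1.0, 2.0, 3.0]        # g(x) = 1 + 2x + 3x^2 (arbitrary example)
f = poly_compose(g, g)     # coefficients of f = g∘g, so g is a square root of f

for x in [0.0, 0.5, -1.3]:
    assert abs(poly_eval(f, x) - poly_eval(g, poly_eval(g, x))) < 1e-9
```

Note this direction is easy: given $g$, building $f = g^{\circ k}$ is mechanical. The interesting (and hard) direction of the question is recovering $g$ from $f$.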
Perhaps the simplest example turns function composition into an addition machine. Imagine a simple computer that only has an "increment by $b$" instruction for doing addition (take $k=2$):
$$\begin{cases}g(x)=x+b\\f(x) = g(g(x)) = g(x)+b = (x+b)+b = x+2b\end{cases}$$
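The addition machine is a one-liner to demonstrate (a sketch; `make_increment` is just an illustrative name): iterating "increment by $b$" twice is the same as adding $2b$ once.

```python
def make_increment(b):
    """Build the 'increment by b' instruction g(x) = x + b."""
    return lambda x: x + b

b = 7
g = make_increment(b)          # g(x) = x + b
f = lambda x: g(g(x))          # f = g∘g, so g is a functional square root of f

assert f(10) == 10 + 2 * b     # → 24: two increments equal one addition of 2b
```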
These are the conceptually simplest ones I could think of, but I am of course interested in how complicated $g$ can be while still satisfying the equation.