If the runtime of a recursive algorithm could be expressed as
$T(n) = \begin{cases}O(1) & n \leq c \\ k \cdot T\left(\frac{n}{k}\right) + \left(k + n \cdot k \right) & n > c\end{cases}$
what would be the result of the asymptotic analysis (the Landau class)? Here $k \in \mathbb{N},\ k \neq 0$, is a constant.
I would say that $k$, even if it is larger than $n$, is still a constant, and therefore $T(n) = k \cdot T\left(\frac{n}{k}\right) + O(n)$ for $n > c$.
So I would say that the algorithm is $O(n \cdot \log_k n)$. Is this right? How could I prove this formally?
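To sanity-check my guess numerically, I unrolled the recurrence in Python (taking $T(n) = 1$ for $n \leq c$ with $c = 1$ is my own assumption for the $O(1)$ base case):

```python
import math

def T(n, k, c=1):
    """Direct unrolling of the recurrence from the question.
    Assumption: the O(1) base case is modeled as T(n) = 1 for n <= c."""
    if n <= c:
        return 1
    return k * T(n / k, k, c) + (k + n * k)

# If T(n) = Theta(n * log_k(n)), this ratio should settle near a constant:
k = 3
for n in (k**e for e in range(5, 10)):
    print(n, T(n, k) / (n * math.log(n, k)))
```

For powers of $k$ the ratio does appear to level off at a constant, which at least is consistent with $\Theta(n \log n)$.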
I have two main problems with analyzing this recurrence:
1) What is the runtime of $f(n) = k + n \cdot k$? Can the constant $k$ be ignored, so that $f(n) = O(n)$?
2) I would like to use the Master theorem. I can see $T(n)$ in the form $a \cdot T\left(\frac{n}{b}\right) + f(n)$ with $a = b = k$. But then I need to compare $f(n)$ with $n^{\log_b a} = n^{\log_k k}$. For $k = 1$ this logarithm, $\log_1 1$, is undefined — so can't I use the Master theorem?
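Here is how far I get by unrolling the recurrence by hand (assuming $n = k^m$ is an exact power of $k$, and again taking $T(1) = 1$ as the base case), which seems to support my guess:

$$T(n) = \sum_{i=0}^{m-1} k^i \left(k + \frac{n}{k^i} \cdot k\right) + k^m \cdot T(1) = \frac{k^{m+1} - k}{k - 1} + k \cdot n \cdot \log_k n + n.$$

Since $k$ is a constant, the middle term dominates and this is $\Theta(n \log n)$ — but I am not sure how to turn this into a formally correct proof for arbitrary $n$.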