How much time would be needed to compute a minimum k-cut, which has time complexity $O(n^k)$ on a graph with $n=5000$ and $k=30$? Consider using a basic 2GHz and 8GB RAM laptop.
Please explain also the process of how to compute the answer.
An algorithm has time complexity $O(n^k)$ if there exists a constant $C>0$ such that, for all sufficiently large $n$, the number of operations the algorithm performs in the RAM model on any input of length $n$ is at most $Cn^k$.
There are several reasons why this doesn't allow you to determine how much time the algorithm will take on a given machine for a given value of $n$:

- The constant $C$ is not specified by the definition, and it can be arbitrarily large.
- $O(n^k)$ is only an upper bound; the actual number of operations may be far smaller (the bound might not be tight, or the worst case might be rare).
- The RAM model is an abstraction: it says nothing about how many CPU cycles an abstract "operation" costs on your hardware, nor about caches, memory bandwidth, and so on.

Therefore the only way to really know how much time a particular algorithm takes on a particular machine is to code it and measure its running time empirically.
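For example, here is a small Python sketch of how one might time an implementation. The `sorted` call is just a stand-in workload; you would pass your own minimum $k$-cut routine and input graph instead:

```python
import time

def measure(fn, *args, repeats=3):
    """Run fn(*args) a few times and report the best wall-clock time."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - start)
    return best

# Stand-in workload; replace with your own min-k-cut implementation
# and input graph to get an actual measurement.
print(f"{measure(sorted, list(range(10**6))):.3f} s")
```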
What good is asymptotic analysis, then? The abstract machine is a reasonable model of real computers, so the running time will likely scale like $n^k$ (assuming $O(n^k)$ is a tight upper bound). If $k$ is large, the algorithm is probably impractical. So asymptotic analysis lets you compare different algorithms on paper rather than in practice. Of course, sometimes this comparison leads to the wrong conclusion; the asymptotically fastest matrix multiplication algorithms, for instance, are not the ones used in practice.
Let's make the unrealistic assumption that each step of your algorithm takes just one CPU cycle on your machine, i.e. $0.5$ ns at 2 GHz. In the worst case the running time would be $5 \cdot 10^{-10} \cdot (5 \cdot 10^3)^{30} \approx 4.7\cdot 10^{101}$ seconds, which is about $1.5\cdot 10^{85}$ billion years, or roughly $10^{84}$ times the age of the universe. So I wouldn't suggest actually measuring it. The only hope is that the algorithm is asymptotically more efficient in the average case, but I don't know enough about this particular algorithm to help you further.
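If you want to reproduce the estimate, here is the same back-of-envelope calculation in Python, under the same assumptions (one operation per cycle at 2 GHz, hidden constant equal to 1):

```python
# Back-of-envelope estimate under the (unrealistic) assumptions above:
# one operation per CPU cycle on a 2 GHz machine, hidden constant C = 1.
n, k = 5_000, 30
cycle_time = 1 / 2e9                 # seconds per cycle at 2 GHz = 0.5 ns

seconds = n**k * cycle_time          # worst-case operation count times cycle time
years = seconds / 3.15e7             # roughly 3.15e7 seconds in a year
universe_age = 1.38e10               # age of the universe in years, approximately

print(f"{seconds:.1e} seconds")                       # ~4.7e+101
print(f"{years:.1e} years")                           # ~1.5e+94
print(f"{years / universe_age:.1e} universe ages")    # ~1e+84
```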
You can't really say how long it will take. Look at Curious_Dim's answer, which is not at all unrealistic. However, here is a different algorithm with the same asymptotic behaviour but a very different result:
"Given n items, first apply a O(n) algorithm which keeps at most n/10000 items, and then calculate a minimum k-cut for these items". Same asymptotic behaviour, but with a time constant that is smaller by a factor $10^{120}$. So for n = 5,000 the runtime is practically zero..