I was reading the Cormen, Leiserson, Rivest and Stein textbook, Introduction to Algorithms.
The book explains the three asymptotic notations very well. However, there was this paragraph:
Technically, it is an abuse to say that the running time of insertion sort is $O(n^2)$, since for a given $n$, the actual running time varies, depending on the particular input of size $n$.
and this one:
It is not contradictory, however, to say that the worst-case running time of insertion sort is $Ω(n^2)$, since there exists an input that causes the algorithm to take $Ω(n^2)$ time.
I understand why the author said that "it is an abuse to say $O(n^2)$ is the running time": there are best-case inputs that take linear time and there are also worst-case inputs that take quadratic time.
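To make that concrete, here is a small sketch I wrote myself (not from the book) that counts key comparisons in insertion sort as a rough proxy for running time, on an already-sorted input versus a reverse-sorted one:

```python
def insertion_sort_ops(a):
    """Insertion sort that counts key comparisons as a proxy for running time."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1        # one comparison of key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]     # shift the larger element one slot right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

n = 1000
_, best = insertion_sort_ops(range(n))          # already sorted: best case
_, worst = insertion_sort_ops(range(n, 0, -1))  # reverse sorted: worst case
print(best, worst)  # best grows like n, worst like n^2/2
```

For $n = 1000$, the sorted input needs $n - 1$ comparisons while the reversed input needs $n(n-1)/2$, so the gap between linear and quadratic behavior is exactly what I see.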
What I don't understand is why it is not contradictory to say that the worst-case running time of insertion sort is $Ω(n^2)$. Isn't $\Omega(g(n))$ supposed to describe the best-case running time? So shouldn't it be $Ω(n)$?
This confuses me. Can you please explain why $Ω(n^2)$ is a valid bound for insertion sort when I would expect it to be $Ω(n)$?