
According to this link:

We need notation for a lower bound. A capital omega, $\Omega$, is used in this case.

We say that $f(n) = \Omega(g(n))$ when there exists a constant $c > 0$ such that $f(n) \geq c \cdot g(n)$ for all sufficiently large $n$. Examples:

  1. $n = \Omega(1)$
  2. $n^2 = \Omega(n)$
  3. $n^2 = \Omega(n \cdot \log n)$
  4. $2n + 1 = \Omega(n)$
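The definition can be sanity-checked numerically. A minimal sketch in Python (the witness constants $c$ and cutoffs $n_0$ for each example are my own choices, not from the question):

```python
import math

def holds(f, g, c, n0, n_max=10**4):
    """Check f(n) >= c*g(n) for every n in [n0, n_max).

    A finite spot-check of the Omega condition, not a proof."""
    return all(f(n) >= c * g(n) for n in range(n0, n_max))

# The examples above, each with a candidate witness c and cutoff n0:
assert holds(lambda n: n,     lambda n: 1,                c=1, n0=1)  # n = Omega(1)
assert holds(lambda n: n * n, lambda n: n,                c=1, n0=1)  # n^2 = Omega(n)
assert holds(lambda n: n * n, lambda n: n * math.log(n),  c=1, n0=2)  # n^2 = Omega(n log n)
```

A failing check only rules out that particular $c$ and $n_0$, of course; passing up to `n_max` is evidence, not a proof.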

I need to verify whether $T(n^2) = \Omega(n)$, because I can't see how a quadratic running time can have a linear lower bound.

I am confused, though, so I might be wrong.

hsnsd

1 Answer


You haven't said what the function $T$ is, so let me guess that what you need to verify is that $n^2 = \Omega(n)$. Note that this statement has nothing to do with the running time of any algorithm. It is purely a statement about two functions, $n^2$ and $n$.

When trying to prove something, it's always a good idea to use the definition. In order to show that $n^2 = \Omega(n)$, we need to find $c > 0$ so that $n^2 \geq c \cdot n$ holds for all large enough $n$. I'm sure you can think of such a $c$ that works for all $n \geq 1$.
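One candidate witness can be spot-checked numerically before writing the proof down. A throwaway check in Python (the choice $c = 1$, $n_0 = 1$ here is mine):

```python
# Finite spot-check of n^2 >= c*n with the candidate witness c = 1, n0 = 1.
# A sanity check only: the actual proof is the one-line algebraic step
#   n >= 1  =>  n^2 = n*n >= 1*n.
c, n0 = 1, 1
assert all(n**2 >= c * n for n in range(n0, 10**5))
```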

If you have an algorithm running in time $\Theta(n^2)$, then its running time is also $\Omega(n)$. This just means that the running time is linear or worse. In particular, a quadratic running time always has a linear time lower bound, just as $2 \geq 1$ holds although $2 \neq 1$.
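To see this concretely, here is a toy routine (`count_pairs` is my own illustration, not from the question) whose step count is exactly $n^2$; that count trivially dominates a linear lower bound:

```python
def count_pairs(xs):
    """A deliberately quadratic routine: two nested loops over xs,
    so the step count is exactly len(xs)**2."""
    steps = 0
    for _ in xs:
        for _ in xs:
            steps += 1  # Theta(n^2) total work
    return steps

for n in (1, 10, 100):
    steps = count_pairs(range(n))
    assert steps == n * n   # quadratic running time ...
    assert steps >= 1 * n   # ... which is also Omega(n), with witness c = 1
```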

For more information, check our reference questions on asymptotic notation and on running times and asymptotic notation.

Yuval Filmus