In popular literature, complexities are usually used in a very imprecise manner, often to describe the runtime performance of an algorithm and denoted with "$O$". My question is about these Landau symbols.
Take Quicksort, for example, which is often cited as running in $O(n^2)$ time in the worst case but $O(n \log n)$ in the best case. To my understanding, though, it is also $O(n^2)$ in the best case, since $n \log n$ is asymptotically bounded from above by $n^2$.
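To make that bound explicit, since $\log n \le n$ for all $n \ge 1$, one can take the constant $c = 1$ in the definition of $O$:

```latex
n \log n \;\le\; n \cdot n \;=\; n^2
\quad \text{for all } n \ge 1,
\qquad \text{hence} \quad n \log n \in O(n^2).
```

So any function in $O(n \log n)$ is automatically also in $O(n^2)$; the upper bound is simply not tight.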
My question now is: is there a notion of a "least upper bound" complexity when using Landau symbols? When I call an algorithm $O(n^2)$, does this mean that $n^2$ is the slowest-growing function that still bounds the runtime from above, or is this just a moral imperative?
(Similar issues of course apply to the other Landau symbols, such as $\Omega$ for lower bounds.)