From the definition of $\Theta$-notation, $$f(n)=\Theta(g(n))\implies \exists n_0,\ \exists c_1,c_2\gt 0,\ \forall n\gt n_0:\ c_1\cdot g(n)\le f(n)\le c_2\cdot g(n).$$
The inequality holds for all $n\gt n_0$, so $c_2\cdot g(n)$ is (up to the constant $c_2$) an upper bound on $f(n)$. Therefore $f(n)=O(g(n))$ as well.
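Spelling that step out: keeping only the right-hand half of the inequality gives exactly the defining condition for $O$, $$\exists n_0,\ \exists c_2\gt 0,\ \forall n\gt n_0:\ f(n)\le c_2\cdot g(n),$$ i.e. $f(n)=O(g(n))$.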
Take quicksort as an example: sources state its $\Theta$-complexity as $n\log{n}$, but give its $O$-complexity as $n^2$.
If the stated $O$-complexity is correct, then the bound $f(n)\le c_2\cdot n\log{n}$ cannot always hold. So in this case, how does $\Theta$-complexity differ from $O$-complexity?
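For concreteness, here is a minimal sketch (not from the original question, and only one way to make $f(n)$ tangible): it takes $f(n)$ to be the number of pivot comparisons made by a naive first-element-pivot quicksort and counts them on two inputs. On random input the count grows roughly like $n\log{n}$, while on already-sorted input it grows roughly like $n^2$, which is exactly the gap between the two figures quoted above.

```python
import random
import sys

sys.setrecursionlimit(5000)  # the sorted-input case recurses ~n levels deep

def quicksort(a, counter):
    """Naive quicksort with the first element as pivot; counter[0] accumulates comparisons."""
    if len(a) <= 1:
        return a
    pivot = a[0]
    counter[0] += len(a) - 1  # model: one comparison of each remaining element against the pivot
    smaller = [x for x in a[1:] if x < pivot]
    larger = [x for x in a[1:] if x >= pivot]
    return quicksort(smaller, counter) + [pivot] + quicksort(larger, counter)

n = 2000
for label, data in [("random input", random.sample(range(n), n)),
                    ("sorted input", list(range(n)))]:
    count = [0]
    quicksort(data, count)
    print(f"{label}: n = {n}, comparisons = {count[0]}")
```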