You're right about the second one, but your reasoning for both is sketchy. The definition of $f(n)=O(g(n))$ is that there exist constants $c,N$ both greater than zero such that $f(n)\le c\cdot g(n)$ for all $n\ge N$.
Trying a single value of $n$ generally won't tell you anything. Consider, though, the intuition we get from trying $n=4, 8, 16, 32, \dotsc$ (logs base $2$):
$$\begin{array}{lcccc}
n: & 4 & 8 & 16 & 32 & 64 & 128\\
\log^2 n: & 4 & 9 & 16 & 25 & 36 & 49
\end{array}$$
The table suggests that $n$ grows much faster than $\log^2n$. In fact, at least for powers of $2$ we have $\log^2(n)=\log^2(2^k)=k^2\le 2^k=n$ for all $n\ge 16$, so we might suspect that $\log^2n=O(n)$, which indeed turns out to be true.
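If you like, you can reproduce the table and check the inequality numerically. This is just a sanity check of the claim above, not a proof:

```python
import math

# Tabulate n against log^2(n) for powers of 2 (logs base 2,
# as in the table above).
for k in range(2, 8):
    n = 2 ** k
    print(n, math.log2(n) ** 2)  # log^2(2^k) = k^2

# Check the claimed inequality k^2 <= 2^k, i.e. log^2(n) <= n,
# for all powers of 2 with n >= 16.
assert all(math.log2(2 ** k) ** 2 <= 2 ** k for k in range(4, 30))
```

Of course, a finite check like this only supports the intuition; the actual big-O claim still needs the induction (or calculus) argument.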
What is most helpful is to build an intuition for the big-O relationships between frequently used functions. Here are some examples (all of which should be proved, but that will be up to you to do or look up). Assuming everything in sight is positive:
- $\log n = O(n^k)$ for any $k>0$. (polynomials beat logs, eventually)
- If $a \le b$, then $n^a=O(n^b)$. (polynomials behave as expected, by degree)
- If $a>1$, then $n^k=O(a^n)$ for any $k$. (exponentials beat polynomials)
and so on. The point here is that it's far better to develop your intuition and only use the definition of big-O when faced with a problem where intuition doesn't give you any help. That's what the experts do.
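One quick way to build that intuition is to watch the ratio $f(n)/g(n)$ as $n$ grows: if $f(n)=O(g(n))$ with $f$ strictly slower, the ratio shrinks toward $0$. A small sketch (the particular exponents and sample points are just illustrative choices, and a shrinking ratio is evidence, not a proof):

```python
import math

# Rule 1: log n vs n^{1/2} -- polynomials beat logs, eventually.
for n in (10**2, 10**4, 10**6):
    print(math.log(n) / n ** 0.5)

# Rule 2: n^2 vs n^3 -- higher-degree polynomial wins.
for n in (10**2, 10**4, 10**6):
    print(n ** 2 / n ** 3)

# Rule 3: n^5 vs 2^n -- exponentials beat polynomials.
for n in (50, 100, 200):
    print(n ** 5 / 2.0 ** n)
```

Each printed sequence decreases toward $0$, matching the corresponding rule in the list.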