Consider an everywhere twice differentiable function $f:\mathbb R^n\to \mathbb R$, a closed and convex set $\mathcal S \subseteq \mathbb R^n$, and the constrained optimization problem
$$ \min_{x\in \mathcal S} \; f(x). $$
Is there an easy / intuitive way of proving both of the following statements?
Statement 1: $x = x^* \in \mathcal S$ is a local minimizer if $\nabla f(x^*) = 0$ and $\nabla^2 f(x^*) \succ 0$. In particular, the weaker condition $\nabla^2 f(x^*) \succeq 0$ is not sufficient: as a counterexample, take $f(x) = x^3$, for which $\nabla f(0) = 0$ and $\nabla^2 f(0) = 0$, yet $x = 0$ is not a local minimizer.
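For statement 1, the argument I have in mind is the standard second-order Taylor expansion (a sketch, with $\lambda_{\min}(\cdot)$ denoting the smallest eigenvalue). Using $\nabla f(x^*) = 0$,
$$ f(x^* + d) = f(x^*) + \tfrac{1}{2}\, d^\top \nabla^2 f(x^*)\, d + o(\|d\|^2) \;\ge\; f(x^*) + \tfrac{1}{2}\,\lambda_{\min}\big(\nabla^2 f(x^*)\big)\,\|d\|^2 + o(\|d\|^2), $$
so when $\lambda_{\min}(\nabla^2 f(x^*)) > 0$ the quadratic term dominates for $\|d\|$ small enough and $f(x^* + d) > f(x^*)$. Under the weaker condition $\nabla^2 f(x^*) \succeq 0$ the quadratic term can vanish along some direction, and the $o(\|d\|^2)$ remainder decides the sign, which is exactly what happens for $f(x) = x^3$ at $x = 0$.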
Statement 2: $x = x^* \in \mathcal S$ is a global minimizer over $\mathcal S$ if $\nabla f(x^*) = 0$ and $\nabla^2 f(x) \succeq 0$ for all $x\in \mathcal S$.
The second statement in particular is quite well-known in the convex optimization literature. However, I wonder whether there is a clean proof, to reassure ourselves that there are no corner cases (like the one in statement 1).
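For statement 2, here is the one-dimensional restriction argument I would hope works (a sketch, using the standard fact that a function with nonnegative second derivative on an interval is convex there). Fix any $y \in \mathcal S$ and define $g(t) = f\big(x^* + t(y - x^*)\big)$ for $t \in [0, 1]$; convexity of $\mathcal S$ keeps the segment inside $\mathcal S$, so the hypothesis on the Hessian applies and
$$ g''(t) = (y - x^*)^\top \nabla^2 f\big(x^* + t(y - x^*)\big)\,(y - x^*) \;\ge\; 0. $$
Thus $g'$ is nondecreasing on $[0,1]$ with $g'(0) = \nabla f(x^*)^\top (y - x^*) = 0$, and hence
$$ f(y) = g(1) = g(0) + \int_0^1 g'(t)\,dt \;\ge\; g(0) = f(x^*). $$
Is an argument along these lines airtight, or am I missing a subtlety?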