In this answer I've explained a fairly easy method to compute the minimal polynomial of a square matrix$~A$ without having to compute the successive powers of$~A$ and test them for linear dependence. Instead, choose any nonzero vector $v$ and compute the successive images $A^0v=v$, $Av$, $A^2v,\ldots$ until the sequence of vectors becomes linearly dependent. If the first dependency is $c_0v+c_1Av+\cdots+c_dA^dv=0$, where one can arrange $c_d=1$ (by scaling the relation if needed), then the polynomial $C=c_0+c_1X+\cdots+c_{d-1}X^{d-1}+X^d$ will certainly divide the minimal polynomial $\mu_A$. The (yet unknown) remaining factor $\mu_A/C$ can now be found as the minimal polynomial of the restriction of (left multiplication by)$~A$ to the subspace $\operatorname{Im}C[A]$. (This sounds scarier than it is: that subspace is often quite low-dimensional, often zero; in the latter case the remaining factor is just $1$.)
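This first step (iterate images of $v$ until they become dependent, then read off the monic relation) can be sketched in Python with NumPy. This is my own minimal sketch, not code from the answer: the function name `vector_min_poly` is hypothetical, and it uses a floating-point rank test with a tolerance, whereas exact work would be better done with `sympy` or `fractions.Fraction`.

```python
import numpy as np

def vector_min_poly(A, v, tol=1e-9):
    """Coefficients [c_0, ..., c_{d-1}, 1] of the least-degree monic
    polynomial C with C[A]v = 0, found by iterating v, Av, A^2 v, ...
    (a sketch; exact arithmetic would avoid the rank tolerance)."""
    images = [np.asarray(v, dtype=float)]   # A^0 v = v
    while True:
        nxt = A @ images[-1]                # next image A^(k+1) v
        # If appending nxt does not raise the rank, the sequence just
        # became dependent and nxt is a combination of earlier images.
        if np.linalg.matrix_rank(np.column_stack(images + [nxt]),
                                 tol=tol) == len(images):
            M = np.column_stack(images)
            # Solve M c = -nxt for the lower coefficients c_0, ..., c_{d-1}.
            c, *_ = np.linalg.lstsq(M, -nxt, rcond=None)
            return np.append(c, 1.0)        # make the relation monic (c_d = 1)
        images.append(nxt)
```

For example, the rotation matrix $\begin{pmatrix}0&1\\-1&0\end{pmatrix}$ with $v=(1,0)$ yields coefficients $[1,0,1]$, i.e. $C=1+X^2$.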
In this example, the simplest choice $v=(1,0,0,0)$ will in fact give you the minimal polynomial right away. The successive images are $Av=(0,-1,-1,0)$, $A^2v=A(Av)=(-2,0,0,2)$, $A^3v=(0,4,4,0)$, which is where the sequence becomes linearly dependent, with the obvious relation $4Av+A^3v=0$. This gives the polynomial $C=4X+X^3$, which satisfies $C[A]v=0$ by construction; but it also satisfies $C[A](A^kv)=0$ for $k=1,2,3$ (because the factor $A^k$ commutes with $C[A]=4A+A^3$). So to find the image subspace of $C[A]$ we can apply it to a basis of a subspace complementary to the span $W$ of $[v,Av,A^2v,A^3v]$; here $W$ is $3$-dimensional, any complement of $W$ therefore $1$-dimensional, and any vector not in $W$ will generate such a complement. Choosing such a vector $v'=(0,1,0,0)$ one finds $Av'=(1,0,0,-1)$ and $A^3v'=(-4,0,0,4)$, so that $C[A]v'=4Av'+A^3v'=0$ and the image of $C[A]$ turns out to be $0$-dimensional. The remaining factor needed to get from $C$ to $\mu_A$ is therefore $1$, and $\mu_A=C=4X+X^3$.
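The matrix $A$ itself is not reproduced in this answer, but the images listed above pin its columns down; assuming that reconstruction is right (it is an inference from the stated images, not a quoted matrix), the whole computation can be checked numerically:

```python
import numpy as np

# Hypothetical reconstruction of A: its columns are forced by the images
# A e1 = (0,-1,-1,0), A e2 = (1,0,0,-1), A(e2+e3) = (2,0,0,-2),
# and A^3 e1 = (0,4,4,0).
A = np.array([[ 0,  1,  1,  0],
              [-1,  0,  0,  1],
              [-1,  0,  0,  1],
              [ 0, -1, -1,  0]])

v = np.array([1, 0, 0, 0])
print(A @ v)                              # [ 0 -1 -1  0]
print(np.linalg.matrix_power(A, 2) @ v)   # [-2  0  0  2]
print(np.linalg.matrix_power(A, 3) @ v)   # [ 0  4  4  0], so 4Av + A^3v = 0

# C[A] = 4A + A^3 vanishes identically: Im C[A] = 0, hence mu_A = 4X + X^3.
print(np.all(4 * A + np.linalg.matrix_power(A, 3) == 0))   # True
```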
It would have been a bit more fun to start with the more "clever" choice $v=(0,1,1,0)$ instead (since the matrix seems to have some relation to vectors with equal second and third components); then $v$, $Av=(2,0,0,-2)$ and $A^2v=(0,-4,-4,0)$ are already linearly dependent, giving the polynomial $D=4+X^2$; this divides $\mu_A$ but is not yet $\mu_A$ itself. Now the span of $[v,Av,A^2v]$ is $2$-dimensional, and a complement is for instance spanned by $[(1,0,0,0),(0,1,0,0)]$. Computing $D[A]v'=4v'+A^2v'$ where $v'$ runs through those two vectors gives
$(2,0,0,2)$ and $(0,2,-2,0)$ respectively, and the remaining factor of $\mu_A$ will be the minimal polynomial of $A$ restricted to the $2$-dimensional subspace spanned by $[(1,0,0,1),(0,1,-1,0)]$. But the restriction of $A$ to that subspace is zero, and the minimal polynomial of the restriction is therefore $X$ (the minimal polynomial of the zero operator on any space of positive dimension is $X$), leading to $\mu_A=(4+X^2)X$, indeed the same answer as before.
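This second route can be checked the same way, again assuming the reconstruction of $A$ from the listed images (an inference, since the matrix is not quoted here): the columns of $D[A]=4I+A^2$ span its image, and $A$ kills that image.

```python
import numpy as np

# Hypothetical reconstruction of A, with columns forced by the images
# of the basis vectors computed in the answer.
A = np.array([[ 0,  1,  1,  0],
              [-1,  0,  0,  1],
              [-1,  0,  0,  1],
              [ 0, -1, -1,  0]])

# D[A] = 4I + A^2; its columns span Im D[A].
D_of_A = 4 * np.eye(4, dtype=int) + np.linalg.matrix_power(A, 2)
print(D_of_A)

# Im D[A] is spanned by w1 = (1,0,0,1) and w2 = (0,1,-1,0); A kills both,
# so the restriction of A to Im D[A] is the zero operator.
w1, w2 = np.array([1, 0, 0, 1]), np.array([0, 1, -1, 0])
print(A @ w1, A @ w2)            # both zero vectors

# Hence X * D = 4X + X^3 annihilates A:
print(np.all(A @ D_of_A == 0))   # True
```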