The given information tells you that the eigenvalues of $A$ are distinct (in fact they are the complex third roots of unity), and that the only eigenvalue of $B$ is $1$ (ignoring multiplicity).
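Spelled out (assuming, as the phrasing above suggests, that the minimal polynomial of $A$ is $x^3 - 1$ and that of $B$ is $(x-1)^3$): with $\omega = e^{2\pi i/3}$,
$$x^3 - 1 = (x - 1)(x - \omega)(x - \omega^2),$$
so $A$ has the three distinct eigenvalues $1, \omega, \omega^2$, while the only root of $(x-1)^3$, and hence the only eigenvalue of $B$, is $1$.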
Assume, for the sake of contradiction, that $AB = BA$; it is often easier to reason with commuting matrices.
Note that if $v$ is an eigenvector of $A$ corresponding to the eigenvalue $\lambda$, then $Bv$ is also an eigenvector of $A$ corresponding to $\lambda$. To see this, left-multiply both sides of $Av = \lambda v$ by $B$ (pulling the scalar out front) to get $BAv = \lambda Bv$; since $AB = BA$, this becomes $ABv = \lambda Bv$. And the vector $Bv$ is nonzero, because $v$ is nonzero and $0$ is not an eigenvalue of $B$.
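Written as a single chain of equalities, the computation is
$$A(Bv) = (AB)v = (BA)v = B(Av) = B(\lambda v) = \lambda(Bv).$$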
Note also that because $A$ is $3 \times 3$ and has three distinct eigenvalues, the eigenspace corresponding to each eigenvalue is one-dimensional (and together these eigenspaces span $\mathbb{C}^3$; since two of the eigenvalues are non-real, we work over $\mathbb{C}$). Pick eigenvectors $v_1, v_2, v_3$ corresponding to the distinct eigenvalues $a, b, c$ of $A$.
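To spell out the dimension count: eigenvectors corresponding to distinct eigenvalues are linearly independent, so the eigenspaces $E_\lambda = \ker(A - \lambda I)$ satisfy
$$\dim E_a + \dim E_b + \dim E_c \le 3 \quad\text{and}\quad \dim E_a,\ \dim E_b,\ \dim E_c \ge 1,$$
forcing each to be exactly $1$; in particular $v_1, v_2, v_3$ form a basis of $\mathbb{C}^3$.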
Fix any eigenvalue $\lambda$ of $A$, and let $v$ be a corresponding eigenvector. Running through the above discussion, we learn that $Bv$ is also an eigenvector of $A$ corresponding to the eigenvalue $\lambda$. Since the eigenspace corresponding to $\lambda$ is one-dimensional and $v$ is a nonzero vector in it, $v$ spans this eigenspace. Since $Bv$ lies in this eigenspace, there is a scalar $\mu$ for which $Bv = \mu v$, i.e., $v$ is an eigenvector of $B$ corresponding to the eigenvalue $\mu$. But the only eigenvalue of $B$ is $1$, so $\mu = 1$ and $Bv = v$.
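In symbols:
$$Bv \in \operatorname{span}\{v\} \ \Longrightarrow\ Bv = \mu v \ \text{for some scalar } \mu \ \Longrightarrow\ \mu \in \{\text{eigenvalues of } B\} = \{1\} \ \Longrightarrow\ Bv = v.$$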
Running through the previous paragraph with $\lambda$ taken to be each eigenvalue of $A$ in turn, and $v$ taken to be the corresponding eigenvector $v_1$, $v_2$, $v_3$, we learn that $Bv_1 = v_1$, $Bv_2 = v_2$, and $Bv_3 = v_3$. In other words, $B$ acts as the identity on a basis of $\mathbb{C}^3$, and hence $B = I$.
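Explicitly, writing an arbitrary $w \in \mathbb{C}^3$ in this basis as $w = c_1 v_1 + c_2 v_2 + c_3 v_3$,
$$Bw = c_1 Bv_1 + c_2 Bv_2 + c_3 Bv_3 = c_1 v_1 + c_2 v_2 + c_3 v_3 = w,$$
so $B$ fixes every vector.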
Note that I never used the minimal polynomials here, just the eigenvalues (and we never used what the eigenvalues of $A$ are, other than that they are distinct). You might be able to convince yourself that if $A$ is any $3 \times 3$ matrix with distinct eigenvalues, $B$ has $1$ as its only eigenvalue, and $AB = BA$, then the same argument just given shows that $B = I$.
But the minimal polynomial of $I$ is $x-1$, not $(x-1)^3$. So there's your contradiction.
Generalizing this, you may be able to convince yourself that if $A$ is an $n \times n$ complex matrix with distinct eigenvalues (and hence diagonalizable), and $B$ is a matrix commuting with $A$, then $B$ must be simultaneously diagonalizable with $A$ (and hence its minimal polynomial will not have repeated roots). Or see e.g. Prove that simultaneously diagonalizable matrices commute and the answers/references cited therein for more.
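One quick way to see the general statement (a standard argument): write $A = PDP^{-1}$ with $D = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$ and the $\lambda_i$ distinct. If $AB = BA$, then $C := P^{-1}BP$ commutes with $D$, and comparing $(i,j)$ entries of $DC = CD$ gives
$$\lambda_i c_{ij} = \lambda_j c_{ij},$$
so $c_{ij} = 0$ whenever $i \neq j$. Thus $C = P^{-1}BP$ is diagonal, i.e., $P$ diagonalizes $A$ and $B$ simultaneously.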