Since a matrix over a field satisfies its characteristic polynomial, which has degree $n$, linear combinations of powers of an $n \times n$ matrix can at best yield an $n$-dimensional subspace of the space of all $n \times n$ matrices, which has dimension $n^2$. Hence, you can never obtain the entire space of matrices when $n>1$.
In greater detail: let $B$ be an $n \times n$ complex matrix and let $p(x)$ be its characteristic polynomial. By the Cayley-Hamilton theorem, $B$ satisfies $p(B)=0$. For the sake of clarity, what I mean is: if
$$ p(x)=a_0+a_1x+a_2x^2 +\dots +x^n $$
then
$$ a_0I+a_1B+a_2B^2+ \dots +B^n $$
is exactly the zero matrix. (Here, $I$ represents the identity matrix.)
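(If you like, you can check this numerically. Here is a quick sketch using NumPy; `np.poly` returns the coefficients of the characteristic polynomial, leading coefficient first.)

```python
import numpy as np

# Numerical sanity check of Cayley-Hamilton on a random complex matrix (a sketch).
n = 4
rng = np.random.default_rng(0)
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# np.poly(B) gives the characteristic polynomial coefficients,
# ordered from the x^n term down to the constant term a_0.
coeffs = np.poly(B)

# Evaluate p(B) by Horner's rule: start at 0, repeatedly multiply by B and add c*I.
p_of_B = np.zeros((n, n), dtype=complex)
for c in coeffs:
    p_of_B = p_of_B @ B + c * np.eye(n)

print(np.allclose(p_of_B, np.zeros((n, n))))  # True, up to floating-point error
```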
So we have the relation
$$ B^n = -a_0I-a_1B-a_2B^2- \dots -a_{n-1}B^{n-1} .$$
Now you can use that relation recursively to write any (finite) linear combination of powers of $B$ as a linear combination of $\{I,B,B^2,\dots,B^{n-1}\}$. (If this isn't clear, try a warm-up exercise: write $B^{n+1}$ as a linear combination of $\{I,B,B^2,\dots,B^{n-1}\}$ by multiplying the relation above by $B$ and then eliminating the resulting $B^n$ term.)
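For a concrete check of that warm-up, note that the coefficients you want are exactly those of the remainder when $x^{n+1}$ is divided by $p(x)$; polynomial long division just carries out the recursive substitution all at once. Here is a NumPy sketch of that reduction.

```python
import numpy as np

# Express B^(n+1) as a combination of I, B, ..., B^(n-1):
# the coefficients are those of the remainder of x^(n+1) divided by p(x).
n = 4
rng = np.random.default_rng(1)
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

p = np.poly(B)                    # characteristic polynomial, leading coefficient first
x_power = np.zeros(n + 2)
x_power[0] = 1.0                  # coefficients of x^(n+1)
_, rem = np.polydiv(x_power, p)   # remainder has degree at most n-1

# Evaluate the remainder at B (Horner's rule) and compare with B^(n+1) directly.
combo = np.zeros((n, n), dtype=complex)
for c in rem:
    combo = combo @ B + c * np.eye(n)

print(np.allclose(combo, np.linalg.matrix_power(B, n + 1)))  # True
```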
It might be interesting for you to know that the space spanned by powers of $B$ is contained in the space of matrices that commute with $B$. This is why the hint by @Kelenner works, and frankly it may be easier than the answer I provided.
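In particular, any polynomial in $B$ commutes with $B$, so a matrix that fails to commute with $B$ can never be a linear combination of powers of $B$. Here is a quick NumPy sketch of the first statement (the random polynomial $q$ is just an arbitrary choice for illustration).

```python
import numpy as np

# Every polynomial in B commutes with B; check numerically for a random polynomial q.
n = 4
rng = np.random.default_rng(2)
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

q_coeffs = rng.standard_normal(n)   # random polynomial of degree n-1
qB = sum(c * np.linalg.matrix_power(B, j) for j, c in enumerate(q_coeffs))

print(np.allclose(qB @ B, B @ qB))  # True: q(B) commutes with B
```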
Another related concept is the relationship between the minimal polynomial of $B$ and the characteristic polynomial of $B$. In fact, all of this information can be deduced from the invariant factor decomposition of $\mathbb{C}^n$ induced by $B$. For more information, you can see my question here, though it uses a moderate amount of abstract algebra that you may be unfamiliar with.
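To make that connection concrete: the dimension of $\operatorname{span}\{I, B, B^2, \dots\}$ equals the degree of the minimal polynomial of $B$, which can be strictly smaller than $n$ when the minimal polynomial is a proper divisor of the characteristic polynomial. Here is a NumPy sketch (the helper `dim_of_power_span` is just illustrative).

```python
import numpy as np

def dim_of_power_span(B):
    """Dimension of span{I, B, B^2, ...}: flatten each power into a vector
    and take the rank of the stacked matrix (numerical sketch)."""
    n = B.shape[0]
    # The first n powers always suffice, by the Cayley-Hamilton argument above.
    powers = [np.linalg.matrix_power(B, j).ravel() for j in range(n)]
    return np.linalg.matrix_rank(np.vstack(powers))

# A matrix with a repeated eigenvalue: characteristic polynomial (x-2)^2 (x-5),
# minimal polynomial (x-2)(x-5), so the powers span only a 2-dimensional space.
B = np.diag([2.0, 2.0, 5.0])
print(dim_of_power_span(B))   # 2, even though B is 3 x 3

# A matrix with distinct eigenvalues: the span has the full dimension n.
C = np.diag([1.0, 2.0, 3.0])
print(dim_of_power_span(C))   # 3
```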