This is all good, except that $F(\alpha_1)$ will be the set of all linear combinations of powers $\{\alpha_1^j\}_{j=0}^{n-1}$ where $n = \deg(f)$, not just linear combinations of $\alpha_1$ itself.
You can see that this set of powers is linearly independent over $F$: if there existed a nontrivial linear combination of them equaling zero$^\dagger$, i.e. if we had $\displaystyle \sum_{k=0}^{n-1} c_k\alpha_1^k = 0$ for some not-all-zero coefficients $c_k \in F$, then $g(x) = \displaystyle \sum_{k=0}^{n-1} c_kx^k$ would be a nonzero polynomial of degree smaller than $\deg(f)$ with $\alpha_1$ as a root. But that's impossible, since $f$ is the minimal polynomial of $\alpha_1$.
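As a quick sanity check (a sketch using sympy; the concrete choice of $F = \mathbb{Q}$, $\alpha_1 = \sqrt[3]{2}$, and $f(x) = x^3 - 2$ is mine, not from the argument above), we can verify that the minimal polynomial really does have degree $3$, so no rational combination $c_0 + c_1\alpha + c_2\alpha^2$ can vanish nontrivially:

```python
from sympy import Rational, symbols, minimal_polynomial, degree

x = symbols('x')
alpha = 2 ** Rational(1, 3)   # a root of x^3 - 2, which is irreducible over Q

# sympy confirms deg(f) = 3, so a vanishing combination
# c0 + c1*alpha + c2*alpha^2 = 0 with not-all-zero rational c_k would
# yield a nonzero polynomial of degree <= 2 with alpha as a root -- impossible.
f = minimal_polynomial(alpha, x)
print(f)             # x**3 - 2
print(degree(f, x))  # 3
```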
Moreover, this set of powers must actually span $F(\alpha_1)$, i.e. it is a basis, because the degree of the extension $[F(\alpha_1):F]$ equals the degree of the minimal polynomial of $\alpha_1$, and the degree of the extension is, by definition, the dimension of $F(\alpha_1)$ viewed as a vector space over $F$.
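To see concretely why no powers beyond $\alpha_1^{n-1}$ are needed, here's a sketch (again with my example $f(x) = x^3 - 2$, so the assumed basis is $\{1, \alpha, \alpha^2\}$): every higher power of $\alpha$ collapses back onto the basis, because we can divide $x^k$ by $f$ and keep only the remainder, which has degree at most $2$.

```python
from sympy import symbols, rem

x = symbols('x')
alpha = symbols('alpha')
f = x**3 - 2   # assumed minimal polynomial, so the basis is {1, alpha, alpha^2}

# Reduce each power x^k modulo f; since f(alpha) = 0, the remainder
# expresses alpha^k as a Q-linear combination of 1, alpha, alpha^2.
for k in range(3, 7):
    r = rem(x**k, f, x)
    print(f"alpha^{k} =", r.subs(x, alpha))
# alpha^3 = 2, alpha^4 = 2*alpha, alpha^5 = 2*alpha**2, alpha^6 = 4
```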
Finally, we'd actually have $F(\alpha_1) \cong F(\alpha_j)$ for any $1 \leq j \leq n$. To see this, first recall that $F(\alpha_1) \cong F[\alpha_1]$ because $\alpha_1$ is algebraic over $F$ (see my post here). Thinking of $F(\alpha_1)$ as $F[\alpha_1]$, we can show that $F[\alpha_1] \cong F[\alpha_j]$ via different homomorphisms out of $F[x]$:
For each $j$, define a homomorphism $\phi_j:F[x] \to F[\alpha_j]$ by $g(x) \mapsto g(\alpha_j)$; that is, we evaluate each polynomial at $\alpha_j$. You can prove for yourself (for instance, using the fact that univariate polynomial rings over fields are principal ideal domains) that the kernel of each $\phi_j$ must be the principal ideal generated by the minimal polynomial of $\alpha_j$, which is $f$ in every case, since $f$ is irreducible over $F$ and has each $\alpha_j$ as a root. Applying the first isomorphism theorem for rings to each of these (surjective) homomorphisms, we see that $F[\alpha_j] \cong F[x] / \langle f \rangle$ for every $j$ between $1$ and $n$. Since isomorphism is transitive, we must have $F[\alpha_1] \cong F[\alpha_j]$ for every $j$.
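The content of $F[\alpha_j] \cong F[x]/\langle f \rangle$ is that evaluation at a root factors through reduction mod $f$: reducing a polynomial modulo $f$ before substituting the root changes nothing. A sketch with sympy (my running example $f(x) = x^3 - 2$ and the real root $\alpha = \sqrt[3]{2}$, not part of the original post):

```python
from sympy import Rational, symbols, rem, expand

x = symbols('x')
f = x**3 - 2                 # irreducible over Q; minimal polynomial of each of its roots
alpha = 2 ** Rational(1, 3)  # one concrete root of f

# The evaluation map phi: g(x) |-> g(alpha) kills exactly the multiples of f,
# so reducing g mod f first and then substituting alpha gives the same element.
g = (1 + x)**3
reduced = rem(expand(g), f, x)        # remainder of g on division by f
lhs = expand(g.subs(x, alpha))        # evaluate g at alpha directly
rhs = expand(reduced.subs(x, alpha))  # evaluate the reduced representative
print(reduced)       # 3*x**2 + 3*x + 3
print(lhs == rhs)    # True
```

The same check works at any of the other roots $\alpha_j$ (the complex ones, in this example), which is exactly why all the $F[\alpha_j]$ are isomorphic to one another.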
$^\dagger$ Just applying the fact that a set of vectors $\{ \mathbf{v}_k \}_{k=1}^n$ in a vector space $V$ over a field $F$ is linearly dependent $\iff$ there exist elements $c_k \in F$, not all zero, such that $\displaystyle \sum_{k=1}^n c_k \mathbf{v}_k = \mathbf{0}$.