As I was (and still am lol) struggling with the exercises on symmetric polynomials in my abstract algebra book, I stumbled upon this neat way of relating the roots of a polynomial over $\mathbb C$ to the (real) coefficients of the polynomial (I'm sure this can be generalized to complex coefficients). As an example, consider the polynomial $f(x) = x^3 - 5x^2 + 4x - 3$. The fundamental theorem of algebra says that there are 3 roots (counted with multiplicity), which we denote $u, v,$ and $w$, so $f(x) = (x-u)(x-v)(x-w)$.
In the polynomial ring $\mathbb R[t, x_1, x_2, x_3]$, consider the analogue of $f$ with $u, v, w$ replaced by the indeterminates $x_1, x_2, x_3$: $$ (t - x_1)(t - x_2)(t - x_3). $$
I don't know how to cite a book properly, but Theorem 4.5.3 in Introduction to Abstract Algebra, 4th edition, by Nicholson says:
Write $s_k = s_k(x_1, \dots, x_n)$ for $1 \leq k \leq n$. Then $$ (t - x_1) (t - x_2) \cdots (t - x_n) = t^n - s_1 t^{n-1} + s_2 t^{n-2} - \cdots \pm s_n = \sum_{k=0}^n (-1)^k s_k t^{n-k}. $$
Here, $s_k$ denotes the $k$th elementary symmetric polynomial in $\mathbb R[x_1, x_2, x_3]$ (with the convention $s_0 = 1$, so the sum starts with the $t^n$ term). In this case we have
$$ (t - x_1) (t - x_2) (t - x_3) = t^3 - s_1 t^2 + s_2 t - s_3. $$
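Explicitly, for three variables the elementary symmetric polynomials are $$ s_1 = x_1 + x_2 + x_3, \qquad s_2 = x_1 x_2 + x_1 x_3 + x_2 x_3, \qquad s_3 = x_1 x_2 x_3. $$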
Remember that this is supposed to be $f$ (with $t$ in place of $x$), so the coefficients must match; since the signs already alternate on both sides, $-s_1 = -5$, $s_2 = 4$, and $-s_3 = -3$. Hence we get the following system of equations. $$ s_1(x_1, x_2, x_3) = 5 \\ s_2(x_1, x_2, x_3) = 4 \\ s_3(x_1, x_2, x_3) = 3 $$
If we consider $f$ over $\mathbb R$ again, then we should have $x_1 = u$, $x_2 = v$, and $x_3 = w$. So, by expanding the $s_i$ as well, we get the system $$ u + v + w = 5 \\ uv + uw + vw = 4 \\ uvw = 3 $$
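As a quick numerical sanity check of this system (a minimal sketch assuming NumPy is available; `np.roots` takes the coefficients in descending order of degree):

```python
import numpy as np

# Coefficients of f(x) = x^3 - 5x^2 + 4x - 3, highest degree first
coeffs = [1, -5, 4, -3]

# Numerical roots u, v, w (here: one real root, two complex conjugates)
u, v, w = np.roots(coeffs)

print(u + v + w)        # ~ 5, up to a tiny imaginary rounding error
print(u*v + u*w + v*w)  # ~ 4
print(u * v * w)        # ~ 3
```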
And of course this can be generalized to polynomials that are not monic or whose coefficients don't follow this alternating-sign pattern, so basically all of them. Approximating the roots with Wolfram Alpha (link 1 and link 2), just as a sanity check, confirms that these are indeed the roots of the polynomial, expressed as a neat system involving only the roots and the coefficients! Is this already a known method? I'm sure it is, so what's its name?
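For completeness, here is the non-monic version alluded to above, which I'm fairly sure works out as follows: if $f(x) = a_n x^n + a_{n-1} x^{n-1} + \cdots + a_0$ with $a_n \neq 0$ and roots $r_1, \dots, r_n$, then $f(x) = a_n (x - r_1) \cdots (x - r_n)$, and comparing coefficients as before gives $$ s_k(r_1, \dots, r_n) = (-1)^k \, \frac{a_{n-k}}{a_n}, \qquad 1 \leq k \leq n. $$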