Consider a generic index $k$ with $1 \leq k \leq n$. We can write the following:
$$\begin{aligned}
\alpha &= \sum_{j=1}^n\sum_{i=1}^n a_{ij} x_{i} x_{j} = \sum_{j=1}^n\left(\sum_{i=1,\, i \neq k}^n a_{ij} x_{i} x_{j} + a_{kj}x_{k}x_{j}\right) \\
&= \sum_{i=1,\, i \neq k}^n \sum_{j=1}^n a_{ij} x_{i} x_{j} + \sum_{j=1}^n a_{kj}x_{k}x_{j} \\
&= \sum_{i=1,\, i \neq k}^n \left(\sum_{j=1,\, j\neq k}^n a_{ij} x_{i} x_{j} + a_{ik}x_i x_k\right) + \sum_{j=1,\, j \neq k}^n a_{kj}x_{k}x_{j} + a_{kk}x_{k}^2 \\
&= \sum_{i=1,\, i \neq k}^n \sum_{j=1,\, j\neq k}^n a_{ij} x_{i} x_{j} + \sum_{i=1,\, i\neq k}^n a_{ik}x_i x_k + \sum_{j=1,\, j \neq k}^n a_{kj}x_{k}x_{j} + a_{kk}x_{k}^2.
\end{aligned}$$
That is, we have separated the contributions that depend on $x_k$ from those that do not. It is now clear that:
$$\frac{\partial \alpha}{\partial x_k} = \sum_{i=1, i\neq k}^na_{ik}x_i + \sum_{j=1, j \neq k}^na_{kj}x_{j} + 2a_{kk}x_{k}.$$
We can further work on the last expression:
$$\frac{\partial \alpha}{\partial x_k} = \left[\sum_{i=1}^na_{ik}x_i - a_{kk}x_k\right] + \left[\sum_{j=1}^na_{kj}x_{j} - a_{kk}x_k\right] + 2a_{kk}x_{k} = \sum_{i=1}^na_{ik}x_i + \sum_{j=1}^na_{kj}x_{j}.$$
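The componentwise formula above can be checked numerically against a central finite difference. The sketch below uses an arbitrary random $3 \times 3$ matrix ${\bf A}$ and vector ${\bf x}$ (illustrative values, not from the text):

```python
import numpy as np

# Illustrative random data (not from the derivation itself).
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)

def alpha(v):
    # alpha = sum_{i,j} a_ij v_i v_j
    return v @ A @ v

h = 1e-6
numeric = np.empty(n)
for k in range(n):
    e = np.zeros(n)
    e[k] = h
    # central finite difference approximating d(alpha)/dx_k
    numeric[k] = (alpha(x + e) - alpha(x - e)) / (2 * h)

# sum_i a_ik x_i + sum_j a_kj x_j, for every k at once
analytic = A.T @ x + A @ x
```

The two arrays agree to finite-difference accuracy, matching the formula for $\partial \alpha / \partial x_k$.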
Now we can obtain a vector representation. Let us define:
- $f_k = \displaystyle\sum_{i=1}^na_{ik}x_i,$
- $g_k = \displaystyle\sum_{j=1}^na_{kj}x_{j},$
- ${\bf f} = [f_1, f_2, \ldots, f_n],$
- ${\bf g} = [g_1, g_2, \ldots, g_n],$
where ${\bf f}$ and ${\bf g}$ are row vectors.
It is clear that:
- ${\bf f} = {\bf x}^\top {\bf A},$
- ${\bf g} = {\bf x}^\top {\bf A}^\top,$
and hence:
$$\frac{\partial \alpha}{\partial {\bf x}} = {\bf x}^\top {\bf A} + {\bf x}^\top {\bf A}^\top = {\bf x}^\top \left({\bf A} + {\bf A}^\top\right).$$
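As a final sanity check, the sketch below verifies numerically that ${\bf x}^\top {\bf A} + {\bf t}{\bf x}^\top {\bf A}^\top$ (equivalently ${\bf x}^\top ({\bf A} + {\bf A}^\top)$) matches a finite-difference gradient of $\alpha = {\bf x}^\top {\bf A} {\bf x}$; the matrix and vector are arbitrary illustrative values:

```python
import numpy as np

# Illustrative random data (not from the derivation itself).
rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)

grad = x @ A + x @ A.T         # x^T A + x^T A^T (row vector)
grad_compact = x @ (A + A.T)   # equivalent compact form x^T (A + A^T)

# central finite-difference gradient of alpha = x^T A x
h = 1e-6
fd = np.array([
    ((x + h * e) @ A @ (x + h * e) - (x - h * e) @ A @ (x - h * e)) / (2 * h)
    for e in np.eye(n)
])
```

Note that when ${\bf A}$ is symmetric the gradient reduces to the familiar $2\,{\bf x}^\top {\bf A}$.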