Regarding the symmetry of the bilinear form: any bilinear form on a vector space $V$ can be decomposed as a sum of a symmetric part and an anti-symmetric one in a unique way, given by
$$B(x,y)=\frac{1}{2}\left(B(x,y)+B(y,x)\right)+\frac{1}{2}\left(B(x,y)-B(y,x)\right)=B_s(x,y)+B_a(x,y).$$
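In finite dimensions this decomposition is just the familiar split of a matrix into its symmetric and anti-symmetric parts. A minimal numerical sketch with NumPy (the random matrix and vector are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))   # matrix of an arbitrary bilinear form B(x, y) = x @ M @ y
M_s = (M + M.T) / 2               # symmetric part, represents B_s
M_a = (M - M.T) / 2               # anti-symmetric part, represents B_a

x = rng.standard_normal(3)
# the decomposition is exact, and B_a vanishes on the diagonal: B_a(x, x) = 0
decomposition_ok = np.allclose(M, M_s + M_a)
diag_vanishes = abs(x @ M_a @ x) < 1e-10
```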
Now if you want to define a norm based on this you find that
$$\|x\|^2=B(x,x)=B_s(x,x)+B_a(x,x)=B_s(x,x),$$
since $B_a(x,x)=\frac{1}{2}\left(B(x,x)-B(x,x)\right)=0$.
So if you know the norm, you can only recover the symmetric part of the bilinear form that induced it; in particular, any norm induced by a bilinear form is also induced by a symmetric one. Moreover, in that case the symmetric bilinear form is unique, and it can be recovered from the norm by the following formulas (this is the polarization identity mentioned in the blog post you linked to):
$$B(u, v) = \frac{\lVert u\rVert^2 + \lVert v\rVert^2 - \lVert u - v\rVert^2}{2} = \frac{\lVert u + v\rVert^2 - \lVert u\rVert^2 - \lVert v\rVert^2}{2} = \frac{\lVert u + v\rVert^2 - \lVert u - v\rVert^2}{4}$$
(notice that these formulas are indeed symmetric!)
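You can check all three formulas numerically. A small sketch, where the symmetric matrix `S` represents an arbitrary symmetric bilinear form and `sq` is the induced quadratic form (these names are just for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
S = (A + A.T) / 2                 # symmetric matrix: B(u, v) = u @ S @ v
sq = lambda w: w @ S @ w          # the induced quadratic form ||w||^2

u, v = rng.standard_normal(4), rng.standard_normal(4)
b = u @ S @ v                     # the bilinear form directly
f1 = (sq(u) + sq(v) - sq(u - v)) / 2
f2 = (sq(u + v) - sq(u) - sq(v)) / 2
f3 = (sq(u + v) - sq(u - v)) / 4
# f1, f2, f3 all recover b
```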
This question may also be of interest here.
The interest for optimization is that when the norm is quadratic, the inner product makes it easy to compute the derivative of the squared norm. Indeed,
\begin{align}\frac{\|x+tv\|^2 -\|x\|^2}{t}& =\frac{B(x+tv,x+tv)-B(x,x)}{t} \\ & = 2B(x,v)+tB(v,v)\end{align}(using the symmetry of $B$), and taking the limit $t\to 0$, you see that the inner product gives the directional derivatives, and thus the differential, of the squared norm.
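The limit computation above can be verified with a finite difference. A sketch, assuming a positive-definite `S` so that $B(x,y)=x^\top S y$ really induces a norm (the construction of `S` is just one convenient way to get such a matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
S = A @ A.T + 5 * np.eye(5)        # symmetric positive definite -> genuine norm
norm_sq = lambda w: w @ S @ w      # ||w||^2 = B(w, w)

x, v = rng.standard_normal(5), rng.standard_normal(5)
t = 1e-6
fd = (norm_sq(x + t * v) - norm_sq(x)) / t   # difference quotient 2B(x,v) + t B(v,v)
exact = 2 * (x @ S @ v)                       # the limit as t -> 0: 2 B(x, v)
# fd agrees with exact up to a term of size t * B(v, v)
```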
Knowing that the norm is quadratic is also very useful in problems like linear regression and least squares: you can view the problem as finding the vector in a subspace that is closest to some data, and the solution is then the orthogonal projection onto that subspace.
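The projection picture can be seen concretely in ordinary least squares. A sketch in NumPy, using the standard Euclidean inner product and random data for illustration: the fitted vector is the orthogonal projection of the data onto the column space of the design matrix, which is exactly the statement that the residual is orthogonal to that subspace.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((20, 3))   # columns span the subspace
y = rng.standard_normal(20)        # the "data" vector

# least-squares coefficients from the normal equations X^T X beta = X^T y
beta = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ beta                   # closest point to y in the column space of X

# orthogonality: the residual is perpendicular to every column of X
residual = y - y_hat
```

The same coefficients come out of `np.linalg.lstsq(X, y, rcond=None)`, which is the numerically preferred route in practice; the normal equations are used here only because they make the orthogonality argument explicit.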