Let $f: [0,1]^n \rightarrow \mathbb{R}_+^n$ be a vector field whose component $f_i$ for each dimension $i$ is defined as:
$$ f_i(\vec{x}) = \sqrt{ \sum_{j=1}^n x_i \cdot x_j \cdot C_{i,j}} $$
where $C$ is a given symmetric $n \times n$ matrix with entries $C_{i,j} \in [0,1]$ for all $i$ and $j$, unit diagonal $C_{i,i} = 1$, and $C_{i,j} = C_{j,i}$.
We are also given $\vec{a} = [a_1, a_2, \ldots, a_n]$ with $a_i \in [0,1]$ for all $i$.
Question 1: Does a solution $\vec{x} \in [0,1]^n$ exist such that $f(\vec{x}) = \vec{a}$?
Question 2: Is such an $\vec{x}$ unique?
Question 3: How can we find $\vec{x}$?
Please explain in a manner that can be understood by amateur mathematicians like myself.
Note that I have found a couple of numerical algorithms for this problem. One is shown in another question. Another is to derive the inverse of $f_i(\vec{x})$ and use it to find the $x_i$ that makes $f_i(\vec{x}) = a_i$; this generally disturbs the other coordinates, so that $f_j(\vec{x}) \neq a_j$ for $j \neq i$, and the procedure must be repeated for several iterations, after which it converges to the correct solution. However, I am having trouble formally proving the convergence of these algorithms, so I would like to know whether we can at least show, using purely mathematical arguments, that a unique solution exists.
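For concreteness, here is a rough sketch (in Python/NumPy) of the second scheme described above. Since $C_{i,i}=1$, fixing all coordinates except $x_i$ turns $f_i(\vec{x}) = a_i$ into the quadratic $x_i^2 + b_i x_i - a_i^2 = 0$ with $b_i = \sum_{j \neq i} C_{i,j} x_j$, whose nonnegative root gives the coordinate update. The function names, starting guess, and stopping rule are my own ad-hoc choices, not part of any established method, and proving that this sweep converges is exactly the open part of the question:

```python
import numpy as np

def f(C, x):
    """The map from the question: f_i(x) = sqrt(x_i * sum_j C[i,j] * x_j)."""
    return np.sqrt(x * (C @ x))

def solve_coordinatewise(C, a, tol=1e-12, max_iter=10_000):
    """Repeatedly invert f_i in the coordinate x_i, holding the others fixed.

    With the other coordinates fixed and C[i,i] = 1, the equation
    f_i(x) = a_i becomes x_i^2 + b_i * x_i - a_i^2 = 0, where
    b_i = sum_{j != i} C[i,j] * x_j >= 0. Its nonnegative root is
        x_i = (-b_i + sqrt(b_i^2 + 4 a_i^2)) / 2.
    We sweep over all i until the iterate stops changing (no convergence
    proof is claimed here; that is what the question asks about).
    """
    a = np.asarray(a, dtype=float)
    x = a ** 2  # arbitrary starting guess in [0,1]^n
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(len(a)):
            b = C[i] @ x - x[i]  # sum over j != i, using C[i,i] = 1
            x[i] = (-b + np.sqrt(b * b + 4.0 * a[i] ** 2)) / 2.0
        if np.max(np.abs(x - x_old)) < tol:
            break
    return x
```

In small experiments (e.g. building $\vec{a} = f(\vec{x}_{\text{true}})$ from a known $\vec{x}_{\text{true}}$ and recovering it), this sweep converges quickly, which is what motivates the question of whether that can be proved in general.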
Thanks!