In a surd $a\sqrt{b}$ ($b \in \mathbb{Z}^+$) the value of $b$ can be assumed to be a square-free integer ($b = p_1p_2\dots p_k$, where the $p_i$ are distinct primes), since otherwise a repeated prime factor can be (completely or partially) taken out of the square root sign.
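For example, with $b = 12 = 2^2 \cdot 3$ the square factor comes out and leaves a square-free radicand:
$$\sqrt{12} = \sqrt{2^2 \cdot 3} = 2\sqrt{3}.$$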
I am struggling to find an elementary proof of the following:
If $\{a_i\}_{i=1}^n$ are non-zero integers and $\{b_i\}_{i=1}^n$ are distinct positive square-free integers, then $\sum_{i=1}^n{a_i}\sqrt{b_i} \ne 0$.
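(A standard rephrasing, not part of the original statement: since the claim is asserted for every $n$ and every choice of distinct square-free $b_i$, and rational coefficients can be cleared of denominators, it is equivalent to saying that the radicals $\sqrt{b_1}, \dots, \sqrt{b_n}$ are linearly independent over $\mathbb{Q}$:
$$q_1\sqrt{b_1} + q_2\sqrt{b_2} + \dots + q_n\sqrt{b_n} = 0, \; q_i \in \mathbb{Q} \implies q_1 = q_2 = \dots = q_n = 0.)$$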
This is how it can be proven for $n = 2$.
Suppose $a_1\sqrt{b_1} + a_2\sqrt{b_2} = 0$. Squaring gives $a_1^2b_1 + 2a_1a_2\sqrt{b_1b_2} + a_2^2b_2 = 0$, so $\sqrt{b_1b_2}$ can be expressed as $$ \sqrt{b_1b_2} = - \frac{a_1^2b_1+a_2^2b_2}{2a_1a_2}. $$
The RHS of this equation is a rational number. However, since $b_1$ and $b_2$ are distinct and square-free, there must be a prime factor occurring in one of them but not in the other. This prime divides $b_1b_2$ exactly once, so $b_1b_2$ is not a perfect square, and hence the LHS $\sqrt{b_1b_2}$ is irrational, a contradiction.
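A concrete instance of the argument (with illustrative values $b_1 = 2$, $b_2 = 3$, not from the original): if $a_1\sqrt{2} + a_2\sqrt{3} = 0$, squaring yields
$$2a_1^2 + 2a_1a_2\sqrt{6} + 3a_2^2 = 0 \quad\Longrightarrow\quad \sqrt{6} = -\frac{2a_1^2 + 3a_2^2}{2a_1a_2},$$
which is impossible because $6 = 2 \cdot 3$ is not a perfect square, so $\sqrt{6}$ is irrational.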
I tried induction, but with no luck: isolating one radical and squaring introduces cross terms $\sqrt{b_ib_j}$ that need not appear among the original $b_i$, so the inductive hypothesis does not apply directly.