As noted in the comments (and possibly guessable from the tags), the quote is about linear equations. This is, of course, key.
The statement holds in any number of variables:
Theorem. A system of $m$ linear equations in $n$ unknowns,
$$A\mathbf{x}=\mathbf{b}$$
(where $A$ is an $m\times n$ matrix, $\mathbf{x}=(x_1,\ldots,x_n)^T$, and $\mathbf{b}=(b_1,\ldots,b_m)^T$) has either no solutions, exactly one solution, or at least as many solutions as there are scalars; in particular, if there are infinitely many scalars, then the system has either no solutions, exactly one solution, or infinitely many solutions.
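Over $\mathbb{R}$, the trichotomy can be checked numerically via the standard rank criterion (Rouché–Capelli): the system is inconsistent when $\operatorname{rank}(A) < \operatorname{rank}([A\mid\mathbf{b}])$, has a unique solution when $\operatorname{rank}(A) = n$, and has infinitely many otherwise. A small sketch using NumPy (the function name is mine):

```python
import numpy as np

def classify(A, b):
    """Classify A x = b by rank (Rouche-Capelli): 'none', 'unique', or 'infinite'."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    rank_A = np.linalg.matrix_rank(A)
    rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))  # augmented matrix
    if rank_A < rank_Ab:
        return "none"      # inconsistent: b is not in the column space of A
    return "unique" if rank_A == A.shape[1] else "infinite"

print(classify([[1, 1], [0, 1]], [2, 1]))  # unique: invertible A
print(classify([[1, 1], [1, 1]], [0, 1]))  # none: parallel equations, different b
print(classify([[1, 1], [2, 2]], [1, 2]))  # infinite: second row is twice the first
```

This only illustrates the trichotomy over the reals; the theorem itself is purely algebraic and holds over any field.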
Proof. It is enough to show that if the system has more than one solution, then it has at least as many solutions as there are scalars. Assume that $\mathbf{s} = (s_1,\ldots,s_n)^T$ and $\mathbf{t}=(t_1,\ldots,t_n)^T$ are two distinct solutions. Then $\mathbf{s}-\mathbf{t}\neq\mathbf{0}$.
I claim that for any scalar $\alpha$, $\mathbf{s}_{\alpha}=\mathbf{s}+\alpha(\mathbf{s}-\mathbf{t})$ is also a solution. Moreover, $\mathbf{s}_{\alpha}=\mathbf{s}_{\beta}$ if and only if $\alpha=\beta$ (precisely because $\mathbf{s}-\mathbf{t}\neq\mathbf{0}$), so distinct scalars give distinct solutions, and hence there are at least as many solutions as there are scalars.
Indeed, we have that $A\mathbf{s}=A\mathbf{t}=\mathbf{b}$; therefore:
$$\begin{align*}
A\mathbf{s}_{\alpha} &= A(\mathbf{s}+\alpha(\mathbf{s}-\mathbf{t}))\\
&= A\mathbf{s} + \alpha A(\mathbf{s}-\mathbf{t})\\
&= A\mathbf{s} + \alpha A\mathbf{s} - \alpha A\mathbf{t}\\
&= \mathbf{b} + \alpha\mathbf{b}-\alpha\mathbf{b}\\
&= \mathbf{b}.
\end{align*}$$
Thus, $\mathbf{s}_{\alpha}$ is also a solution, as claimed. $\Box$
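The construction in the proof is easy to verify numerically. A minimal sketch (the example system and all names are mine): take a system with two distinct solutions and check that every member of the family $\mathbf{s}_{\alpha}$ solves it.

```python
import numpy as np

# The system x + y = 2, written as a 1x2 matrix equation A x = b.
A = np.array([[1.0, 1.0]])
b = np.array([2.0])

s = np.array([2.0, 0.0])  # one solution
t = np.array([0.0, 2.0])  # another, distinct solution

for alpha in [-1.0, 0.0, 0.5, 3.0]:
    s_alpha = s + alpha * (s - t)        # the family from the proof
    assert np.allclose(A @ s_alpha, b)   # each member solves A x = b
print("every s_alpha is a solution")
```

Here $\mathbf{s}-\mathbf{t}=(2,-2)^T$, so $\mathbf{s}_{\alpha}=(2+2\alpha,\,-2\alpha)^T$, and the coordinates always sum to $2$, as the assertion confirms.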