There is no such thing as "the" eigenvectors of a matrix. That's why the statement in Wikipedia says "there is" an orthonormal basis...
What is uniquely determined are the eigenspaces. But you can make different choices of eigenvectors from the eigenspaces, and make them orthogonal or not (and of course you can move in and out of "orthonormal" by multiplying by scalars). In the special case where all the eigenvalues are different (i.e., all multiplicities are $1$), any set of eigenvectors corresponding to different eigenvalues will be orthogonal.
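A quick numerical illustration of that last claim (a sketch using NumPy; the rotated diagonal matrix below is my own example, not from the discussion above): for a symmetric matrix with three distinct eigenvalues, eigenvectors belonging to different eigenvalues come out orthogonal.

```python
import numpy as np

# A symmetric matrix with distinct eigenvalues 1, 2, 3: conjugate a diagonal
# matrix by a rotation so the eigenvectors are not just the standard basis.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])
A = R @ np.diag([1.0, 2.0, 3.0]) @ R.T

# eigh returns eigenvalues in ascending order with orthonormal eigenvectors
# as the columns of `vecs`.
vals, vecs = np.linalg.eigh(A)

# Eigenvectors for distinct eigenvalues are pairwise orthogonal:
print(np.allclose(vecs.T @ vecs, np.eye(3)))  # True
```

Here the orthogonality is forced: since all multiplicities are $1$, each eigenvector is determined up to a scalar, so *any* choice of eigenvectors for the three eigenvalues is orthogonal.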
As a side note, there is a small language issue that appears often: matrices have eigenvalues, but to talk about eigenvectors you are viewing your matrix as a linear operator on a vector space (which is, of course, where the notion of eigenvalue comes from).
To see a concrete example, consider the matrix
$$
\begin{bmatrix}1&0&0\\ 0&0&0\\ 0&0&0\end{bmatrix}
$$
The orthonormal basis the Wikipedia article is talking about is $\begin{bmatrix}1\\0\\0\end{bmatrix}$,
$\begin{bmatrix}0\\1\\0\end{bmatrix}$,
$\begin{bmatrix}0\\0\\1\end{bmatrix}$.
But as the multiplicity of zero as eigenvalue is $2$, we can choose a different basis for its eigenspace, and then $\begin{bmatrix}1\\0\\0\end{bmatrix}$,
$\begin{bmatrix}0\\1\\1\end{bmatrix}$,
$\begin{bmatrix}0\\2\\1\end{bmatrix}$ is another (not orthogonal) basis of eigenvectors. Worse, we can get a different orthonormal basis of eigenvectors $$\begin{bmatrix}\pm 1\\0\\0\end{bmatrix}, \qquad
\begin{bmatrix}0\\ a\\ \sqrt{1-a^2} \end{bmatrix},\qquad
\begin{bmatrix}0\\ \sqrt{1-a^2}\\ -a\end{bmatrix}$$ for each $a\in[0,1]$.
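One can check mechanically that this family works for every $a$ (a small NumPy sketch of the example above, with $M$ the diagonal matrix from the example):

```python
import numpy as np

M = np.diag([1.0, 0.0, 0.0])

for a in [0.0, 0.3, 0.8, 1.0]:
    b = np.sqrt(1 - a**2)
    v1 = np.array([1.0, 0.0, 0.0])
    v2 = np.array([0.0, a, b])
    v3 = np.array([0.0, b, -a])
    B = np.column_stack([v1, v2, v3])

    # Orthonormal basis: B^T B = I (unit norms, pairwise orthogonal)
    assert np.allclose(B.T @ B, np.eye(3))
    # v1 is an eigenvector with eigenvalue 1; v2, v3 with eigenvalue 0
    assert np.allclose(M @ v1, v1)
    assert np.allclose(M @ v2, 0)
    assert np.allclose(M @ v3, 0)

print("all checks passed")  # prints "all checks passed"
```

The key point is that $v_2$ and $v_3$ always lie in the eigenspace of $0$ (the span of $e_2$ and $e_3$), and their dot product $a\sqrt{1-a^2} - \sqrt{1-a^2}\,a$ vanishes identically.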
Finally, if you don't insist on a basis, there are infinitely many eigenvectors: for instance, all vectors of the form $\begin{bmatrix}t\\0\\0\end{bmatrix}$ with $t\neq 0$ are eigenvectors. And all vectors $\begin{bmatrix}0\\t\\s\end{bmatrix}$, with $t$ and $s$ not both zero, are eigenvectors.
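These two families can also be verified directly (a NumPy sketch; the zero vector is skipped, since by convention it is never an eigenvector):

```python
import numpy as np

M = np.diag([1.0, 0.0, 0.0])

# Every nonzero multiple of e1 is an eigenvector with eigenvalue 1:
for t in [0.5, -2.0, 7.0]:
    v = np.array([t, 0.0, 0.0])
    assert np.allclose(M @ v, 1.0 * v)

# Every nonzero vector in the span of e2 and e3 is an eigenvector
# with eigenvalue 0:
for t, s in [(1.0, 0.0), (2.0, -3.0), (0.0, 5.0)]:
    v = np.array([0.0, t, s])
    assert np.allclose(M @ v, 0.0 * v)

print("ok")  # prints "ok"
```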