Showing that a set of vectors is linearly independent can be done by collecting them as the columns of a matrix and verifying that this matrix has full column rank. Completing the set to a basis can be done by appending a basis of the left nullspace of that same matrix (the one whose columns are the given vectors).
Calculating the rank of a matrix can be done in many ways; see Wikipedia, “Rank (linear algebra)”, section “Computation”. In short: bringing the matrix into echelon form is the standard pedagogical approach, but it may not be the most numerically reliable one.
The left nullspace is the same as the (right) nullspace of the transpose of the matrix. Calculating the (right) nullspace can likewise be done by bringing the matrix into echelon form; see Wikipedia, “Kernel (matrix)”, section “Basis”. Here too, another approach may be numerically preferable; see Wikipedia, “Kernel (matrix)”, section “Computation on a computer”.
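As a sketch of the numerically more robust route (assuming NumPy and SciPy are available; both `matrix_rank` and `null_space` work via the singular value decomposition), the whole recipe fits in a few lines:

```python
# Sketch of the SVD-based route (assuming NumPy and SciPy are available):
# rank and left nullspace without explicit row reduction.
import numpy as np
from scipy.linalg import null_space

def rank_and_left_nullspace(A):
    """Return the rank of A and an orthonormal basis of its left nullspace."""
    rank = np.linalg.matrix_rank(A)  # SVD-based rank estimate
    left_null = null_space(A.T)      # left nullspace of A = nullspace of A^T
    return rank, left_null
```

If the returned rank equals the number of columns, the columns are linearly independent, and stacking the columns of `left_null` next to them yields a basis of the full space.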
For your example, starting from
\begin{equation}
A =
\begin{pmatrix}
1 & 1 & 2\\
2 & 1 & 0\\
0 & 1 & 1\\
2 & 0 & 3
\end{pmatrix}
\end{equation}
the reduced row echelon form is
\begin{equation}
A_{\text{rref}} =
\begin{pmatrix}
1 & 0 & 0\\
0 & 1 & 0\\
0 & 0 & 1\\
0 & 0 & 0
\end{pmatrix},
\end{equation}
which shows that the matrix has full column rank (namely $3$), so its columns are linearly independent.
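This step can be reproduced in exact arithmetic, for instance with SymPy (a small sketch; the library choice is just one option):

```python
# Reproducing the reduced row echelon form in exact arithmetic with SymPy.
from sympy import Matrix

A = Matrix([[1, 1, 2],
            [2, 1, 0],
            [0, 1, 1],
            [2, 0, 3]])

A_rref, pivot_columns = A.rref()
print(A_rref)              # the 4x3 matrix shown above, with three pivot rows
print(len(pivot_columns))  # 3, i.e. full column rank
```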
A basis for the left nullspace is
\begin{equation}
A_{\text{lnull}} =
\begin{pmatrix}
-8\\
1\\
7\\
3
\end{pmatrix};
\end{equation}
you can verify that
\begin{equation}
A_{\text{lnull}}^t A =
\begin{pmatrix}
0 &
0 &
0
\end{pmatrix},
\end{equation}
which shows that the columns of $A_{\text{lnull}}$ (here a single one) are indeed orthogonal to the columns of $A$. Appending them to the columns of $A$ therefore completes those columns to a basis of $\mathbb{R}^4$.
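Continuing the same SymPy sketch, the left nullspace and the completed basis can be checked directly:

```python
# Left nullspace of A and completion to a basis of R^4 (SymPy sketch, continued).
from sympy import Matrix

A = Matrix([[1, 1, 2],
            [2, 1, 0],
            [0, 1, 1],
            [2, 0, 3]])

(v,) = A.T.nullspace()   # one basis vector of the left nullspace of A
                         # (a scalar multiple of A_lnull above)
print(v.T * A)           # 1x3 zero row: v is orthogonal to every column of A

B = A.row_join(v)        # append v as a fourth column of A
print(B.rank())          # 4, so the four columns form a basis of R^4
```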