Consider the matrix $A$ whose columns are the vectors $v_1, \dots, v_n$ from the $m$-dimensional vector space $V.$ Its transpose $A^t$ is then the matrix whose rows are the vectors $v_1, \dots, v_n.$
As you mentioned, we can determine which of the vectors $v_1, \dots, v_n$ are linearly independent by putting the matrix $A$ in reduced row-echelon form. Explicitly, the vectors $v_j$ corresponding to the pivot columns $j$ are linearly independent, and each of the remaining vectors is a linear combination of these $v_j.$ (In fact, the coefficients of the relation of linear dependence can be read off from the entries of the corresponding column of the reduced matrix. Check that for $v_1 = (1, 2, 5)$ and $v_2 = (-3, -5, -13),$ we have $v_4 = (2, 1, 4) = -7v_1 - 3v_2.$ Compare this to the problem you mention.)
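As a concrete sketch, one can carry this computation out with SymPy's `Matrix.rref`, which returns both the reduced matrix and the pivot column indices. This uses only the three vectors quoted above (the original problem has more vectors, so the indices there differ):

```python
from sympy import Matrix

# The three vectors named above, placed as the COLUMNS of A.
v1 = [1, 2, 5]
v2 = [-3, -5, -13]
v4 = [2, 1, 4]

A = Matrix([v1, v2, v4]).T  # columns of A are v1, v2, v4

R, pivots = A.rref()
print(pivots)      # indices of pivot columns -> the independent vectors
print(R.col(2))    # coefficients expressing the non-pivot column in the pivot ones
```

Here `pivots` is `(0, 1)`, so the first two columns ($v_1, v_2$) are independent, and the third column of the reduced matrix holds the coefficients $-7$ and $-3$ of the relation $v_4 = -7v_1 - 3v_2.$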
As you have seen, there are other advantages to writing the vectors as the columns; however, it is true that the column rank and the row rank of any matrix are equal, hence the number of pivots of the matrix $A^t$ is the same as the number of pivots of the matrix $A,$ and one can just as legitimately row-reduce $A^t.$ Unfortunately, the entries of the reduced form of $A^t$ do not have as clear an interpretation as in the other method. (Check that row-reducing the transpose of the matrix in the problem you mention yields $$\begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 2 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix};$$ however, the entries of this matrix have no immediate meaning in terms of the original vectors.)
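The equality of row rank and column rank can be checked numerically as well; a minimal sketch, again using only the three vectors quoted in the text as columns (not the full matrix of the original problem):

```python
from sympy import Matrix

# Columns are v1 = (1,2,5), v2 = (-3,-5,-13), v4 = (2,1,4).
A = Matrix([[1, -3,  2],
            [2, -5,  1],
            [5, -13, 4]])

# Row rank of A = column rank of A, so A and A^t have the same rank,
# i.e. the same number of pivots after row reduction.
print(A.rank(), A.T.rank())  # both equal 2
```

Either reduction therefore detects that exactly two of the three vectors are independent; the difference between the two approaches lies only in how the remaining entries can be interpreted.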