I'm not quite sure how to write this succinctly with mathematical symbols, so I've just written it out in English. Any suggestion for how to put it in mathematical form would be appreciated, even if it doesn't provide a full answer.
If you have a set of linearly dependent vectors in an $N$-dimensional space, then that set of vectors can be thought of as lying in a subspace of dimension at most $N-1$ of the space the vectors live in. (E.g. in 3 dimensions, a set of vectors that are linearly dependent on one another lie in a plane, a 2-dimensional subspace of that 3-space.)
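My rough attempt at putting that first statement into symbols (I'm not sure this is standard notation, and I'm thinking of at most $N$ vectors, e.g. the columns or rows of a square matrix):
$$
v_1,\dots,v_N \in \mathbb{R}^N \text{ linearly dependent} \;\Longrightarrow\; \dim \operatorname{span}\{v_1,\dots,v_N\} \le N-1,
$$
i.e. there is a unit vector $n$ with $n \cdot v_i = 0$ for every $i$, so all of the $v_i$ lie in the hyperplane with normal $n$.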
If you apply a rotation matrix to a matrix of linearly dependent vectors, then it makes sense to me that you could rotate this subspace so that one particular component is zero for every vector in that matrix, producing a zero column.
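And my attempt at the rotation step in symbols (taking the vectors as the rows of a matrix $A$, with $n$ the unit normal from above and $e_N$ the last standard basis vector):
$$
\exists\, R \in SO(N) \text{ with } Rn = e_N
\quad\Longrightarrow\quad
e_N \cdot (R v_i) = (Rn)\cdot(R v_i) = n \cdot v_i = 0 \;\text{ for all } i,
$$
using that a rotation preserves dot products. So every rotated vector $R v_i$ has last component zero, and the matrix $A R^{T}$ (whose rows are the $R v_i$) has a zero $N$-th column.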
This method makes sense to me because a zero column (or row) in a square matrix gives a determinant of zero, in the same way that row reduction producing a zero row does. What I want to know is: have I miraculously come to the right conclusion through wrong methods, so that the method fails in some cases, or is the method sound and does it work in all of the cases described?
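For what it's worth, here is a quick numerical sanity check I put together in Python/NumPy (the particular vectors, the use of the SVD to get the plane's unit normal, and the variable names are just my own choices for illustration):

```python
import numpy as np

# Three linearly dependent row vectors in R^3 (the third is a combination
# of the first two), so they all lie in a 2-D plane through the origin.
v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([4.0, 5.0, 6.0])
v3 = 2.0 * v1 - 1.0 * v2           # linearly dependent on v1 and v2
A = np.vstack([v1, v2, v3])        # rows of A are the vectors

# A unit normal to the plane spanned by the rows: the right-singular
# vector belonging to the (near-)zero singular value.
_, s, Vt = np.linalg.svd(A)
n = Vt[-1]                         # satisfies n . v_i ~ 0 for every row v_i

# Orthogonal matrix whose last row is n, so it sends n to e3; flip one of
# the other rows if needed to make it a proper rotation (det = +1).
R = Vt.copy()
if np.linalg.det(R) < 0:
    R[0] *= -1

B = A @ R.T                        # rotate every row vector by R

print(np.round(B, 10))             # third column is (numerically) zero
print("det(A) =", np.linalg.det(A))
print("det(B) =", np.linalg.det(B))  # both essentially zero, as expected
```

Running this, the rotated matrix has a (numerically) zero third column and both determinants come out as roundoff-level zeros, which is what led me to believe the argument above.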