Actually, a more precise statement would be:
For every square matrix $M$ the following propositions are equivalent:
- $M$ is nonsingular;
- the rows of $M$ form a set of linearly independent vectors;
- the columns of $M$ form a set of linearly independent vectors.
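To keep a concrete picture in mind, here is a tiny example (my own choice of numbers) where all three properties fail at once:
$$M=\begin{pmatrix}1&2\\2&4\end{pmatrix},\qquad \det M = 1\cdot 4 - 2\cdot 2 = 0,$$
and indeed the second row is twice the first, while the second column is twice the first.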
As for the intuition, let's assume we already have a good understanding of why the first two statements are equivalent, which in some way relies on the fact that a linear system of equations can be expressed as
$$M\vec x= \vec b,$$
where there is an immediate correspondence between rows of $M$ and equations.
I guess it is not that hard to feel that a unique solution is linked to linear independence of the rows/equations; but uniqueness of the solution is also equivalent to non-singularity of $M$, since in that case it is immediate that
$$ \vec x=M^{-1}\vec b$$
(it is immediate that this gives a unique solution; the converse is a little less obvious, but let's say it is intuitive enough).
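For the direct part, the derivation really is one line (just left-multiply by $M^{-1}$):
$$M\vec x=\vec b \;\Longrightarrow\; M^{-1}M\vec x=M^{-1}\vec b \;\Longrightarrow\; \vec x=M^{-1}\vec b.$$
As for the converse, the usual sketch is: if $M$ were singular, there would be some $\vec v\neq\vec 0$ with $M\vec v=\vec 0$, and then $\vec x+\vec v$ would be a second solution, contradicting uniqueness.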
Now: what if we form another system of linear equations, one in which the values $x_i$ in $\vec x$ are given by
$${\vec x}{}^t M ={\vec b}{}^t,$$
which is just as linear as the system $M \vec x=\vec b$ was? (Note: $\vec x$ and $\vec b$ here are not necessarily the same as before, but I'll keep the names to simplify notation.)
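Writing this new system out componentwise makes the role of the columns explicit: with $m_{ij}$ denoting the entry of $M$ in row $i$, column $j$, the $j$-th equation reads
$$\sum_{i} x_i\, m_{ij} = b_j,\qquad j=1,\dots,n,$$
so now there is one equation per column of $M$, exactly mirroring the row picture above.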
Since non-singularity leads to the equivalent expression
$${\vec x}{}^t = {\vec b}{}^t M^{-1},$$
how could it not be equivalent to the linear independence of the columns of $M$? Wouldn't it be really odd if this failed while the equivalence between non-singularity of $M$ and independence of its rows held, just because we write horizontal lines from top to bottom and vertical lines from left to right?
You could even see that
$${\vec x}{}^t M ={\vec b}{}^t \qquad \iff \qquad M^t \vec x=\vec b$$
and put it in terms of rows again, but now these are the rows of $M^t$ (which, of course, are the columns of $M$, themselves transposed). And now our initial proposition also seems to be closely related (equivalent, perhaps?) to the property
$$\left(M^{-1}\right)^t=\left(M^t\right)^{-1}$$
which is neither difficult to prove nor hard to understand in detail.
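In fact the proof fits in one line: since transposition reverses the order of products,
$$M^t\left(M^{-1}\right)^t=\left(M^{-1}M\right)^t=I^t=I,$$
and symmetrically $\left(M^{-1}\right)^tM^t=\left(MM^{-1}\right)^t=I$, so $\left(M^{-1}\right)^t$ is indeed the inverse of $M^t$.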
It is important to note that this is just a particular case of the more general fact that the dimension of the row space and the dimension of the column space of any matrix (square or not) are the same; this common dimension is a characteristic of each matrix, known as its rank. Notice that only when the matrix is square does this result translate into "linear independence of the columns is equivalent to linear independence of the rows": when the numbers of rows and columns differ, at most one of those sets can be linearly independent (the one with fewer vectors), and it may well happen that neither is. Of course, the intuition behind the equivalence of column rank and row rank is a little trickier than our reasoning above.
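A small non-square example (again, just an illustration of my own) may help:
$$A=\begin{pmatrix}1&0&1\\0&1&1\end{pmatrix}$$
has two linearly independent rows, so its rank is $2$; consistently, its three columns live in $\mathbb R^2$ and cannot all be independent (here the third column is the sum of the first two).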