There are multiple ways to interpret your question; I can think of two:
(1) Saying a matrix $M$ is invertible is equivalent (or, depending on your definitions, identical) to saying that there exists a matrix $M'$ such that $MM'=I$ and $M'M=I$, where $I$ is the identity matrix, with $1$'s on the diagonal and $0$'s elsewhere. Multiplication of infinite-dimensional matrices can be defined analogously to the finite case (although the entries are now infinite sums, which may not converge), and $I$ still makes sense. Therefore, we can say that an infinite-dimensional matrix is invertible if there exists $M'$ such that $MM'$ and $M'M$ are well-defined, and both are equal to $I$. Note that, unlike in the finite-dimensional case, neither of the two equations implies the other, so both really must be checked.
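To see concretely why a one-sided inverse is not enough in infinite dimensions, here is a small Python sketch (my own illustration, not part of the question): an infinite matrix is represented as a function `entry(i, j)`, and the left-shift matrix $L$ and right-shift matrix $R$ satisfy $LR=I$ but $RL\neq I$.

```python
# Sketch: an "infinite matrix" as a function of its indices (i, j).
def R(i, j):
    # Right-shift matrix: 1 on the subdiagonal, i.e. it sends e_j to e_{j+1}.
    return 1 if i == j + 1 else 0

def L(i, j):
    # Left-shift matrix: 1 on the superdiagonal (the transpose of R).
    return 1 if j == i + 1 else 0

def product(A, B, i, j, terms=1000):
    # Entry (i, j) of AB is an infinite sum over k; truncating is exact here
    # because the shift matrices have only one nonzero entry per row/column.
    return sum(A(i, k) * B(k, j) for k in range(terms))

# LR agrees with the identity on every sampled entry...
assert all(product(L, R, i, j) == (1 if i == j else 0)
           for i in range(5) for j in range(5))

# ...but RL has a 0 where I has a 1, so R has a left inverse that is
# not a right inverse: R is not invertible.
assert product(R, L, 0, 0) == 0
assert product(R, L, 1, 1) == 1
```

This is exactly the failure mode that cannot occur for finite square matrices, where $MM'=I$ automatically forces $M'M=I$.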
(2) If you want to stick to the idea of linearly independent vectors: infinite-dimensional vector spaces exist, and the concept of linear independence carries over. For example, $\Bbb R^{\Bbb N}$, the space of real-valued sequences with term-wise sum and zero vector $(0,0,0,\ldots)$, is a vector space. The vectors $a_n=(0,\ldots,0,1,1,\ldots)$, which are $0$ up to the $n^\text{th}$ term and $1$ afterwards, are linearly independent. Therefore, the infinite-dimensional matrix $M$ whose $n^\text{th}$ column is exactly $a_n$ has linearly independent columns, and so, extending the finite-dimensional criterion, we could consider $M$ to be invertible.
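As a quick numerical sanity check of that independence claim, here is a sketch using finite truncations of the $a_n$ (an approximation I'm introducing; the real vectors are infinite): stacking the first $k$ truncated vectors as columns gives a matrix of full column rank.

```python
import numpy as np

def a(n, length):
    # Truncation of a_n: 0 in the first n coordinates, 1 afterwards.
    v = np.zeros(length)
    v[n:] = 1.0
    return v

k, length = 6, 10
# Columns are the truncations of a_1, ..., a_6.
A = np.column_stack([a(n, length) for n in range(1, k + 1)])
print(np.linalg.matrix_rank(A))  # prints 6: the columns are independent
```

The rank equals the number of columns for every truncation long enough to distinguish the vectors, which is what linear independence of the $a_n$ predicts (indeed, $a_n - a_{n+1}$ is a standard basis vector, which gives a direct proof).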
I'm not sure whether these two notions are equivalent, and in general infinite-dimensional matrices aren't the best tool for studying infinite-dimensional vector spaces, but I hope this gives you some idea of what it might entail :)