Theorem. An $n \times n$ matrix $A$ is invertible if and only if $Ax = 0$ has only the trivial solution.
Proof. Suppose $A$ is invertible. Premultiplying the equation $Ax = 0$ by a sequence of elementary matrices yields the equivalent system $Rx = 0$, where $R$ is a row-reduced echelon matrix. Since elementary matrices are invertible, and products of invertible matrices are invertible, $A$ is invertible if and only if $R$ is invertible. But a square row-reduced echelon matrix is invertible if and only if it is the identity, in which case $Rx = 0$ has only the trivial solution $x = 0$.
Conversely, if $Ax = 0$ has only the trivial solution, then so does $Rx = 0$, where $R$ is the row-reduced echelon form of $A$. But $Rx = 0$ has only the trivial solution if and only if $R$ is the identity matrix, and $R = I$ if and only if $A$ is invertible. $\tag*{$\blacksquare$}$
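The equivalence in the theorem can be checked numerically; the following is a minimal sketch using NumPy (the specific matrices are illustrative choices, not from the text). An invertible matrix has full rank, so $Ax = 0$ forces $x = 0$, while a singular matrix admits a nontrivial null vector.

```python
import numpy as np

# Invertible case: Ax = 0 has only the trivial solution,
# equivalently A has full rank n.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
assert np.linalg.matrix_rank(A) == 2          # null space of A is {0}
assert np.allclose(A @ np.linalg.inv(A), np.eye(2))  # A is invertible

# Singular case: Bx = 0 has a nontrivial solution.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # second row is twice the first
assert np.linalg.matrix_rank(B) < 2
v = np.array([2.0, -1.0])    # a nonzero vector with B v = 0
assert np.allclose(B @ v, 0)
```

Here rank serves as the computational stand-in for "only the trivial solution": rank $n$ means the row-reduced echelon form is $I$.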
This is why if $(\lambda I - A)v = 0$ for a nonzero vector $v$, then $\lambda I - A$ must be non-invertible: such a $v$ is a nontrivial solution, which the theorem rules out for invertible matrices.
Consider the function $f(x) = \det(xI - A)$. This is a polynomial in $x$ (the characteristic polynomial of $A$), and if $\lambda$ is an eigenvalue of $A$, the discussion above shows that $\det(\lambda I - A) = 0$, or in other words that $\lambda$ is a root of $f$. Conversely, if $\lambda$ is a root of $f$, then $\det(\lambda I - A) = 0$, so $\lambda I - A$ is singular, and hence $(\lambda I - A)v = 0$ has a nonzero solution $v$. In other words, $\lambda$ is an eigenvalue of $A$ with eigenvector $v$.
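The correspondence between roots of $f$ and eigenvalues can also be sketched numerically with NumPy (the matrix below is an illustrative choice): the roots of the characteristic polynomial coincide with the computed eigenvalues, and each eigenvalue $\lambda$ makes $\lambda I - A$ singular on its eigenvector.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Coefficients of f(x) = det(xI - A); for this A, f(x) = x^2 - 7x + 10.
coeffs = np.poly(A)
roots = np.sort(np.roots(coeffs))

# The roots of f are exactly the eigenvalues of A.
eigvals, eigvecs = np.linalg.eig(A)
assert np.allclose(roots, np.sort(eigvals))

# Each eigenvalue lam satisfies (lam*I - A) v = 0 for its
# corresponding nonzero eigenvector v (columns of eigvecs).
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose((lam * np.eye(2) - A) @ v, 0)
```

For this matrix the roots (and eigenvalues) are $2$ and $5$, matching $f(x) = (x-2)(x-5)$.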