There is a theorem that states that eigenvectors $v_1, v_2, \dotsc, v_k$ corresponding to distinct eigenvalues $\lambda_1, \lambda_2, \dotsc, \lambda_k$ are always linearly independent (reference).
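As a quick numerical illustration of that theorem (a minimal sketch using NumPy; the specific matrix is just an example chosen here, not something from the statement above):

```python
import numpy as np

# Example matrix with three distinct eigenvalues: 2, 3, 5 (upper triangular,
# so the eigenvalues can be read off the diagonal).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)  # columns of `eigenvectors` are v_1, ..., v_k
print(eigenvalues)                            # three distinct values

# Distinct eigenvalues => the eigenvectors are linearly independent,
# i.e. the matrix whose columns are v_1, ..., v_k has full rank.
print(np.linalg.matrix_rank(eigenvectors))    # 3
```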
If we're in $\mathbb{R}^n$, then the $k$ linearly independent eigenvectors form a basis for $\mathbb{R}^n$ exactly when $k = n$ (reference). Therefore we need the algebraic and geometric multiplicities to be equal for each eigenvalue (which is also the condition for $A$ being diagonalizable): the algebraic multiplicities always sum to $n$, while the number of linearly independent eigenvectors belonging to each eigenvalue is its geometric multiplicity, so we only get $n$ linearly independent eigenvectors in total when the two multiplicities agree for every eigenvalue. Also, if $k \neq n$, then $P$ (whose columns are the eigenvectors) is not square, so it can't be invertible.
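To see the multiplicity condition fail, here is a sketch using the standard $2 \times 2$ Jordan-block (shear) example, again with NumPy:

```python
import numpy as np

# Eigenvalue 1 has algebraic multiplicity 2 but geometric multiplicity 1:
# there is only one linearly independent eigenvector.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)            # [1., 1.] -- the algebraic multiplicities sum to n = 2

# np.linalg.eig still returns two columns, but they are (numerically) parallel,
# so the candidate P is rank-deficient and therefore not invertible.
print(np.linalg.matrix_rank(eigenvectors))   # 1, not 2
```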
The $n$ linearly independent eigenvectors form a basis $\mathcal{B}$. The matrix $P$, whose columns are these eigenvectors, is thus the change-of-basis matrix from $\mathcal{B}$ to the standard basis: it takes the coordinates of a vector relative to $\mathcal{B}$ to its standard coordinates. A change-of-basis matrix always has an inverse (proof).
Therefore $P$ is invertible $\Leftrightarrow$ $A$ is diagonalizable $\Leftrightarrow$ the geometric multiplicity of every eigenvalue equals its algebraic multiplicity $\Leftrightarrow$ $A$ has $n$ linearly independent eigenvectors ($k = n$) $\Leftrightarrow$ $P$ is square. Note that having $n$ distinct eigenvalues is sufficient but not necessary for this: the identity matrix has only one eigenvalue, yet it is certainly diagonalizable.
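Putting it together, a minimal NumPy sketch of the diagonalization itself (the matrix below is just an example with distinct eigenvalues, chosen for illustration):

```python
import numpy as np

# A symmetric 3x3 matrix with three distinct eigenvalues, hence diagonalizable.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

eigenvalues, P = np.linalg.eig(A)   # columns of P are the eigenvectors (the basis B)
D = np.diag(eigenvalues)

# P is square and invertible, and A = P D P^{-1}.
P_inv = np.linalg.inv(P)
print(np.allclose(A, P @ D @ P_inv))    # True

# P acts as the change-of-basis matrix: it maps B-coordinates to standard coordinates.
x_B = np.array([1.0, -2.0, 0.5])        # coordinates of some vector relative to B
x_std = P @ x_B                         # the same vector in standard coordinates
print(np.allclose(P_inv @ x_std, x_B))  # True
```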