I'm wondering if the following basic facts about matrix inversion are correct:
1) For any matrix $A$, if $A$ has both a right inverse and a left inverse, then they are equal; this common matrix is unique and is called the inverse of $A$.
2) If $A$ is square, then any left inverse is a right inverse and vice versa.
3) If $A$ is $m \times n$ with $n > m$, then $A$ can only have a right inverse, and if it has one, then it has infinitely many (question: is it true that it has infinitely many if and only if the underlying field $\mathbb{F}$ has characteristic $0$?); a quick numerical check of this item is included after the list.
4) If $A$ is $m \times n$ with $n < m$, then the exact same statements as in 3 hold with "right" replaced by "left".
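Here is the quick numerical check of 3 mentioned above (a sketch only, over $\mathbb{R}$, using one specific $2 \times 3$ matrix and NumPy's pseudoinverse as one convenient right inverse):

```python
import numpy as np

# Fact 3 for a concrete 2x3 real matrix (m = 2, n = 3, so n > m)
A = np.array([[1., 0., 1.],
              [0., 1., 1.]])

# A has full row rank, so a right inverse exists; the pseudoinverse is one of them
B = np.linalg.pinv(A)                    # shape 3x2
print(np.allclose(A @ B, np.eye(2)))     # True: A B = I_2

# A cannot have a left inverse: for any 3x2 matrix C,
# rank(C A) <= rank(A) = 2 < 3, so C A can never equal I_3
print(np.linalg.matrix_rank(A))          # 2
```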
Typically, 2 is proven using elementary matrices; is there a way to do this from the finite-dimensional linear operator point of view rather than the matrix point of view? For example, for an $m \times n$ matrix $A$ with $n > m$, you can argue that $Ax = 0$ has non-trivial solutions either by considering free variables or by using the dimension formula.
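To spell that out, the dimension-formula argument is just rank–nullity applied to the map $x \mapsto Ax$ from $\mathbb{F}^n$ to $\mathbb{F}^m$:
$$\dim \ker A = n - \operatorname{rank} A \geq n - m > 0,$$
so there is some $x_0 \neq 0$ with $Ax_0 = 0$.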
Similarly, for 3, you can argue as follows: if $B$ is a right inverse of $A$, then since $Ax = 0$ has a non-trivial solution $x_0$, we can find a non-zero matrix $B'$ such that $AB' = 0$ (take the columns of $B'$ to be scalar multiples of $x_0$, i.e. $b_j = c_j \cdot x_0$), and then the family $B + B'$ gives infinitely many right inverses of $A$. Is there a good way to see this from the linear operator point of view?
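Here is that construction carried out numerically for one particular $A$ (again just a sketch over $\mathbb{R}$: the null vector $x_0$ is chosen by hand, and the coefficients $c_j$ are arbitrary):

```python
import numpy as np

A = np.array([[1., 0., 1.],
              [0., 1., 1.]])              # 2x3, full row rank
B = np.linalg.pinv(A)                     # one right inverse: A B = I_2

x0 = np.array([1., 1., -1.])              # non-trivial solution of A x = 0, chosen by hand
assert np.allclose(A @ x0, 0)

# The columns of B' are scalar multiples of x0, so A B' = 0
for c in [0.0, 1.0, -3.5, 7.0]:
    B_prime = np.outer(x0, [c, 2 * c])    # b_1 = c * x0, b_2 = 2c * x0
    assert np.allclose(A @ (B + B_prime), np.eye(2))

print("each B + B' in the family is again a right inverse of A")
```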
Finally, are there good rules of thumb or heuristics for identifying the types of problems where one point of view may be more appropriate, efficient, or insightful?