In the lectures I am following, we are trying to show that
$$AB = I \implies B = A^{-1},$$
given that $A$ and $B$ are $n \times n$ square matrices. Of course, we don't yet know whether $A$ and $B$ are invertible or nonsingular, etc.; we first need to establish this. The lecture then observes that for any $\vec y \in \mathbb{R}^n$,
$$A(B\vec y) = \vec y,$$
so for every $\vec y$ the system $A\vec x = \vec y$ has the solution $\vec x = B\vec y$; that is, the system with coefficient matrix $A$ is consistent for every right-hand side. The proof then concludes that $A$ must be nonsingular.
I am lost at this step. How did we jump from consistency to the conclusion that $A$ is nonsingular? Consistency only tells me a solution exists; how can I know that the solution $B\vec y$ is unique for each $\vec y$? Couldn't the system have infinitely many solutions?
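To pin down exactly where I am stuck, here is the definition I am working from (my own paraphrase, not the lecture's wording): a square matrix $A$ is nonsingular when
$$A\vec x = \vec 0 \implies \vec x = \vec 0,$$
which, as I understand it, is equivalent to saying that $A\vec x = \vec y$ has *at most one* solution for each $\vec y$. The argument above only seems to show that $A\vec x = \vec y$ has *at least one* solution, namely $\vec x = B\vec y$, so I don't see how uniqueness follows.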
For reference, this is from Theodore Shifrin's Math 3500 lectures on YouTube, Day 33, around the 35:00 mark.