Let $A$ be an $n\times n$ matrix that is invertible on the left; I just want to prove that it is also invertible on the right (the rest is obvious). All I can say is that there is a $B$ such that $BA=I.$ To prove $AB=I$, I am stuck. I have that $$AB=AB^2A=BA^2B,$$ but I can't conclude that this equals $I$.
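For reference, both equalities in the display use only $BA=I$; grouping the factors makes this explicit:
$$AB^2A = (AB)(BA) = (AB)\,I = AB, \qquad BA^2B = (BA)(AB) = I\,(AB) = AB.$$
So all this manipulation yields is that $AB$ is idempotent: $(AB)^2 = A(BA)B = AB$.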
-
Is it a square matrix? – Jul 10 '16 at 18:59
-
Yes it is. But an invertible matrix is necessarily square, isn't it? @G.Sassatelli – user349449 Jul 10 '16 at 19:00
-
You can continue if you would like to: $= BA^2B^2A = BA\,AB\,BA$ – mathreadler Jul 10 '16 at 19:00
-
@mathreadler: And so? – user349449 Jul 10 '16 at 19:01
-
First show that $B$ has a left inverse $C$. Then $A = CBA = C$ shows that they are equal, and hence $AB = CB = I$. – Jul 10 '16 at 19:06
-
I don't know, but it almost sounded like a sheep when I read it out loud. – mathreadler Jul 10 '16 at 19:07
-
@mathreadler: Not a very respectful answer... but funny anyway :-) – user349449 Jul 10 '16 at 19:09
-
@MathBeginner A non-zero $(1\times n)$ matrix with $n>1$ is right invertible, but not left invertible. Leaving hypotheses implicit forces elementary questions to be asked explicitly. – Jul 10 '16 at 19:18
3 Answers
Assuming $A\in \mathbb{R}^{n\times n}$, let $f_A : \mathbb{R}^n \to \mathbb{R}^n,\ x\mapsto Ax$ be the associated linear map. Since $BA = I$, we have $f_B\circ f_A = \operatorname{id}$, so $f_A$ is injective. But then $f_A$, as a linear map between finite-dimensional vector spaces of the same dimension, is also surjective, hence bijective, and so $A$ is invertible.
This is because $n = \operatorname{dim}(\ker f_A) + \operatorname{dim}(\operatorname{im} f_A)$. Since $f_A$ is injective, $\operatorname{dim}(\ker f_A) = 0$, so $\operatorname{dim}(\operatorname{im} f_A) = n$, hence $f_A$ is surjective.
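To spell out how this gives $AB = I$: the inverse of a bijective linear map is again linear, so $A$ has a genuine two-sided inverse matrix $A^{-1}$ (the matrix of $f_A^{-1}$), and then
$$B = B\,I = B(AA^{-1}) = (BA)A^{-1} = I\,A^{-1} = A^{-1}, \qquad \text{hence } AB = AA^{-1} = I.$$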

-
Something tells me this is using techniques not available to the OP yet. – Dan Rust Jul 10 '16 at 19:05
-
I really like your answer (very powerful). But I can't use the rank theorem. Sorry. – user349449 Jul 10 '16 at 19:07
-
@MathBeginner Oh okay. Surely somebody here will come up with a more elementary answer. – Stefan Perko Jul 10 '16 at 19:09
-
@MathBeginner There is no proof possible without using the finite-dimension hypothesis in some way, counting dimension somehow. This is because the conclusion is false in infinite dimension, the easiest example being the left-shift and right-shift operators on countable sequences: in one order the identity, in the other order not. https://en.wikipedia.org/wiki/Shift_operator#Sequences – Will Jagy Jul 10 '16 at 19:50
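To make that example concrete (the standard construction, written out on the space of sequences $(x_1, x_2, x_3, \dots)$): let $L$ be the left shift and $R$ the right shift,
$$L(x_1,x_2,x_3,\dots) = (x_2,x_3,x_4,\dots), \qquad R(x_1,x_2,x_3,\dots) = (0,x_1,x_2,\dots).$$
Then $LR = \operatorname{id}$, while $RL(x_1,x_2,x_3,\dots) = (0,x_2,x_3,\dots) \neq \operatorname{id}$, so a one-sided inverse need not be two-sided once the dimension is infinite.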
-
@WillJagy I believe the OP specifically referred to the rank–nullity theorem, not just to finite dimensions. – Stefan Perko Jul 10 '16 at 19:52
If you like, you can try $B = \text{adj}(A)/\det(A)$ and show by brute-force computation that $BA = I = AB$.
You would only need $B^jA_i = \delta_{ij}$ and $A^iB_j = \delta_{ij}$. (A superscript denotes a row and a subscript a column, so $B^j$ is the $j$-th row of $B$ and $A_i$ is the $i$-th column of $A$; $\delta_{ij}$ is the indicator of $i = j$.)
You only need to think about the relationship between the definitions of $\text{adj}$ and $\det$. See https://en.wikipedia.org/wiki/Adjugate_matrix
If, additionally, you know how the transpose relates to the inverse, I think you only need $B^jA_i = \delta_{ij}$.
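The identity behind this choice of $B$ is $\text{adj}(A)\,A = A\,\text{adj}(A) = \det(A)\,I$ (this is the content of the Wikipedia page linked above). As a sanity check in the smallest nontrivial case, for a $2\times 2$ matrix:
$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}, \qquad \text{adj}(A) = \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}, \qquad \text{adj}(A)\,A = \begin{pmatrix} ad-bc & 0 \\ 0 & ad-bc \end{pmatrix} = \det(A)\,I.$$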

-
Both of these quantities have easy expressions in terms of the entries of the matrix. It would complicate the proof if we also had to show that the determinant is non-zero.
I think you may be right, in any case, that this isn't the spirit of the question, but I can't think of a more elementary way right now.
– Mark Jul 10 '16 at 19:53
I'll assume that you have the result that $\det AB=\det A \det B$ and that the determinant of a matrix is the product of its eigenvalues.
Suppose $BA=I$ and $AB\neq I$. Then there exists $u$ such that $ABu=v$ with $v\neq u$. Applying $B$ gives $BABu=Bv$, and since $BA=I$ this means $Bu=Bv$. This gives us $B(u-v)=0$ with $u-v\neq 0$, so $u-v$ is an eigenvector of $B$ associated to the eigenvalue $0$, and hence $\det B=0$.
It follows that $\det(BA) = (\det B)(\det A) = 0\cdot(\det A) = 0$. But $BA=I$, so $\det(BA) = \det I = 1$, which is a contradiction.
It follows that if $BA=I$ then $AB=I$.

-
$ABu = v \Rightarrow AB(u-v) = 0$: shouldn't it be $ABu = v \Rightarrow ABu - v = 0$? – Patrick Abraham Jul 10 '16 at 20:00
-
@PatrickAbraham Actually it becomes slightly simpler using the correct maths – Dan Rust Jul 10 '16 at 20:07
-
You could actually get rid of eigenvalues and eigenvectors, since I don't think the OP can use them. Just go for $w = u - v \neq 0$ and $w \in \ker(B)$. – Patrick Abraham Jul 10 '16 at 20:10
-
$\ker B$ being non-zero implying that $B$ is not invertible would take some work though. – Dan Rust Jul 10 '16 at 20:11
-
That would mean $\operatorname{rank}(B) \neq n$, which implies $\operatorname{rank}(BA) \neq n$. Thus $BA \neq I$. – Patrick Abraham Jul 10 '16 at 20:11
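For completeness, the step from $\operatorname{rank}(B)\neq n$ to $\operatorname{rank}(BA)\neq n$ rests on the standard rank inequality
$$\operatorname{rank}(BA) \le \min\{\operatorname{rank}(B),\,\operatorname{rank}(A)\} \le \operatorname{rank}(B) < n = \operatorname{rank}(I),$$
so indeed $BA \neq I$ would follow, contradicting the hypothesis $BA = I$.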
-
I guess it really comes down to what tools are at the disposal of the OP. Various results that one would naturally prove while progressing through a course in linear algebra would make this question successively simpler compared to proving it from first principles. – Dan Rust Jul 10 '16 at 20:14
-
Using the determinant, the result is of course obvious! So no, I can't use the determinant. – user349449 Jul 10 '16 at 22:11