10

I am wondering about the following. Consider a system of $m$ linear equations in $n$ unknowns, written as $AX=B$. Let $L$ be a left inverse of $A$, so that $LA=I$. From $AX=B$ we get $LAX=LB$, which implies $X=LB$. Up to this point I have no problem, but multiplying $X=LB$ on the left by $A$ gives $AX=ALB$, hence $B=ALB$. So does this also imply $AL=I$?

Rupsa
  • Just as a note, it's easy to show by basic algebra that if $LA=I$, then $(AL)^2 = ALAL = A(LA)L = AL$, i.e. $AL$ is idempotent. If it's also invertible, then multiplying both sides by $(AL)^{-1}$ gives $AL=I$, so the only way $AL$ can fail to be the identity is if it's not invertible. – Ilmari Karonen Sep 03 '15 at 16:16
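
Both the derivation in the question and the idempotence noted in the comment above are easy to check numerically. Here is a minimal NumPy sketch (the $3\times 2$ matrix $A$ is an arbitrary example, and $L$ is taken to be the Moore–Penrose pseudoinverse, one possible left inverse):

```python
import numpy as np

# A: 3x2 with full column rank, so it has left inverses but is not square.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
L = np.linalg.pinv(A)   # one left inverse: L = (A^T A)^{-1} A^T

X = np.array([[2.0], [3.0]])
B = A @ X               # a consistent right-hand side: B lies in the column space of A

print(np.allclose(L @ A, np.eye(2)))      # True:  LA = I
print(np.allclose(L @ B, X))              # True:  X = LB recovers the solution
print(np.allclose(A @ L @ B, B))          # True:  B = ALB for this consistent B
print(np.allclose(A @ L, np.eye(3)))      # False: AL is not the identity
print(np.allclose(A @ L @ A @ L, A @ L))  # True:  AL is idempotent (a projection)
```

Here $AL$ fixes every vector in the column space of $A$ (which is exactly why $B=ALB$ holds), but it is a projection onto that column space, not the identity on all of $\mathbf R^3$.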

4 Answers

7

If $A$ is square and of full rank, and $L$ is a left inverse and $R$ a right inverse of $A$ (both exist in this case), then from

$$LA = I$$

we get (multiplying both sides on the right by $R$)

$$LAR = IR\\ LI = IR\\ L = R$$

However, if $A$ is not square, then at least one of the two one-sided inverses does not exist, and the other, if it exists, is not unique, so you cannot draw the same conclusion.

For example, $A=\begin{bmatrix}1 & 1\end{bmatrix}$ has more than one right inverse: $$B_t=\begin{bmatrix}\frac12-t\\ \frac12+t\end{bmatrix}$$ is a right inverse of $A$ for every $t$, but $A$ has no left inverse, because for every $C\in\mathbb R^{2\times 1}$ the matrix $CA$ has rank $1$ or $0$, so it cannot equal $I$, which has rank $2$.
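
This family is quick to verify numerically; a NumPy check of the example above, for a few arbitrary values of $t$:

```python
import numpy as np

A = np.array([[1.0, 1.0]])   # the 1x2 matrix from the example

for t in (-2.0, 0.0, 3.5):
    B_t = np.array([[0.5 - t],
                    [0.5 + t]])               # 2x1 column
    print(np.allclose(A @ B_t, [[1.0]]))      # True: A B_t = I (1x1) for every t
    print(np.linalg.matrix_rank(B_t @ A))     # 1: so B_t A can never be the 2x2 identity
```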

5xum
  • Your conclusion (true for square matrices, false for rectangular matrices) holds true. But I don't understand your argument for square matrices, and I believe that no such argument can work. – darij grinberg Sep 03 '15 at 14:58
  • @darijgrinberg Assumption 1: $LA = I$. Assumption 2: $AR = I$. Then, from assumption 1, it follows that $(LA)R = IR$. Matrix multiplication is associative, so $L(AR) = IR$. From assumption 2, it follows that $LI = IR$. From the properties of $I$, it follows that $L = R$. I see no error in my argument. – 5xum Sep 03 '15 at 15:01
  • Ah! You are assuming the existence of a right inverse, though. It is better to cite the place you are getting it from, as it is the larger part of the work. – darij grinberg Sep 03 '15 at 15:01
  • @darijgrinberg OK, I agree with that, but my answer is not wrong (as I initially said "if $R$ is its right inverse"). I do agree it is lacking. – 5xum Sep 03 '15 at 15:03
  • To see that the existence of a left and a right inverse are equivalent, see http://math.stackexchange.com/questions/3852/if-ab-i-then-ba-i – 5xum Sep 03 '15 at 15:08
  • @Auburn As you can see from my answer, that is clearly not true, since $A=[1,1]$ has infinitely many right inverses and zero left inverses. If you transpose it, you get a matrix that has infinitely many left inverses and zero right inverses. What you are saying is only true if $A$ is square. – 5xum Jun 19 '16 at 21:08
6

Certainly not in general.

Let's see this from the point of view of linear maps: $A$ is the matrix associated with a linear map $f\colon\mathbf R^m\to\mathbf R^n$, and $L$ is associated with a linear map $u\colon\mathbf R^n\to\mathbf R^m$. $LA=I_m$ means $u\circ f=\operatorname{id}_{\mathbf R^m}$, which implies that $f$ is injective and $u$ is surjective.

On the other hand $AL=I_n$ would mean $\;f\circ u=\operatorname{id}_{\mathbf R^n}$, which would imply $f$ surjective and $u$ injective, whence both would be isomorphisms.

This is of course impossible if $m\neq n$. If $m=n$, we know that for an endomorphism in finite dimension, injective $\iff$ surjective $\iff$ bijective.
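
As a concrete instance of this reasoning (a minimal example, with $m=1$, $n=2$): take, say, $f\colon\mathbf R\to\mathbf R^2$, $f(x)=(x,0)$, and $u\colon\mathbf R^2\to\mathbf R$, $u(x,y)=x$. Then $u\circ f=\operatorname{id}_{\mathbf R}$, so $f$ is injective and $u$ is surjective; but $(f\circ u)(x,y)=(x,0)$, so $f\circ u\neq\operatorname{id}_{\mathbf R^2}$, matching the fact that $f$ is not surjective.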

Bernard
3

It's not true for non-square matrices. Consider $$\begin{pmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \end{pmatrix}\begin{pmatrix} 1 & -1 \\ 0 & 0 \\ 0 & 1\end{pmatrix}=\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$$ and $$\begin{pmatrix} 1 & -1 \\ 0 & 0 \\ 0 & 1\end{pmatrix}\begin{pmatrix} 1 & 1 & 1 \\ 0 & 1 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 1 & 1\end{pmatrix}.$$
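
These products are quick to verify numerically; a NumPy sketch of the two matrices above (the names $A$ and $R$ are mine):

```python
import numpy as np

A = np.array([[1, 1, 1],
              [0, 1, 1]])
R = np.array([[1, -1],
              [0,  0],
              [0,  1]])

print(A @ R)  # [[1 0]
              #  [0 1]]       -> AR is the 2x2 identity
print(R @ A)  # [[1 0 0]
              #  [0 0 0]
              #  [0 1 1]]     -> RA is not the 3x3 identity
```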

Falko
3

Case 0. If $A$ is square, then the answer is YES. Use determinants: suppose $LA = I$. Then $\mathrm{det}(L)\mathrm{det}(A) = 1$. So $\mathrm{det}(A)$ is non-zero. So $A$ has a two-sided inverse. Now use:

Proposition. If an element $A$ of a monoid has a two-sided inverse $A^{-1}$, then every left inverse of $A$ equals $A^{-1}$, hence is a two-sided inverse. (Indeed, $LA=I$ gives $L = L(AA^{-1}) = (LA)A^{-1} = A^{-1}$.)

So from $LA = I$ we deduce that $AL=I$.

Case 1. If $A$ isn't square, then the answer is a big NO. For example, define a matrix $A$ as follows: $$A = \begin{bmatrix}1\\0\end{bmatrix}$$

For each $\lambda \in \mathbb{R}$, define a matrix $L_\lambda$ as follows: $$L_\lambda = \begin{bmatrix}1 & \lambda\end{bmatrix}.$$

Clearly: $$L_\lambda A = 1.$$

But $$A L_\lambda = \begin{bmatrix}1 & \lambda \\ 0 & 0\end{bmatrix},$$ which is never the identity, since its second row is zero.
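
A minimal NumPy sketch of this family (the tested values of $\lambda$ are arbitrary):

```python
import numpy as np

A = np.array([[1.0],
              [0.0]])                          # the 2x1 matrix from the answer

for lam in (-1.0, 0.0, 2.5):
    L_lam = np.array([[1.0, lam]])             # 1x2
    print(np.allclose(L_lam @ A, [[1.0]]))     # True:  L_lambda A = I (1x1)
    print(np.allclose(A @ L_lam, np.eye(2)))   # False: A L_lambda has a zero second row
```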

goblin GONE