
This is Exercise 7, page 21, from Hoffman and Kunze's book.

Let $A$ and $B$ be $2\times 2$ matrices such that $AB=I$. Prove that $BA=I.$

I wrote $BA=C$ and I tried to prove that $C=I$, but I got stuck on that. I am supposed to use only elementary matrices to solve this question.

I know that there is this question, but in those answers they use more than I am allowed to use here.

I would appreciate your help.

  • You can prove this in a low-tech way using row reduction. – Qiaochu Yuan Feb 21 '12 at 19:32
  • I agree with your answers, but I cannot use $\det$ here, just elementary matrices. –  Feb 21 '12 at 19:38
  • related: http://math.stackexchange.com/questions/3852/if-ab-i-then-ba-i/3895#3895 – JavaMan Feb 21 '12 at 19:43
  • @JavaMan: None of those answers can help me. –  Feb 21 '12 at 21:26
  • When you say "use elementary matrices"... what is it you know about the connection between elementary matrices and invertible matrices? What do you know about invertible matrices? – Arturo Magidin Feb 21 '12 at 21:52
  • @ArturoMagidin: The authors haven't defined invertible matrices yet. –  Feb 21 '12 at 21:54
  • I don't know what you have learned so far. If you learned "Rank-nullity theorem", then it can be done by that. Consider the nullity of $B$, it is 0. So $B$ has the full rank. Then use $(BA-I)B=0$. – Sungjin Kim May 15 '13 at 02:01

6 Answers


I will give a sketch of a proof. Let $A= \left( \begin{array}{cc} a & b \\ c & d \end{array} \right) $ and $B= \left( \begin{array}{cc} x & y \\ z & w \end{array} \right) $ such that $AB=I.$ Then we get $\left\{\begin{array}{c} ax + bz = 1 \\ cx + dz = 0 \\ \end{array}\right.$ and $\left\{\begin{array}{c} ay + bw = 0 \\ cy + dw = 1 \\ \end{array}\right.$

I will assume first that $a\neq 0$ (note that $A\neq O$, since there is no $B$ such that $OB=I$). Then we have $x=\frac{1}{a}-\frac{bz}{a}$, and substituting into the second equation gives $(ad-bc)z=-c$. Suppose that $ad=bc$. If $b=0$ or $c=0$, then $d=0$ and $A$ would be one of $\left( \begin{array}{cc} a & b \\ 0 & 0 \end{array} \right)$, $\left( \begin{array}{cc} a & 0 \\ c & 0 \end{array} \right)$, $\left( \begin{array}{cc} a & 0 \\ 0 & 0 \end{array} \right)$; but in each case there is no $B$ such that $AB=I$, since the second row of $AB$ would be zero. So we may assume $b$, $c$, $d$ are all nonzero. Then $a=\frac{bc}{d}$, but in this case the systems above have no solution: the second equation of each system is a multiple of the first, yet the right-hand sides are inconsistent. Hence $ad-bc\neq 0$, and we get $z=\frac{-c}{ad-bc}$. Continuing in this way, we find that $B= \frac{1}{ad-bc}\left( \begin{array}{cc} d & -b \\ -c & a \end{array} \right).$ It is easy to check that $BA=I.$ If instead $a=0$, then we have $b\neq0$ and $\dots$

I don't know how to solve the exercise in a different way. This is my best effort.
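The closed form above is easy to sanity-check numerically. Here is a minimal sketch using exact rational arithmetic; the entries of $A$ are arbitrary choices of mine with $ad-bc\neq 0$, not values from the exercise:

```python
from fractions import Fraction

def mul(X, Y):
    # 2x2 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Arbitrary entries with ad - bc != 0 (my choice, for illustration only)
a, b, c, d = map(Fraction, (2, 3, 1, 4))
det = a * d - b * c  # = 5, nonzero

A = [[a, b], [c, d]]
B = [[d / det, -b / det], [-c / det, a / det]]  # the formula derived above

I2 = [[Fraction(1), Fraction(0)], [Fraction(0), Fraction(1)]]
print(mul(A, B) == I2, mul(B, A) == I2)  # True True
```

This confirms that the single matrix $B$ obtained from the systems is a two-sided identity partner for this particular $A$.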


I know this is old, but I think I have found the answer that was intended. I also struggled with this one for a while because, as spohreis mentioned, you don't have much to go on at the time this is asked (no determinants, no transposes, no inverses even).

That being said, in problem 3 of section 1.4 you prove that all $2\times 2$ row-reduced echelon matrices are of the following form:

$$ \left[ \begin{array}{cc} 0 & 0 \\ 0 & 0 \end{array} \right]\quad,\quad \left[ \begin{array}{cc} 0 & 1 \\ 0 & 0 \end{array} \right]\quad,\quad \left[ \begin{array}{cc} 1 & c \\ 0 & 0 \end{array} \right]\quad,\quad \left[ \begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array} \right] \,.$$

Now assume that $A$ and $B$ are $2 \times 2$ matrices such that $AB=I$. By theorem 5 (pg 12) we have that $B$ is row-equivalent to a row-reduced echelon matrix $R$, and by the corollary to theorem 9 (pg 20) this implies that $B=PR$ where $P$ is a product of elementary matrices. Similarly we have that $A=QT$ (where $Q$ is a product of elementary matrices and $T$ is in row-reduced echelon form).

Now we have that $AB=I \implies QTPR=I$. But clearly $T=I$: if the bottom row of $T$ were all zeros, then the bottom row of $TPR$ would be zero as well, and the product $QTPR$ would have the form $$\left[ \begin{array}{cc} aQ_{11} & bQ_{11} \\ aQ_{21} & bQ_{21} \end{array} \right]$$ for some $a,b\in F$ (the top row of $TPR$), and thus could not be $I$. A similar argument shows that $R=I$. Thus $A$ and $B$ are both actually products of elementary matrices. By theorem 2 (pg 7) each elementary row operation has an inverse, and by theorem 9 (pg 20) each elementary matrix therefore has an inverse. Now we can write

$$\begin{align} AB=QP=E_{q_1}E_{q_2} \cdots E_{q_t}E_{p_1} \cdots E_{p_s}&=I\\ E_{q_1}^{-1}E_{q_1}E_{q_2} \cdots E_{q_t}E_{p_1} \cdots E_{p_s}E_{q_1}&=E_{q_1}^{-1}E_{q_1}=I\\ &\vdots\\ E_{p_1} \cdots E_{p_s}E_{q_1} \cdots E_{q_t}&=I\\ PQ&=I\\ BA&=I\,. \end{align}$$

(Note that at the end here, although I chose to use the standard inverse notation, it really is enough that such a matrix exists, which follows from theorem 2 and theorem 9 alone - no need to really "know" about inverse matrices yet. You could, if you so chose, just use theorem 9 to rewrite $QP=I$ in terms of elementary row operations, and then just use theorem 2 directly without ever mentioning inverse matrices.)
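The factorization step this argument relies on (an invertible matrix reduces to $I$ by left-multiplication with elementary matrices) can be illustrated in code. This is just a sketch: the example matrix $B$ and the particular sequence of row operations are my own choices, not from the book.

```python
from fractions import Fraction

def mul(X, Y):
    # 2x2 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

I2 = [[Fraction(1), Fraction(0)], [Fraction(0), Fraction(1)]]  # identity

def scale(i, c):
    # elementary matrix: multiply row i by c (c != 0)
    E = [row[:] for row in I2]
    E[i][i] = c
    return E

def add_row(i, j, c):
    # elementary matrix: add c times row j to row i
    E = [row[:] for row in I2]
    E[i][j] = c
    return E

# An arbitrary invertible example matrix (my choice)
B = [[Fraction(2), Fraction(1)], [Fraction(4), Fraction(3)]]

# Apply elementary matrices on the left until B is reduced to I
M = B
for E in (scale(0, Fraction(1, 2)),        # R1 <- (1/2) R1
          add_row(1, 0, Fraction(-4)),     # R2 <- R2 - 4 R1
          add_row(0, 1, Fraction(-1, 2))): # R1 <- R1 - (1/2) R2
    M = mul(E, M)

print(M == I2)  # True: B is row-equivalent to I via elementary matrices
```

Since each recorded elementary matrix has an elementary inverse, reading the product backwards writes $B$ itself as a product of elementary matrices, exactly as in the proof.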

mboratko

$(BA)B = B(AB) = BI = B$. Thus, $(BA - I)B = 0$. Suppose there exists $X\neq O$ such that $XB = 0$; then there is a non-trivial linear combination of the rows of $B$ equal to zero, so the rows of $B$ are linearly dependent, say $B = \begin{pmatrix} a & b \\ ka & kb \end{pmatrix}$ (after possibly swapping rows). Then every linear combination of the rows of $B$ is a multiple of $(a,b)$. However, $(1,0)$ and $(0,1)$ cannot simultaneously be multiples of $(a,b)$, so no matrix $A$ can satisfy $AB = I$. The contradiction results from our assumption on the existence of $X$. Therefore $X$ must be $O$ whenever $XB=0$; taking $X = BA - I$ gives $BA = I$.
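The row-dependence step can even be checked by brute force over a small grid. The sketch below fixes a $B$ with proportional rows (the values of $a$, $b$, $k$ and the search range are arbitrary choices of mine) and verifies that every row of $AB$ stays proportional to $(a,b)$, so $AB$ is never $I$:

```python
from fractions import Fraction
from itertools import product

def mul(X, Y):
    # 2x2 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# B with linearly dependent rows: second row is k times the first
# (a, b, k are arbitrary illustrative values)
a, b, k = Fraction(3), Fraction(5), Fraction(2)
B = [[a, b], [k * a, k * b]]

I2 = [[Fraction(1), Fraction(0)], [Fraction(0), Fraction(1)]]

# Check all 2x2 integer matrices A with entries in {-2, ..., 2}
violations = 0
for e in product(range(-2, 3), repeat=4):
    A = [[Fraction(e[0]), Fraction(e[1])], [Fraction(e[2]), Fraction(e[3])]]
    P = mul(A, B)
    rows_proportional = all(row[0] * b == row[1] * a for row in P)
    if not rows_proportional or P == I2:
        violations += 1
print(violations)  # 0
```

Of course the finite search proves nothing by itself; it only illustrates the algebraic fact that each row of $AB$ is $(A_{i1}+kA_{i2})\cdot(a,b)$.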

Bo Liu

$AB= I$, so $\det(AB) = \det(A)\det(B) = \det(I) = 1$. Hence $\det(B)\neq 0$, so $B$ is invertible.

Now let $BA= C$. Then $BAB= CB$, which gives $B= CB$. Multiplying on the right by $B^{-1}$ yields $BB^{-1} = CBB^{-1}$, that is, $I = C$, so $BA = I$.
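The multiplicativity of $\det$ used above is easy to spot-check for $2\times 2$ matrices. A minimal sketch, with arbitrary integer example matrices of my choosing:

```python
def det2(M):
    # determinant of a 2x2 matrix
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def mul(X, Y):
    # 2x2 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Arbitrary integer example matrices (my choice)
A = [[2, 3], [1, 4]]
B = [[0, 1], [5, -2]]

print(det2(mul(A, B)) == det2(A) * det2(B))  # True
```

(One check is not a proof, of course; for $2\times 2$ matrices the identity $\det(AB)=\det(A)\det(B)$ can also be verified by expanding both sides symbolically.)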

zapkm

Here we explain how to derive the entries of $B$ from the entries of $A$. As the book explains at the beginning of page 19 (section 1.5), if we let $B_1$ and $B_2$ denote the first and second columns of the matrix $B$, then one can write $AB = [AB_1,AB_2]$. Hence $AB = I$ if and only if $AB_1=\left[\begin{array}{c} 1\\0\\\end{array}\right]$ and $AB_2=\left[\begin{array}{c} 0\\1\\\end{array}\right]$. So we know that the two systems of two linear equations in two unknowns, $AX=\left[\begin{array}{c} 1\\0\\\end{array}\right]$ and $AY=\left[\begin{array}{c} 0\\1\\\end{array}\right]$, both have solutions. We would like to prove that the only solutions are $B_1$ and $B_2$, and to find these solutions in terms of the entries of $A$.

If we let $A= \left[ \begin{array}{cc} a & b \\ c & d \end{array} \right]$, then the two systems can be written as the augmented matrices $\left[ \begin{array}{ccc} a & b & 1\\ c & d & 0 \end{array} \right]$ and $\left[ \begin{array}{ccc} a & b & 0\\ c & d & 1 \end{array} \right]$. Since both systems have solutions ($B_1$ and $B_2$), the systems $\left[\begin{array}{ccc} ad-bc & 0 & d\\ 0 & ad-bc & -c \end{array}\right]$ and $\left[ \begin{array}{ccc} ad-bc & 0 & -b\\ 0 & ad-bc & a \end{array} \right]$, whose equations are linear combinations of the original equations, also have solutions (section 1.2 of the book). Now, based on the explanation on page 14, which gives the conditions under which non-homogeneous systems of equations have solutions, if $ad-bc=0$ then $a=b=c=d=0$. This is a contradiction, since if $A=0$ then $AB=0B=0\neq I$. So we may assume that $ad-bc \neq 0$. This immediately shows that the two systems have unique solutions, so $B$ can only be of the form $$\left[ \begin{array}{cc} \frac{d}{ad-bc} & \frac{-b}{ad-bc} \\ \frac{-c}{ad-bc} & \frac{a}{ad-bc} \end{array} \right].$$

Now using this form for $B$, one can verify that $BA = I$.


$AB = I$ implies that $ABAB = I$ and $AABB = A(I)B = AB = I$, hence $ABAB = AABB$. Since $\det(AB) = \det(A)\det(B) = 1$, the determinant of $A$ and the determinant of $B$ are units, so that $A$ and $B$ have inverses (using the adjoint matrix thing), hence $AABB = ABAB$ implies $BA=AB = I$.

Hope that helps,

Note : This proof works when $\mathbb F$ is an arbitrary commutative ring with unity. You didn't specify what $\mathbb F$ was so that I'm stating in which generality this proof holds.

  • I don't know why you specify $2 \times 2$ matrix, but perhaps it is for explicitly using the adjoint formula for the inverse. – Patrick Da Silva Feb 21 '12 at 19:33
  • Because I cannot use $\det$ here. I can use very little indeed. Determinants will be defined in the chapter $5$. I am trying to solve the exercises by using what I have. Thanks! –  Feb 21 '12 at 19:44
  • Hmm. Then perhaps you should've mentioned that, it would have made my answer different. – Patrick Da Silva Feb 21 '12 at 19:44
  • I did! I wrote: I am supposed to use only elementary matrices to solve this question. –  Feb 21 '12 at 19:46
  • Then perhaps I should've read better. =P – Patrick Da Silva Feb 21 '12 at 19:54
  • If you are already using that the determinant of $A$ and $B$ being a unit implies the matrices have inverses, then why bother with the rest of the first paragraph? If $A$ has an inverse $A^{-1}$, and $AB=I$, then $B=IB = A^{-1}AB = A^{-1}I = A^{-1}$, and hence by definition of "inverse" you conclude $BA=A^{-1}A = I$. – Arturo Magidin Feb 21 '12 at 21:49
  • Because I am stuck in proofs involving integral domains so I took the habit of not using $A^{-1}$ where I don't need to. In this context this habit is not useful, but, oh well... I guess I should start considering units one way. – Patrick Da Silva Feb 22 '12 at 01:57
  • A downvote? Please. There's nothing wrong about this answer. If no one starts talking I'm gonna stop answering on this site just because I'm gonna be pissed. – Patrick Da Silva Feb 22 '12 at 07:01
  • The downvote was from me. Though what you have written is mathematically correct, it does not answer the OP's question as stated. – Alf Feb 22 '12 at 21:20
  • Thanks for the comment. You should learn though to comment before downvoting and explain why you're downvoting. If you don't it just gets people angry and your downvote is pointless. – Patrick Da Silva Feb 23 '12 at 04:11