I know that this might be a pedantic question, but my course materials have always referred to the orthogonal group as the set $O_n = \{Q \in \mathbb{R}^{n\times n}\mid Q^TQ = I\}$, while also (at least at first) giving the verbal definition: "the set of matrices whose inverse is their transpose". Most online materials that I've encountered require the set to be $O_n = \{Q \in \mathbb{R}^{n\times n}\mid Q^TQ = QQ^T = I\}$, which is understandable, since the orthogonal group is by definition the set of matrices whose inverse is their transpose. What I'm wondering is how one could (or whether one even can) show that $QQ^T = I$, since strictly speaking $Q^TQ = I$ only reads as "$Q$'s transpose is its left inverse and $Q$ is its transpose's right inverse".
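To spell out the gap in symbols: what is given and what apparently still needs an argument are

$$\underbrace{Q^TQ = I}_{\text{given: } Q^T \text{ is a left inverse of } Q} \qquad \overset{?}{\Longrightarrow} \qquad \underbrace{QQ^T = I}_{\text{wanted: } Q^T \text{ is also a right inverse of } Q}.$$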
- If $A$ and $B$ are square matrices with $AB=I$, then we have $BA=I.$ Try a proof! (A sketch is spelled out after the comments below.) – Fred Mar 14 '22 at 08:41
- @Fred Does it follow from $\det(AB) = \det(A)\det(B) = 1 \implies \det(A)\neq 0\neq \det(B)$, which implies directly that $A = B^{-1}$? – Cartesian Bear Mar 14 '22 at 08:43
- I don't get your argument... do you claim that $\det(A)=\det(C)\implies A=C$? Anyway, if $AB=I$ and $A$ and $B$ are square matrices, then $A$ and $B$ are invertible and the inverse is unique. Therefore $BA=I$ must hold. – Surb Mar 14 '22 at 08:51
- @Surb I think that we have the same argument: I meant that $\det(AB) = \det(A)\det(B) = 1$ implies that both $A$ and $B$ have to be invertible, i.e. $A = B^{-1}$. – Cartesian Bear Mar 14 '22 at 09:52
- @Surb (and also Sick Series), if there's anything to prove here, then we can't use that argument. Given appropriate matrices $X$ and $Y$, saying that $XY$ is the identity matrix doesn't (immediately) say that either $X$ or $Y$ has an inverse and that their inverses are each other. Rather, it says $Y$ is a left-inverse of $X$ and $X$ is a right-inverse of $Y$. It requires proof to claim that "side-inverses" imply invertibility. – Git Gud Mar 14 '22 at 10:40
- @SickSeries Your question follows as an easy corollary of what you can find here. – Git Gud Mar 14 '22 at 10:41
- As pointed out in a comment above, the statement is a special case of the fact that $AB=I$ iff $BA=I$ in a finite-dimensional vector space. It doesn't hold in infinite-dimensional inner product spaces (classical counterexample: $LR=I$ but $RL\ne I$, where $L$ and $R=L^T$ denote the left and right shifts on sequences of real numbers with finitely many nonzero terms, and the inner product is the usual dot product). Hence any proof must somehow involve the dimension of the vector space. – user1551 Mar 14 '22 at 15:27
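A minimal sketch of the finite-dimensional argument the comments point to (the rank–nullity route is one standard choice, not necessarily the proof Fred intended): suppose $A, B \in \mathbb{R}^{n\times n}$ with $AB = I$. Then

$$Bx = 0 \implies x = Ix = (AB)x = A(Bx) = 0,$$

so $B$ is injective; by rank–nullity $B$ is also surjective, hence invertible, and

$$A = A(BB^{-1}) = (AB)B^{-1} = B^{-1}, \qquad\text{so}\qquad BA = B^{-1}B = I.$$

Taking $A = Q^T$ and $B = Q$ turns the given $Q^TQ = I$ into the wanted $QQ^T = I$.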
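And user1551's counterexample spelled out (the coordinate notation is mine): on the space of real sequences with finitely many nonzero terms, let

$$L(x_1, x_2, x_3, \dots) = (x_2, x_3, x_4, \dots), \qquad R(x_1, x_2, x_3, \dots) = (0, x_1, x_2, \dots).$$

Then $LR = I$, but $RL(x_1, x_2, x_3, \dots) = (0, x_2, x_3, \dots)$, which differs from the input whenever $x_1 \ne 0$. Moreover $\langle Rx, y\rangle = \sum_k x_k y_{k+1} = \langle x, Ly\rangle$, so $R = L^T$ with respect to the usual dot product. The injective-implies-surjective step in the sketch above is exactly what fails here, which is why any proof must use finite-dimensionality.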