How can one prove that a square matrix whose columns are linearly independent (for instance, a matrix whose columns form a basis of the space) is invertible?
2 Answers
Assuming you are talking about square matrices over a field $K$, the following applies. Any $n$ linearly independent elements of $K^n$ form a basis of $K^n$, since they span a subspace of dimension $n$. This applies in particular to the columns $C_1,\ldots,C_n$ of our matrix $A$. Since these columns form a basis, each column $E_j$ of the identity matrix $I_n$ can be expressed as a linear combination of them: $$ E_j=b_{1,j}C_1+b_{2,j}C_2+\cdots+b_{n,j}C_n \qquad\text{for $j=1,2,\ldots,n$.} $$ Collecting these $n$ equations into a single matrix equation gives $$ I_n=\begin{pmatrix} C_1&C_2&\ldots&C_n\end{pmatrix}\cdot\begin{pmatrix} b_{1,1}&b_{1,2}&\ldots&b_{1,n}\\ b_{2,1}&b_{2,2}&\ldots&b_{2,n}\\ \vdots&\vdots&\ddots&\vdots\\ b_{n,1}&b_{n,2}&\ldots&b_{n,n}\\ \end{pmatrix} =A\cdot B, $$ so $B$ is a right inverse of $A$; since $A$ is square, $B$ is then also a left inverse, and $A$ is invertible.
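As a numerical sanity check of this construction, here is a minimal NumPy sketch (the matrix $A$ below is a sample chosen for illustration, not part of the original answer): each column $E_j$ of the identity is solved for as a combination of the columns of $A$, and the coefficient vectors assemble into $B$ with $AB=I_n$.

```python
import numpy as np

# Sample matrix with linearly independent columns (an assumption for
# illustration; any matrix with independent columns works the same way).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

n = A.shape[0]
I = np.eye(n)

# Express each column E_j of the identity as a combination of the
# columns of A, as in the answer: solve A x = E_j for the coefficients.
B = np.column_stack([np.linalg.solve(A, I[:, j]) for j in range(n)])

# The coefficients assemble into B with A B = I; B A = I also holds,
# since left and right inverses coincide for square matrices.
assert np.allclose(A @ B, I)
assert np.allclose(B @ A, I)
```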

I'm having trouble understanding why this works in reverse. As in, if you reversed the A and B matrices, how do you guarantee the identity matrix? – Zhouster Feb 16 '15 at 11:41
@Zhouster: I don't understand your question. The matrix $B$ is defined by the first system of equations (the one with "for $j=1,2,\ldots,n$"), and this system written in matrix form gives $AB=I_n$. There is no such thing as reversing $A$ or $B$ here. If in fact you are asking why $BA=I$, this is a general fact for square matrices: http://math.stackexchange.com/q/3852/18880. – Marc van Leeuwen Feb 16 '15 at 12:29
Fix a basis $\{e_i\}$. The given matrix represents the linear operator that maps each $e_i$ to the $i$-th column of the matrix. Since the columns themselves constitute a new basis, you can build an operator which maps each column back to $e_i$. Its matrix is the inverse of the given matrix.
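A small NumPy check of this viewpoint, again with a sample matrix chosen only for illustration: $A$ sends $e_i$ to its $i$-th column, and $A^{-1}$ sends each column back to $e_i$.

```python
import numpy as np

# Sample invertible matrix; its columns form a new basis of R^2.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
n = A.shape[0]

# As a linear operator, A maps each standard basis vector e_i to
# the i-th column of A.
for i in range(n):
    e_i = np.eye(n)[:, i]
    assert np.allclose(A @ e_i, A[:, i])

# The operator sending each column back to e_i is represented by A^{-1}.
A_inv = np.linalg.inv(A)
for i in range(n):
    assert np.allclose(A_inv @ A[:, i], np.eye(n)[:, i])
```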

As one is working in $K^n$, one need not choose a basis; doing so complicates things unnecessarily. The standard basis will do fine. – Marc van Leeuwen Oct 17 '13 at 14:54