
Suppose I have the following.

$$A= \begin{bmatrix} 1 & -2 & 0 & 3 & 2 \\ 2 & -4 & 1 & 2 & 5 \\ 1 & -2 & 1 & -1 & 3 \\ 3 & -6 & 2 & 1 & 8 \\ \end{bmatrix}\longrightarrow R=\begin{bmatrix} 1 & -2 & 0 & 3 & 2 \\ 0 & 0 & 1 & -4 & 1 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ \end{bmatrix} .$$

I can see columns 1 and 3 of R must be linearly independent. But how does this imply that the first and third columns of $A$ are linearly independent if row operations don't preserve the column space?

I know that if we reduce $A^{T}$ to upper triangular form we obtain $$ \begin{bmatrix} 1 & 2 & 1 &3 \\ 0 & 1 & 1&2\\ 0 & 0 &0&0\\ 0 & 0 &0&0\\ 0 & 0 &0&0\\ \end{bmatrix} $$ which implies that columns 1 and 3 of $A$ are independent. This works because row operations on $A^{T}$ preserve its row space, which is the column space of $A$; but it doesn't show how the RREF of $A$ itself preserves linear dependence relations among the columns.
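(For anyone who wants to check the arithmetic: a small sympy sketch, assuming sympy is installed, that reproduces $R$ and reports the pivot columns.)

```python
import sympy as sp

A = sp.Matrix([
    [1, -2, 0,  3, 2],
    [2, -4, 1,  2, 5],
    [1, -2, 1, -1, 3],
    [3, -6, 2,  1, 8],
])

# rref() returns the reduced row echelon form and the 0-based pivot columns
R, pivots = A.rref()
print(R)
print(pivots)  # (0, 2), i.e. columns 1 and 3 in 1-based numbering
```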

Alex D
  • 1,214
  • Row operations don’t preserve the column space, but they do preserve the dimension of the column space. That’s because row operations preserve the row space, and the dimension of the row space equals the dimension of the column space. – symplectomorphic Oct 24 '18 at 20:39
  • See https://math.stackexchange.com/questions/332908/looking-for-an-intuitive-explanation-why-the-row-rank-is-equal-to-the-column-ran and https://ocw.mit.edu/courses/mathematics/18-701-algebra-i-fall-2010/study-materials/MIT18_701F10_rrk_crk.pdf – symplectomorphic Oct 24 '18 at 20:43

1 Answer


If $R$ is the RREF of the matrix $A$, then you can write $$ R=FA $$ where $F$ is invertible. This is one of the main points of row reduction. Now write $A=\begin{bmatrix} a_1 & a_2 & \dots & a_n \end{bmatrix}$ and $R=\begin{bmatrix} r_1 & r_2 & \dots & r_n \end{bmatrix}$ (where $a_i$ and $r_i$ denote the columns of $A$ and $R$ respectively). Then, by the definition of matrix multiplication, $$ r_i=Fa_i \qquad (i=1,2,\dots,n) $$

Suppose a column of $A$ can be written as a linear combination of other columns of $A$: $$ a_j=\alpha_1a_{i_1}+\dots+\alpha_ka_{i_k} $$ Then $$ r_j=Fa_j=F(\alpha_1a_{i_1}+\dots+\alpha_ka_{i_k})=\alpha_1Fa_{i_1}+\dots+\alpha_kFa_{i_k}=\alpha_1r_{i_1}+\dots+\alpha_kr_{i_k} $$ Conversely, you can pass from a linear relation among the columns of $R$ to the same relation among the corresponding columns of $A$ by applying $F^{-1}$.
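As a concrete check (a sympy sketch using the matrices above): one invertible $F$ with $FA = R$ can be read off by row-reducing the augmented matrix $[A \mid I]$, since the right block records the row operations performed on the left block.

```python
import sympy as sp

A = sp.Matrix([
    [1, -2, 0,  3, 2],
    [2, -4, 1,  2, 5],
    [1, -2, 1, -1, 3],
    [3, -6, 2,  1, 8],
])

# Row-reduce [A | I]; the left block is R and the right block
# is an invertible F recording the row operations, so F*A = R.
aug, _ = A.row_join(sp.eye(4)).rref()
R, F = aug[:, :5], aug[:, 5:]

assert F * A == R
assert F.det() != 0  # F is invertible
```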

Since a set of vectors is linearly dependent if and only if one of the vectors is a linear combination of the others, it follows that a set of columns of $A$ is linearly independent if and only if the corresponding set of columns of $R$ is linearly independent.

Since the pivot columns in $R$ form a maximal linearly independent subset, the same holds for the corresponding columns of $A$.

We have even more: the entries of a nonpivot column of $R$ are precisely the coefficients that express it as a linear combination of the pivot columns, so we also know how to express each column of $A$ as a linear combination of the columns of $A$ corresponding to the pivot columns.

In your case, we have $r_4=3r_1-4r_3$, so also $a_4=3a_1-4a_3$ as you can verify directly.
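The verification can be sketched in sympy as well (same matrices as above; column indices here are 1-based to match the prose). The last column obeys the analogous relation $a_5 = 2a_1 + a_3$, read off directly from the entries of $r_5$.

```python
import sympy as sp

A = sp.Matrix([
    [1, -2, 0,  3, 2],
    [2, -4, 1,  2, 5],
    [1, -2, 1, -1, 3],
    [3, -6, 2,  1, 8],
])
R, _ = A.rref()

# 1-based column accessors for readability
a = lambda j: A[:, j - 1]
r = lambda j: R[:, j - 1]

assert r(4) == 3*r(1) - 4*r(3)  # relation among columns of R
assert a(4) == 3*a(1) - 4*a(3)  # the same relation holds in A
assert a(5) == 2*a(1) + a(3)    # read off from the entries of r(5)
```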

egreg
  • 238,574