
The question is self-explanatory, but as an example (this one is from Strang's lectures), take the following Markov matrix $M$:

$M=\begin{bmatrix} .1 & .01 & .3 \\ .2 & .99 & .3 \\ .7 & 0 & .4 \end{bmatrix}$

Here each column of $M$ sums to $1$, so each column of $A = M - I$ sums to $0$, which makes $A$ singular and its columns linearly dependent. In the lectures Strang mentions that this implies the rows of $A$ are also linearly dependent. Could someone explain why that is the case in general?
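
A quick numerical sanity check of this example (my own sketch, not from Strang's lecture; it assumes NumPy is installed):

```python
import numpy as np

M = np.array([[0.1, 0.01, 0.3],
              [0.2, 0.99, 0.3],
              [0.7, 0.00, 0.4]])

A = M - np.eye(3)

# Each column of M sums to 1, so each column of A sums to 0.
# Equivalently, the rows of A add up to the zero row.
print(A.sum(axis=0))               # ~[0. 0. 0.] up to rounding

# A is singular: rank 2 instead of 3, so its columns are dependent...
print(np.linalg.matrix_rank(A))    # 2

# ...and transposing shows its rows are dependent as well.
print(np.linalg.matrix_rank(A.T))  # 2
```

Of course, this only confirms the claim for this particular $M$; the question is why it holds in general.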

Bosco
  • This is a consequence of "row rank = column rank", which is not obvious: see https://math.stackexchange.com/questions/332908/looking-for-an-intuitive-explanation-why-the-row-rank-is-equal-to-the-column-ran . – Qiaochu Yuan Sep 22 '22 at 18:42
  • The statement as given in the title is false. For example, the following matrix has linearly independent rows, but linearly dependent columns: $\left(\begin{array}{ccc} 1&1&1\\ 2&2&1\end{array}\right)$. – Arturo Magidin Sep 22 '22 at 18:53
  • @ArturoMagidin Do you think that the statement would turn true if we restrict it to square matrices? – Bosco Sep 22 '22 at 18:55
  • @Bosco: I don't "think", I know. As Qiaochu says, it is a consequence of the fact that the column rank (the largest number of linearly independent columns) is equal to the row rank (the largest number of linearly independent rows) in any matrix. – Arturo Magidin Sep 22 '22 at 19:00
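
To make the comments concrete, here is a minimal sketch (my own, again assuming NumPy) checking Arturo Magidin's counterexample against the rank fact:

```python
import numpy as np

# Arturo Magidin's 2x3 counterexample: independent rows, dependent columns.
B = np.array([[1, 1, 1],
              [2, 2, 1]])

# Three columns in R^2 cannot be independent, yet the rank is still 2,
# so the two rows are independent: row rank = column rank = 2.
print(np.linalg.matrix_rank(B))    # 2
print(np.linalg.matrix_rank(B.T))  # 2
```

For a square $n \times n$ matrix, however, dependent columns mean the rank is less than $n$, and since row rank equals column rank, fewer than $n$ rows can be independent; that is, the rows must be dependent too, which settles the square case asked about above.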
