I ran into a problem this morning.

Suppose that there exists a full-rank matrix $X = \begin{bmatrix} a & b & c \\ b & c & b\\ c & d & a \end{bmatrix} \in \mathbb{R}^{3\times3}$. From the definition of linear independence, we know that if the matrix is not full rank, a relationship of the form $k_1X_1 + k_2X_2 = X_3$ would exist, where $X_1$, $X_2$, and $X_3$ are the columns of $X$ and $k_1, k_2 \in \mathbb{R}\setminus\{0\}$.

If $k_1$ and $k_2$ exist, then:
$$\begin{bmatrix} X_1 & X_2\end{bmatrix} \begin{bmatrix} k_1 \\ k_2\end{bmatrix} = X_3$$
$$\begin{bmatrix} k_1 \\ k_2\end{bmatrix} = \operatorname{pinv}\left(\begin{bmatrix} X_1 & X_2\end{bmatrix}\right) X_3$$
$$\begin{bmatrix} k_1 \\ k_2\end{bmatrix} = \left(\begin{bmatrix} X_1 & X_2\end{bmatrix}^\top \begin{bmatrix} X_1 & X_2\end{bmatrix}\right)^{-1}\begin{bmatrix} X_1 & X_2\end{bmatrix}^\top X_3$$

I tried some simulations in MATLAB: $k_1$ and $k_2$ do exist, but the first equation is not satisfied.
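
A minimal sketch of the kind of simulation I ran (the concrete values of $a$, $b$, $c$, $d$ below are arbitrary placeholders, since I have not fixed them above):

```matlab
% Minimal sketch of the simulation; a, b, c, d are arbitrary placeholder
% values (none were given), chosen so that X is full rank.
a = 1; b = 2; c = 3; d = 5;
X = [a b c; b c b; c d a];   % det(X) = 4, so X is full rank

A  = X(:, 1:2);              % [X1 X2]
X3 = X(:, 3);

k = pinv(A) * X3;            % candidate [k1; k2]
disp(k)                      % k1 and k2 come out nonzero...
disp(norm(A*k - X3))         % ...yet the residual is nonzero too
```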

So I am very confused. I must be wrong somewhere, but where? Could anyone tell me?

2 Answers


It seems you have solved a least squares problem, finding the projection of $X_3$ onto $\operatorname{span}(X_1,X_2)$.

Refer to the related question "Is the pseudoinverse matrix the solution to the least squares problem?"
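
To make this concrete, here is a minimal MATLAB sketch (reusing arbitrary placeholder values for $a$, $b$, $c$, $d$): the pseudoinverse solves the projected system exactly, not the original one.

```matlab
% Sketch: pinv gives the exact solution of A*k = Xbar3, where Xbar3 is
% the projection of X3 onto span(X1, X2); a, b, c, d are arbitrary.
a = 1; b = 2; c = 3; d = 5;
X  = [a b c; b c b; c d a];
A  = X(:, 1:2);
X3 = X(:, 3);

k     = pinv(A) * X3;   % least squares solution
Xbar3 = A * k;          % projection of X3 onto span(X1, X2)

norm(A*k - Xbar3)       % essentially zero: the projected system is solved
norm(A*k - X3)          % nonzero: the original system has no solution
```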

user
  • Yes, this is exactly the least squares problem. But why do $k_1$ and $k_2$ exist? That violates the definition of linear independence. – stander Qiu Sep 21 '18 at 08:01
  • $k_1$ and $k_2$ are the solutions of the least squares problem which always has a unique solution since $X_1$ and $X_2$ are linearly independent. – user Sep 21 '18 at 08:28
  • I mean that $k_1$ and $k_2$ are not zero. If $X_1$ and $X_2$ are linearly independent, the coefficients should be zero. – stander Qiu Sep 21 '18 at 08:55
  • That is not true for a least squares problem. We are solving $k_1X_1+k_2X_2=\bar X_3$, where $\bar X_3$ is the projection of $X_3$ onto $\operatorname{span}(X_1,X_2)$, and therefore in general $k_1$ and $k_2$ are not zero unless $X_3$ is orthogonal to $\operatorname{span}(X_1,X_2)$. – user Sep 21 '18 at 10:33
  • I see; I understand my mistake. One more question: how can I tell whether a matrix is full rank or not from its matrix form? – stander Qiu Sep 21 '18 at 12:41
  • @standerQiu As always, by $\det(A)$ for "small" matrices or by RREF; see the sketch after these comments. – user Sep 21 '18 at 12:53
  • I see. Thank you very much! – stander Qiu Sep 22 '18 at 06:25
  • @standerQiu You are welcome! Bye – user Sep 22 '18 at 06:50
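
A short MATLAB sketch of the checks suggested in the comments above, applied to the same example matrix (arbitrary placeholder values again):

```matlab
% Sketch of the rank checks suggested above (det for small matrices,
% RREF in general); a, b, c, d are the same arbitrary placeholders.
a = 1; b = 2; c = 3; d = 5;
X = [a b c; b c b; c d a];

det(X)    % nonzero   => full rank (practical only for small matrices)
rank(X)   % returns 3 => full rank
rref(X)   % identity  => the columns are linearly independent
```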

Your characterization of linear dependence is incorrect. The columns of $X$ are linearly dependent iff there is a nontrivial linear combination $k_1X_1+k_2X_2+k_3X_3$ that vanishes, but this does not let you conclude that $k_1X_1+k_2X_2=X_3$. Consider, for example, $$X=\begin{bmatrix}1&0&0\\0&0&0\\0&0&1\end{bmatrix}.$$ The columns are linearly dependent (every set that contains the zero vector is), but the third column is obviously not a scalar multiple of the first.

If the columns are linearly dependent, then you know that at least one of them is a linear combination of the other two, but you can’t a priori decide which one that is.
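
For a numerical sanity check of the counterexample, a short MATLAB sketch:

```matlab
% Sketch verifying the counterexample: the columns of X are linearly
% dependent, yet the third column is not a scalar multiple of the first.
X = [1 0 0; 0 0 0; 0 0 1];

rank(X)                   % 2 < 3 => the columns are linearly dependent
rank([X(:,1), X(:,3)])    % 2     => X3 does not lie in span(X1)
```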

amd
  • Why the downvote, o drive-by voter? – amd Sep 21 '18 at 07:58
  • That $k_1X_1+k_2X_2+k_3X_3$ vanishes means that $k_1X_1+k_2X_2 = X_3$ (after rescaling the coefficients). – stander Qiu Sep 21 '18 at 08:04
  • @standerQiu No, it doesn’t. Take another look at the simple counterexample above. What value of $k_1$ gives $k_1(1,0,0)=(0,0,1)$? Your “scaling” in that example divides by zero. You’re making the unwarranted assumption that $k_3\ne0$. – amd Sep 22 '18 at 21:15