Say I have $3$ vectors in $\mathbb R^2$: $(-2,1), (1,3), (2,4)$. What do I have to do to show that these $3$ vectors are linearly independent or dependent? Thanks
-
Do you recall what the definition of linear independence is? – Plopperzz Oct 16 '16 at 04:41
-
Try to write $(0,0)$ (that's the zero vector) as a linear combination of your given 3 vectors. You will see that this can be done in a non-trivial way. Conclusion? – imranfat Oct 16 '16 at 04:42
-
You want to set a linear combination of them equal to the zero vector and see whether there is a solution whose coefficients are not all zero. So take $x(-2,1) + y(1,3) + z(2,4) = (0,0)$ and solve the system of equations (written out below). If the only solution is $(x,y,z)=(0,0,0)$ then the vectors are linearly independent; otherwise they are dependent. It should be obvious, though, that any three vectors in $\Bbb{R}^2$ will be linearly dependent. – Mike Oct 16 '16 at 04:43
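Written out for the question's three vectors with coefficients $x,y,z$ (a small expansion of the computation the last comment suggests), the homogeneous system is
$$\begin{cases}-2x+y+2z=0\\x+3y+4z=0\end{cases}$$
Two equations in three unknowns always admit a nonzero solution, which is why any three vectors in $\mathbb{R}^2$ are linearly dependent.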
3 Answers
For linear independence you have to show that if $a(-2,1)+b(1,3)+c(2,4)=(0,0)$ then $a=b=c=0$. But, in this particular case we have $\dim\mathbb{R}^2=2$, and if these three vectors were linearly independent then you could extend them to form a basis, let's say $B$, so $\mathrm{card}(B)\geq 3$, which contradicts $\dim\mathbb{R}^2=2$.
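To make the dependence concrete for the question's vectors (a quick check, not part of the original answer): solving $a(-2,1)+b(1,3)+c(2,4)=(0,0)$ gives, for example, $a=2$, $b=-10$, $c=7$:
$$2(-2,1)-10(1,3)+7(2,4)=(-4-10+14,\;2-30+28)=(0,0),$$
so the three vectors are indeed linearly dependent.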
You do not need to do anything: in $\mathbb{R}^n$, any $k > n$ vectors are linearly dependent, i.e. at least one of them is a linear combination of the others. Here $k=3>2=n$.

Let $v_1,v_2,v_3\in\mathbb{R}^2$ be any three arbitrary vectors.
Define a matrix $A\in\mathbb{R}^{2\times 3}$ as follows.
$$A:=\begin{bmatrix} \vert & \vert & \vert \\ v_1 & v_2 & v_3 \\ \vert & \vert & \vert \end{bmatrix}$$
Let $\Psi(A)$ denote the rank of the matrix $A$. We have $\Psi(A)\le\min\{2,3\}=2$. By the rank–nullity theorem, $\dim \mathscr{N}(A)+\dim \mathscr{R}(A)=3$, where $\mathscr{N}(A)$ and $\mathscr{R}(A)$ denote the null space and row space of $A$ respectively. Using the fact that $\dim \mathscr{R}(A)=\Psi(A)$, we may write $\dim \mathscr{N}(A)+\Psi(A)=3$, and hence $\dim \mathscr{N}(A)=3-\Psi(A)$. Now because $\Psi(A)\leq 2$, we have $\dim \mathscr{N}(A)\ge 3-2=1$; in other words, the dimension of the null space of $A$ is at least $1$.

Therefore $x\in\mathscr{N}(A)$ for some $x\neq\mathbf{0}$, and hence $Ax=\mathbf{0}$ for some $x\neq\mathbf{0}$. But $Ax$ is a linear combination of the column vectors $v_1,v_2,v_3$, so there exists a nontrivial combination of $v_1,v_2,v_3$ that gives $\mathbf{0}$: there exist $k_1,k_2,k_3$, not all $0$, such that $\sum_{j} k_j v_j=\mathbf{0}$. Thus, by the definition of linear dependence, the vectors $v_1,v_2,v_3$ are linearly dependent.
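As a numerical illustration of this argument (a minimal sketch using NumPy/SciPy; the variable names are mine, not from the answer), one can compute the rank and a null-space vector for the matrix built from the question's vectors:

```python
import numpy as np
from scipy.linalg import null_space

# Columns of A are the vectors from the question.
A = np.array([[-2, 1, 2],
              [ 1, 3, 4]])

rank = np.linalg.matrix_rank(A)   # at most min(2, 3) = 2
kernel = null_space(A)            # basis of N(A); dimension = 3 - rank >= 1

print(rank)                       # 2
x = kernel[:, 0]                  # a nonzero x with A @ x = 0
print(np.allclose(A @ x, 0))      # True: a nontrivial dependence among the columns
```

For these particular vectors the rank comes out to $2$ and the kernel is one-dimensional, matching the bound $\dim\mathscr{N}(A)\ge 1$ above.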
