
I have come up with a proof that any 3 vectors, $\mathbf u$, $\mathbf v$, $\mathbf w$, are always linearly dependent in 2D. I'm a bit of a noob when it comes to linear algebra, so I'm wondering whether my proof is good or not. I know that questions like this have already been asked before (this and this), but I haven't come across a satisfactory answer for my specific proof. Also, many of the answers delve into matrices, which I have not learned about yet.

Proof

Let $a,b,c$ be scalars. We can set up the following equation to check for linear dependence: $$a\mathbf u+b\mathbf v+c\mathbf w=0\tag1$$ There are two cases to consider from this point.

Case $1$: $\mathbf u$ and $\mathbf v$ are linearly independent.
In this case, we can rewrite $(1)$ as follows: $$a\mathbf u+b\mathbf v=-c\mathbf w\tag2$$ Because $\mathbf u$ and $\mathbf v$ are linearly independent, any vector can be formed from a linear combination of $\mathbf u$ and $\mathbf v$. Because $a\mathbf u+b\mathbf v$ represents a linear combination of $\mathbf u$ and $\mathbf v$, the vector $-c\mathbf w$ can be formed no matter what value $c$ is. Setting $c$ to be non-zero ensures that at least one of the scalars is non-zero while satisfying $(2)$. Therefore, because there exists at least one solution that is not $a=b=c=0$, the 3 vectors are linearly dependent in this case.

Case $2$: $\mathbf u$ and $\mathbf v$ are linearly dependent. In this case, it follows that the equation $a\mathbf u+b\mathbf v=0$ is guaranteed to be satisfied by some scalars $a$ and $b$ that are not both zero. We can simply let $c=0$ to convert $(1)$ to $a\mathbf u+b\mathbf v=0$. Because at least one of $a$ or $b$ is non-zero, we have proven that there is at least one solution that is not $a=b=c=0$. Therefore, the 3 vectors are linearly dependent in this case.

With the two cases satisfied, the proof is complete.
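
(As a concrete sanity check with particular vectors, chosen only for illustration: in Case $1$, take $\mathbf u=(1,0)$, $\mathbf v=(0,1)$, $\mathbf w=(2,3)$; choosing $c=1$ gives $-c\mathbf w=(-2,-3)=-2\mathbf u-3\mathbf v$, so $$-2\mathbf u-3\mathbf v+1\cdot\mathbf w=\mathbf 0$$ is a non-trivial solution of $(1)$. In Case $2$, take $\mathbf u=(1,2)$, $\mathbf v=(2,4)$ and any $\mathbf w$; then $2\mathbf u-\mathbf v+0\cdot\mathbf w=\mathbf 0$. These examples only illustrate the claim for specific vectors and are not part of the general argument.)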

Aiden Chow
  • What, exactly, is $2D$? – Arctic Char Aug 13 '21 at 17:50
  • @ArcticChar Uh, is it not clear that it is 2 dimensions? If I put $\mathbb R^2$, would it be clearer? – Aiden Chow Aug 13 '21 at 17:52
  • @Aiden Chow try arguing by contradiction: suppose these three vectors were linearly independent; then the space would have dimension at least three. – lorenzo Aug 13 '21 at 17:52
  • @lorenzo I was specifically asking about the proof I provided, but I could try that way as well. – Aiden Chow Aug 13 '21 at 17:56
  • If by $2D$ you mean a 2-dimensional vector space, then you will need to show why, in case 1, $u$, $v$ span the vector space. – Arctic Char Aug 13 '21 at 18:16
  • @ArcticChar Are you saying that I have to supply a proof for this statement? "Because $\mathbf u$ and $\mathbf v$ are linearly independent, any vector can be formed from a linear combination of $\mathbf u$ and $\mathbf v$." – Aiden Chow Aug 13 '21 at 18:20
  • Yes. I guess my complaint is that I can't see how $2D$ is used in your attempt (this is the only place where it is used, I suppose). – Arctic Char Aug 13 '21 at 18:41
  • @ArcticChar Is my proof flawed? Do I need to rewrite it in any way? – Aiden Chow Aug 13 '21 at 19:00
  • Everything is fine except that I see no justification why "any vector can be formed from a linear combination of u and v". If you add a correct explanation your proof is correct. – Arctic Char Aug 13 '21 at 19:13
  • @ArcticChar I'm not sure how I can prove that. The teacher I'm learning from just said that if two vectors are linearly independent, they will be pointing in different directions. So scaling them accordingly, you will be able to reach all vectors. Is this enough of a proof, or do I need to be more rigorous? If I need to be more rigorous, how would I start with the proof (I have no idea)? – Aiden Chow Aug 13 '21 at 22:17

1 Answer


You say "Because $u$ and $v$ are linearly independent, any vector can be formed from a linear combination of $u$ and $v$." But it is easy to show that this statement is equivalent to what you are trying to prove, so you have committed circular reasoning.

I will show the equivalence now. To show the forward direction, assume that $$u, v \in \mathbb{R}^2 \text{ linearly independent } \implies \text{ every vector in } \mathbb{R}^2 \text{ is a linear combination of } u, v.$$ Let $u, v, w \in \mathbb{R}^2$ be arbitrary. We need to show that $u, v, w$ are linearly dependent. If $u$ and $v$ are linearly dependent, then we are done, so assume that $u, v$ are linearly independent. By our assumption, there exist $a, b \in \mathbb{R}$ such that $au + bv = w$. Thus $au + bv - w = 0$. Since this is a linear combination of $u, v, w$ with not all coefficients $0$, $u, v, w$ are linearly dependent.

To show the reverse direction, assume that $$u, v, w \in \mathbb{R}^2 \implies u, v, w \text{ are linearly dependent}.$$ Let $u, v \in \mathbb{R}^2$ be arbitrary linearly independent vectors. We need to show that every $w \in \mathbb{R}^2$ is a linear combination of $u$ and $v$. So let $w \in \mathbb{R}^2$ be arbitrary. By our assumption, $u, v, w$ are linearly dependent. Thus there exist $a, b, c \in \mathbb{R}$, not all zero, such that $au + bv + cw = 0$. Note that $c \neq 0$, since if $c = 0$, then one of $a, b$ would be nonzero, and that would mean $u, v$ are linearly dependent. Thus $w = -\frac{a}{c}u -\frac{b}{c}v$, finishing the proof.
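
If you want to repair your proof, the spanning statement therefore has to be justified on its own, without appealing to the dependence of three vectors. One possible sketch, using only coordinates and no matrices (you should check the algebra yourself): write $u = (u_1, u_2)$, $v = (v_1, v_2)$, $w = (w_1, w_2)$ and try to solve $au + bv = w$, i.e. $$au_1 + bv_1 = w_1, \qquad au_2 + bv_2 = w_2.$$ Multiplying the first equation by $v_2$, the second by $v_1$, and subtracting (and similarly eliminating $a$) gives $$a(u_1v_2 - u_2v_1) = w_1v_2 - w_2v_1, \qquad b(u_1v_2 - u_2v_1) = u_1w_2 - u_2w_1.$$ So it is enough to show that $D := u_1v_2 - u_2v_1 \neq 0$ whenever $u, v$ are linearly independent, since then $a = (w_1v_2 - w_2v_1)/D$ and $b = (u_1w_2 - u_2w_1)/D$ solve the system. Contrapositively: if $D = 0$ and $u = 0$, then $u, v$ are dependent; if $D = 0$ and, say, $u_1 \neq 0$, then $v = \frac{v_1}{u_1}u$ (the second coordinates agree precisely because $u_1v_2 = u_2v_1$), so again $u, v$ are dependent; the case $u_2 \neq 0$ is symmetric.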

Mason
  • "committed circular reasoning" was quite a dramatic way to phrase this – C Squared Aug 14 '21 at 05:05
  • Um, big noob at linear algebra here, I'm not too sure what $\operatorname{span}$ is, and I also don't see where the circular reasoning is. Could you elaborate a little more (maybe with simpler notation/terminology)? – Aiden Chow Aug 14 '21 at 05:52
  • @AidenChow I've removed all terminology and proved that your assumption is equivalent to what you are trying to prove. – Mason Aug 14 '21 at 18:08