
$\newcommand{\sp}{\operatorname{sp}}$

Let $V$ be a vector space over a field $F$, and let $A,B$ be two distinct, disjoint, nonempty sets of vectors from $V$.

If $\sp(A) \cup \sp(B)=\sp(A\cup B)$, then $A\cup B$ is linearly dependent.

I've started by saying that if $\sp(A) \cup \sp(B)=\sp(A\cup B)$ then $\sp(A) \subseteq \sp(B)$ or $\sp(B)\subseteq \sp(A)$.

Thus, if we assume WLOG that $\sp(A) \subseteq \sp(B)$, then taking $v_1 \in A$ and multiplying it by scalars $\alpha_1,\ldots,\alpha_k$ gives us a vector from $B$.

Thus $ A\cup B$ is linearly dependent.

I feel like this proof is not good enough, where is it failing?

Georgey
  • What is $sp$? Why WLOG? – Jan May 15 '13 at 12:56
  • I assume $A$ and $B$ are sets of vectors and $\sp$ is their linear span, right? However, you should really define the symbols you use (at least informally). – Elmar Zander May 15 '13 at 12:59
  • 2
    Think about this: the span of $A\cup B$ is a vector space. The union of the span of $A$ and the span of $B$ is a union of two vector spaces. Under what circumstances can the union of two vector spaces be a vector space? – Gerry Myerson May 15 '13 at 13:01
  • @Jan, edited, sorry for misinformation – Georgey May 15 '13 at 13:02
  • @ElmarZander you're right, I totally forgot to copy it. – Georgey May 15 '13 at 13:03
  • @GerryMyerson, add that comment as answer, please...or I'll steal it from you! +1 – DonAntonio May 15 '13 at 13:09
  • 1
    @Gerry, the union of spans is a vector space iff one is a subset of the other, right? – Georgey May 15 '13 at 13:10
  • @Don, I started typing it as an answer, then decided it was really more of a hint as to another way to do the problem, so I made it a comment. But you are welcome to post it as an answer. – Gerry Myerson May 15 '13 at 13:11
  • @Georgey Yes, in general if $W_1$ and $W_2$ are subspaces then $W_1 \cup W_2$ is a subspace iff $W_1 \subset W_2$ or $W_2 \subset W_1$. See my proof here: http://math.stackexchange.com/questions/71872/union-of-two-vector-subspaces-not-a-subspace/385597#385597 –  May 15 '13 at 13:24
  • @BryanUrízar I actually proved that in a former home exercise more or less like Gerry did there. But I'm having a hard time connecting it to this proof. Maybe because there we're just talking about vector spaces and now we're talking about spans which is a new and less clear definition for me. – Georgey May 15 '13 at 13:26
  • @Georgey Yes, I know what you mean. I'm also learning this material right now and I find myself looking back at the definitions very often because I confuse myself very easily. –  May 15 '13 at 14:45
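The subspace lemma invoked in these comments can be spelled out briefly; the following sketch is an editorial addition, not part of the original thread. Let $W_1, W_2$ be subspaces of $V$ and suppose $W_1 \cup W_2$ is a subspace, but neither $W_1 \subseteq W_2$ nor $W_2 \subseteq W_1$. Pick $u \in W_1 \setminus W_2$ and $w \in W_2 \setminus W_1$. Then $u + w$ lies in the subspace $W_1 \cup W_2$, so

$$u + w \in W_1 \quad \text{or} \quad u + w \in W_2.$$

If $u + w \in W_1$, then $w = (u+w) - u \in W_1$, a contradiction; symmetrically, $u + w \in W_2$ forces $u \in W_2$. Hence one of the two inclusions must hold.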

3 Answers

3

We use contraposition in this proof:

Suppose $A\cup B$ is linearly independent. Then the elements of $A$, say $\{\alpha_1,\ldots,\alpha_n\}$, together with the elements of $B$, say $\{\beta_1,\ldots,\beta_m\}$, are linearly independent, so $\alpha_1+\cdots+\alpha_n+\beta_1+\cdots+\beta_m \in \operatorname{span}(A\cup B)$, but this vector does not belong to $\operatorname{span}(A)\cup\operatorname{span}(B)$: if it lay in, say, $\operatorname{span}(A)$, then $\beta_1+\cdots+\beta_m$ would also lie in $\operatorname{span}(A)$, giving a nontrivial linear relation among the vectors of $A\cup B$ and contradicting independence.

Somaye
  • the $\alpha$'s and $\beta$'s are vectors right? – Georgey May 15 '13 at 13:12
  • The $\alpha$'s are vectors in $A$ and the $\beta$'s are vectors in $B$. – Somaye May 15 '13 at 13:20
  • and you're saying that the equation isn't true, am I getting you right? I'm not quite sure how to read your proof as you started with what is needed to be proven. – Georgey May 15 '13 at 13:22
  • When you want to prove an implication, you can prove its contrapositive instead. Here, if you prove that ($A\cup B$ is independent) implies ($\sp(A) \cup \sp(B)=\sp(A\cup B)$ does not hold), then you will have proved our statement. – Somaye May 15 '13 at 13:36
  • As I was saying, I had a hard time reading your proof, You could have mentioned that you're using contraposition. – Georgey May 15 '13 at 13:37
  • Excuse me, Georgey, you are right: it is contraposition. – Somaye May 15 '13 at 13:41
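For concreteness (an added illustration, not part of the original answer): in $V = \mathbb{R}^2$ take $A = \{e_1\}$ and $B = \{e_2\}$. Then $A \cup B$ is linearly independent, and indeed

$$e_1 + e_2 \in \operatorname{span}(A \cup B) \quad \text{but} \quad e_1 + e_2 \notin \operatorname{span}(A) \cup \operatorname{span}(B),$$

since $\operatorname{span}(A)$ is the $x$-axis and $\operatorname{span}(B)$ is the $y$-axis. So the span equality fails, exactly as the contrapositive requires.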
1

Here is a direct proof:

To prove that $A \cup B$ is linearly dependent I must show that there exists a finite number of distinct vectors $w_1, \ldots,w_n$ in $A \cup B$ and scalars $c_1,\ldots,c_n$ in $F$, not all zero, such that

$$ c_1w_1 + \dots + c_nw_n = 0.$$

As already mentioned, span$(A) \cup$ span$(B) =$ span$(A \cup B)$ implies span$(A) \subset$ span$(B)$ or span$(B) \subset$ span$(A)$. I'll assume the former.

Let $x \in$ span$(A)$ with $x \neq 0$. Then we can write $x = a_1v_1 + \dots + a_sv_s$ for some distinct $v_1,\dots,v_s$ in $A$ and scalars $a_1,\dots, a_s$ in $F$. By the inclusion above, we have that $x \in$ span$(B)$ and similarly, $x = b_1u_1 + \dots + b_tu_t$ for distinct $u_1,\ldots,u_t \in B$ and $b_1,\ldots, b_t \in F$. Note then that we have

$$0 = x - x = (a_1v_1 + \dots + a_sv_s) - (b_1u_1 + \dots + b_tu_t)$$

This is a linear combination of distinct vectors of $A \cup B$ (the $v_i$ and $u_j$ are distinct from one another because $A$ and $B$ are disjoint), and the scalars $a_1,\ldots,a_s, -b_1, \ldots, -b_t$ are not all zero because $x$ is non-zero. Therefore, $A \cup B$ is linearly dependent.
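A small instance of this argument (an added example, not in the original answer): in $\mathbb{R}^2$ take $A = \{(1,0)\}$ and $B = \{(2,0)\}$, which are disjoint, with span$(A) =$ span$(B)$ both equal to the $x$-axis, so the span condition holds. Taking $x = (2,0) \neq 0$ gives $x = 2\cdot(1,0)$ and $x = 1\cdot(2,0)$, so

$$0 = x - x = 2\cdot(1,0) - 1\cdot(2,0),$$

a dependence relation with nonzero coefficients among the distinct vectors of $A \cup B$.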

0

Your idea is good, but not expressed in the best way.

You can assume, as you did, $\def\sp{\mathrm{sp}}\sp(A)\subseteq\sp(B)$. If the only vector in $A$ is $0$, there is nothing to prove, because any set containing the zero vector is linearly dependent. Otherwise, take $v\in A$, $v\ne0$. Then you can write $$ v=\alpha_1w_1+\alpha_2w_2+\dots+\alpha_nw_n $$ for suitable $w_1,w_2,\dots,w_n\in B$; since $A$ and $B$ are disjoint, $v$ is distinct from the $w_i$, so this is a nontrivial dependence relation and $A\cup B$ is not linearly independent.

egreg
  • Why can you assume that $B$ is finite? –  May 15 '13 at 16:13
  • @BryanUrízar It's not really needed, it's easy to fix. – egreg May 15 '13 at 16:44
  • After reading some more today I came across this theorem: If $S_1 \subset S_2$ and $S_1$ is linearly dependent then $S_2$ is linearly dependent. So your answer before the edit was perfectly fine too. –  May 16 '13 at 14:04