37

Question

Suppose we want to find a basis for the vector space $\{0\}$.

I know that the answer is that the only basis is the empty set.

Is this answer a definition itself, or is it a result of the definitions of linearly independent/dependent sets and spanning/generating sets? If it is a result, then would you mind mentioning the definitions of the bold items from which this answer can be deduced?

Useful Links

I found the links Link 1, Link 2, Link 3, Link 4, Link 5, and Link 6 useful for answering this question. It needs some elementary background from mathematical logic. You can learn it by spending a few hours on this Wikipedia page.

  • 7
    Here is a related question: How many nonzero vectors does the zero vector space have? (an element of a basis has to be a nonzero vector by definition) – Fabian Aug 31 '16 at 11:53
  • @Fabian: Would you provide the link please? :) – Hosein Rahnama Aug 31 '16 at 11:58
  • ...Link for what? – Fabian Aug 31 '16 at 11:58
  • @Fabian: Ah I thought you are referring to a question on MSE! :D Sorry! :) – Hosein Rahnama Aug 31 '16 at 11:59
  • It is a question for you: elements of a basis have to be nonzero vectors (this is either part of the definition or follows immediately from the linear independence property). How many nonzero vectors (which potentially could be part of the basis) does a zero vector space have? – Fabian Aug 31 '16 at 12:00
  • 1
    @H.R.: Related: http://math.stackexchange.com/questions/1812653 – Watson Aug 31 '16 at 12:02
  • @Fabian: Okay, the answer to your question is nothing, but does it justify $\{\}$ being a basis for $\{0\}$? I mean, by answering your question, what remains is to examine whether $\{\}$ is a basis or not! Am I right? :) – Hosein Rahnama Aug 31 '16 at 12:09
  • 1
    Maybe I did not understand your question well: my comment was just intended to indicate that only $\{\}$ can potentially be a basis. Of course, you then should show that it is indeed a basis (or you have already proven that any finite-dimensional vector space indeed has a basis). – Fabian Aug 31 '16 at 12:11
  • @Fabian: Of course, I thank you as your question helped me to understand it is the only basis! :) – Hosein Rahnama Aug 31 '16 at 12:13
  • 1
    Vacuous statements are true. You will see this type of argument occurring frequently, though it seems pointless to me. – IAmNoOne Aug 31 '16 at 13:12
  • @Nameless: Good point! Would you please add some answer on that and how you usually deal with them. :) – Hosein Rahnama Aug 31 '16 at 13:15
  • 1
    @H.R. http://math.stackexchange.com/questions/734418/what-precisely-is-a-vacuous-truth – IAmNoOne Aug 31 '16 at 13:21
  • Related: http://math.stackexchange.com/questions/664594/why-mathbf0-vector-has-dimension-zero – JP McCarthy Aug 31 '16 at 14:36

4 Answers

34

The standard definition of basis in vector spaces is:


$\mathcal B$ is a basis of a space $X$ if:

  • $\mathcal B$ is linearly independent.
  • The span of $\mathcal B$ is $X$.

You can easily show both of these statements are true when $X=\{0\}$ and $\mathcal B= \{\}$. Again, you have to look at the definitions:

  • Is $\{\}$ linearly independent? Well, a set $A$ is linearly independent if, for every nonempty finite subset $\{a_1,a_2\dots, a_n\}$, we have that if $$\alpha_1a_1 + \dots + \alpha_n a_n=0,$$ then $\alpha_i=0$ for all $i$. This condition is satisfied automaticall in the case of an empty set (everything follows from a false statement). This part may be difficult to understand, but since there is no nonempty finite collection of vectors from $\{\}$, any statement you say about nonempty finite collections of vectors from $\{\}$ must be true (because any such statement includes an assumption that a nonempty finite collection exists. It does not, meaning that any such statement is of the type $F\to A$ and is automatically true). This means $\{\}$ is linearly independent.

  • Is the span of $\{\}$ equal to $\{0\}$? Well, the span of a set $A\subseteq X$ is defined as the smallest vector subspace of $X$ that contains $A$. Since every vector subspace contains $\{\}$, it is clear that $\{0\}$, which is the smallest vector subspace of all, must be the span of $\{\}$.


Alternatively, the span of $A$ is the intersection of all vector subspaces that contain $A$. Again, it should be obvious that this implies that the span of $\{\}$ is $\{0\}$.
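The vacuous-truth step has a direct programming analogue: a universally quantified statement over an empty collection is true because there is nothing to check. A purely illustrative Python sketch (not part of the original answer):

```python
from itertools import combinations

A = set()  # the candidate basis: the empty set of vectors

# every nonempty finite subset of A -- for A = {} there are none at all
subsets = [S for r in range(1, len(A) + 1) for S in combinations(A, r)]
print(subsets)  # []

# a universally quantified statement over an empty collection is vacuously
# true: all() returns True here without ever evaluating its predicate
print(all(False for S in subsets))  # True
```

Note that `all(False for S in subsets)` returns `True` even though the predicate is literally `False`: the predicate is never evaluated, which is exactly the $F\to A$ situation described above.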

5xum
  • 123,496
  • 6
  • 128
  • 204
  • @H.R. Related: http://math.stackexchange.com/questions/185255/ – Watson Aug 31 '16 at 12:03
  • @H.R. I expanded my answer. – 5xum Aug 31 '16 at 12:03
  • 4
    On the linearly independent issue, I would have said that the sum of zero terms is defined as $0$, so that the only subset, $\{\}$, gives the equation $0 = 0$ and we don't have to deal with ugly false statements in a proof. – Lærne Aug 31 '16 at 12:07
  • @Lærne Well, that avoids one problem while introducing another. So sure, it's another way, but I wouldn't say its either better or worse. – 5xum Aug 31 '16 at 12:15
  • What other problem ? It may be arbitrary, but it makes sense on many, many other contexts. – Lærne Aug 31 '16 at 12:23
  • @Lærne The problem that you then have another definition ("empty sum") that is conceptually difficult to understand (just as much as proof from a false statements). – 5xum Aug 31 '16 at 12:25
  • 2
    @Lærne Don't get me wrong, both definitions work, but none of them is trivial, so it's a matter of opinion which is best. – 5xum Aug 31 '16 at 12:26
  • Both work, I do not deny that. In my experience, people tend to understand better that merging zero heaps of apples together gives you zero apples, compared to understanding that anything can be deduced from contradictions, but you may have a better analogy for the latter. Anyway, it's nit-picking, sorry for that. – Lærne Aug 31 '16 at 12:34
  • I understand why the span is $\{0\}$, but being linearly independent is a little confusing. I think most of the confusion is due to the logic behind the statement that we want to prove being true. I know a little about mathematical logic, so I would appreciate it if you expanded on that too. :) I think it is related to $\text{False} \to P$ being always true! :) – Hosein Rahnama Aug 31 '16 at 18:29
  • 2
    @H.R. Precisely. Since there is no finite collection of vectors from $\{\}$, any statement you say about finite collections of vectors from $\{\}$ must be true! – 5xum Aug 31 '16 at 18:32
  • @H.R. Well, I did, just in other words: "(everything follows from a false statement)" – 5xum Aug 31 '16 at 18:35
  • A little more elaboration maybe! :D I will be so thankful. :) – Hosein Rahnama Aug 31 '16 at 18:35
  • Why by the same logic we cannot say it is linearly dependent!? :) – Hosein Rahnama Aug 31 '16 at 23:33
  • 3
    @H.R. Because the statement "$A$ is linearly dependent" does not start with "for any nonempty subset of $A$". Rather, the statement starts with "there exists a nonempty subset of $A$", and such a statement is clearly false. – 5xum Sep 01 '16 at 08:55
  • In your answer, in the definition of linearly independent, I see finite instead of non-empty. Are these two equivalent? :) Maybe the definition should be every non-empty finite subset! right? :) – Hosein Rahnama Sep 01 '16 at 09:26
  • 1
    @H.R. No, in fact, it should say "finite non-empty". – 5xum Sep 01 '16 at 09:29
  • It is not necessary to say "non-empty". If you take the empty set of $a$'s, then vacuously each $\alpha$ is always $0$. – Eric Wofsey Sep 01 '16 at 09:37
  • @EricWofsey True. But it doesn't hurt to say non-empty either :). Matter of preference. – 5xum Sep 01 '16 at 09:38
  • I disagree--it does hurt a little, because it is less natural and can hinder in understanding concepts like "the set of linear combinations of elements of the empty set". More generally, it is not good to get into a habit of making exceptions for the empty set. – Eric Wofsey Sep 01 '16 at 09:41
  • 1
    As a side-note: I am uncomfortable when bases are modeled as sets of vectors, I strongly prefer families of vectors. Same goes for linearly independent families (vs. sets). Otherwise one has the awkward situation that the columns of a singular square matrix may be linearly independent! – Hagen von Eitzen Sep 01 '16 at 21:14
  • @HagenvonEitzen Agreed. It's much cleaner to talk about families. – 5xum Sep 02 '16 at 06:02
5

Definition 1. The span of a set of vectors $\{v_1,\ldots,v_m\}$ is the set of all linear combinations of $\{v_1,\ldots,v_m\}$. In other words, $$\text{span}\{v_1,\ldots,v_m\}=\{a_1v_1+\cdots+a_mv_m : a_1,\ldots,a_m\in\mathbb{F}\}.$$

This definition leaves out the case of $\{\}$: there is no vector to begin with! So we need to take care of that. But how do we define the span of $\{\}$? Do we define it to be $\{\}$? Or some arbitrary space? Here is the rationale for defining $\text{span}\{\}$ to be $\{0\}$:

Proposition. Let $V$ be a vector space. Let $S$ be a finite subset of $V$ that spans $V$. One can obtain a basis of $V$ by deleting elements from $S$.

Only with that convention does this proposition also work for $V=\{0\}$.

To summarize, when our definition of span is as in Definition 1, we want the following extra definitions

  1. The empty set is independent;
  2. The span of the empty set is the zero space $\{0\}$

for the above proposition to be true for $V=\{0\}$. As a consequence of our definition, the empty set is a basis for the zero vector space.

(Notes: My definition of linear independence is:

A set of vectors $\{v_1,\ldots,v_m\}$ is said to be linearly independent if the equation $a_1v_1+\cdots+a_mv_m=0$ always implies $a_1=\cdots=a_m=0$. Otherwise, it is said to be linearly dependent.

And I do not define the "empty sum", so that the case $\{\}$ is left undetermined. )


Definition 2. The span of a set of vectors $\{v_1,\ldots,v_m\}$ is the smallest vector space containing $v_1,\ldots,v_m$.

Under this definition, we indeed do not need to additionally define the span of $\{\}$, as @5xum pointed out.


Definition 1 is more common, since elements of the set $\text{span}\{v_1,\ldots,v_m\}$ are described explicitly. The drawback of Definition 2 is that you don't know what the elements in the span look like, and you need to prove that the span of $\{v_1,\ldots,v_m\}$ indeed consists of linear combinations of $v_1,\ldots,v_m$.
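Over a finite field, Definition 1 can even be checked by brute force, since the span is then a finite set. A Python sketch over $\mathbb{F}_2$ (an illustration only; it adopts the empty-sum-is-zero convention discussed in the comments, which this answer deliberately leaves undefined, and `span_f2` is a hypothetical helper, not a library function):

```python
from itertools import product

def span_f2(vectors, dim):
    """All linear combinations of `vectors` in F_2^dim, per Definition 1,
    with the empty sum taken to be the zero vector."""
    result = set()
    for coeffs in product((0, 1), repeat=len(vectors)):  # one scalar per vector
        # componentwise sum mod 2; an empty sum gives the zero vector
        combo = tuple(
            sum(a * v[i] for a, v in zip(coeffs, vectors)) % 2
            for i in range(dim)
        )
        result.add(combo)
    return result

print(span_f2([], dim=2))        # {(0, 0)} -- the span of {} is {0}
print(span_f2([(1, 0)], dim=2))  # contains (0, 0) and (1, 0)
```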

Fei Li
  • 1,316
  • 2
    I object to using the term "need to define". You don't need to define that. There are perfectly valid definitions of "independent" and "span" that directly imply that $\{\}$ is independent and that its span is $\{0\}$. – 5xum Aug 31 '16 at 12:32
  • Note that your definition of "span" is different from the one here. There is no way to deduce the implication, and you need to take care of the $\{\}$ case. – Fei Li Aug 31 '16 at 12:35
  • So the OP's question ultimately depends on the definition of span. – Fei Li Aug 31 '16 at 12:36
  • Thanks for the answer but I am not really OK with it because your definitions are based on lists while conclusions and the proposition are using sets! – Hosein Rahnama Aug 31 '16 at 13:38
  • Edited for consistency. – Fei Li Aug 31 '16 at 14:38
  • @FeiLi: Thanks. :) – Hosein Rahnama Aug 31 '16 at 16:35
  • Would you please add your definition of linear independence too? Because one may ask why "the empty set is independent" is considered an extra definition. Maybe, according to some definition, it is a result, not an extra definition. :) – Hosein Rahnama Sep 01 '16 at 09:42
  • 2
    Note that if one first defines linear dependence for a set of $m$ vectors, namely that there exist $m$ scalars, not all zero, such that the linear combination $\sum a_iv_i$ is zero, and otherwise defines the set to be linearly independent, then the linear independence of $\{\}$ would immediately follow. – Fei Li Sep 01 '16 at 11:50
  • 1
    I can perfectly see the reason for defining the empty sum to be zero. And note two things: (1) most linear algebra textbooks wouldn't bother to interrupt the discussion of linear algebra to talk about the "empty sum"; (2) if one defines linear dependence as in my above comment, and defines span as in Definition 2, then we can deduce that $\{\}$ is linearly independent and spans $\{0\}$, thus is a basis for $\{0\}$, without resorting to the empty sum. – Fei Li Sep 01 '16 at 11:51
  • For notifying somebody, you should comment below their answers or posts; otherwise, use @ and type their username without any spaces. :) I saw your comments by accident. :) Thanks for including the definition of linear independence. :) – Hosein Rahnama Sep 01 '16 at 20:35
  • About the approach, I do agree with you that excluding the definitions for $\{\}$ and the zero vector space $\{0\}$ is nice and easy, because intuition can accept them readily. :) However, it is always desirable to have just one definition for all of the cases. But if we want just one definition, we have to cope with concepts like the empty sum or vacuous truth. :) – Hosein Rahnama Sep 01 '16 at 20:54
  • Would you please take a look at this – Hosein Rahnama Sep 03 '16 at 10:42
2

A basis has several equivalent definitions. One of which is:

  1. A basis of a vector space is a minimal generating set

So keeping that in mind, if we look at $V = \{0\}$, the only non-empty subset of this vector space is $B = \{0\}$. This set $B$ is linearly dependent and thus cannot be a basis. We note here that $B$ is a generating set of $V$. We know, by the existence of a basis for every vector space, that $V$ must also have a basis, and any basis would have to be a subset of $B$. Since removing a linearly dependent element from a generating set does not change the span of that set, $\phi$ is a generating set.

Now the definition of a linearly dependent set in crude language is: In a vector space $V$, a subset $A$ of $V$ is said to be linearly dependent if there exists an element which can be written as a finite linear combination of the rest of the elements.

So, consider the set $\phi$; there does not exist any element in it which can be written as a finite linear combination of the other elements, hence it is not a linearly dependent set and therefore it is linearly independent.

Therefore we see that $\phi$ is linearly independent and generates $V$, and hence is a basis.
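The step about $\phi$ not being dependent is an existential claim failing over an empty collection; the programming analogue (a purely illustrative Python sketch, not part of the original answer) is `any()` over an empty iterable:

```python
A = []  # the empty set of vectors

# linear dependence is an existential claim: "there EXISTS an element of A
# expressible as a finite linear combination of the rest"; over the empty
# set nothing exists, so any() is False without evaluating its predicate
dependent = any(True for v in A)
print(dependent)  # False -> the empty set is not dependent, hence independent
```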

  • 1
    $\emptyset$ should be used for empty set, not $\phi$ , or you can use $\varnothing$ – john Jun 04 '21 at 04:28
-2

To answer the question of why $\text{Span}\{\}=\{0\}$ is true, I considered the following argument for myself.

I think it all comes back to the operation of addition.

Addition is a map defined as follows

$$ \begin{align} (\cdot+\cdot): & V \times V \to V \\ & (u,v) \mapsto (u+v) \end{align}$$

with the commutative and associative properties

$$ \begin{align} (u+v) &= (v+u) \\ ((u+v)+w) &= (u+(v+w)) \end{align} $$

so according to this definition, whenever we talk about addition we must provide two inputs to get one output.

From a programming point of view, it is useful to have outputs in the cases where we have one or no inputs (see the details of the Plus function in the Wolfram language). It also turns out to be useful in proofs, like the ones using induction. So what are the most useful definitions to make for such cases? Experience shows that these are

$$ \begin{align} (u+\text{null}) &= u\\ (\text{null}+u) &= u \\ (\text{null}+\text{null}) &= 0 \end{align} \tag{1}$$

where you can think of null as meaning that no argument is provided!

Now the following definition can be easily interpreted in the special cases when we have a set with one element or no element.

Linear Combination and Span. A linear combination of a set $A=\{v_1,v_2,...,v_m\} \subseteq V$ is a vector $v$ defined by $v=\sum_{j=1}^{m}a_jv_j$. The set of all linear combinations of $A$ is called the span of $A$ denoted by $\text{Span}A$.

Now, if we make the convention that $m=0$ means $A=\{\}$ and when $m=1$ then $A=\{v_1\}$, according to $(1)$, we can interpret the definition as follows

$$ v=\sum_{j=1}^{m}a_jv_j:=s_m, \qquad s_i = \begin{cases} 0, & i=0 \\ (a_1v_1+\text{null}), & i=1 \\ (a_1v_1+a_2v_2), & i=2 \\ (s_{i-1}+a_{i}v_{i}), & \text{otherwise} \\ \end{cases} , \qquad 0 \le i \le m $$

So, we can see that $\text{Span}\{\}=\{0\}$. Note that this is a result of our own convention for the addition operation and the definition of the span. I think that the vacuous truth argument has no advantage over this one! However, it keeps repeating in many other examples! So it is good to learn it once and for all!
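The recursion above (in the simplified form with $s_0 = 0$) is exactly a left fold with initial value $0$. A Python sketch, illustrative only, using scalars in place of vectors for brevity:

```python
from functools import reduce

def linear_combination(coeffs, vectors):
    """Left fold of pairwise '+' with initial value s_0 = 0, matching the
    recursion above; zero terms therefore yield 0 with no special cases."""
    terms = [a * v for a, v in zip(coeffs, vectors)]
    return reduce(lambda s, t: s + t, terms, 0)

print(linear_combination([], []))          # 0 -- so Span{} = {0}
print(linear_combination([2, 3], [1, 1]))  # 2*1 + 3*1 = 5
```

Starting the fold at $0$ means addition always acts on exactly two inputs, so no "null" arguments are ever needed.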

  • I would say this interpretation is... unwise. It is simpler to define a linear combination $v = \sum_{j=1}^m a_jv_j$ as $s_m$, where $s_i = \begin{cases}0 & \text{if $i=0$}, \\ s_{i-1}+a_iv_i & \text{otherwise.}\end{cases}$ –  Nov 06 '17 at 22:06
  • @Rahul: Thanks for the attention. :) You are right. :) You wrote the last equation more briefly and efficiently. :) – Hosein Rahnama Nov 06 '17 at 22:15
  • 1
    And this way you don't have to redefine vector addition to act on zero or one operands; we still always add exactly two vectors at a time :) –  Nov 06 '17 at 22:48
  • @Rahul: Indeed, my explanation after Eq.$(1)$ is exactly what you said. :) Maybe I wrote it down a little chatty and lengthy for providing simpler cases and motivation. – Hosein Rahnama Nov 06 '17 at 22:51