
In the 3rd edition of the book "The Linear Algebra a Beginning Graduate Student Ought to Know" by Jonathan S. Golan, we find the following exercise in Chapter 3:

[Image of the exercise: show that a vector space $V$ over $GF(p)$ cannot be written as the union of $p$ subspaces.]

First of all, let's interpret the term "subspace" as meaning a proper subspace (otherwise we could simply choose $V$ itself as the single subspace we need and voilà, all the elements are there and the proposition of the question is false).

Now, I have tried to prove this statement by contradiction, but have fallen short of completing the proof.

Summary of the proof attempt so far...

So, to begin, let's assume the opposite: $V$ is the union of at most $p$ proper subspaces. Since $p$ is finite, there must exist some minimal $k'$ for which such a cover exists. Correspondingly, denote a minimal (in number) and sufficient collection of subspaces by:

$$ S_1, S_2, \ldots, S_{k'} \quad \text{s.t.} \quad \bigcup_{i=1}^{k'} S_i = V $$

Note that by minimality, all the subspaces have to be non-degenerate (i.e., strictly larger than $\{0_V\}$). Now let:

$$\bigcup_{i=2}^{k'} S_i = S_{tail}$$

Clearly, $S_1$ must have at least one element that is not in $S_{tail}$, and vice versa. Otherwise, all elements of one side would be contained in the other, and the redundant subspace (or set of subspaces) could be dropped from the list, contradicting our assumption that $k'$ is the minimal number of subspaces needed to contain all the elements of $V$.

So pick an element on each side that is not found on the other: $s_1 \in S_1 \setminus S_{tail}$ and $s_{tail} \in S_{tail} \setminus S_1$, and consider their sum. The resulting $s_1 + s_{tail}$ cannot be in $S_1$, as otherwise the element $s_{tail}$ would be in $S_1$, leading to a contradiction. On the other hand, let $i$ denote an index from $\{2, \ldots, k'\}$ s.t. $s_{tail} \in S_i$. By the same argument, we have $s_1 + s_{tail} \notin S_i$.
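
Spelled out, this step uses only the fact that $S_1$ is closed under subtraction:

$$s_1 + s_{tail} \in S_1 \;\Longrightarrow\; s_{tail} = (s_1 + s_{tail}) - s_1 \in S_1,$$

contradicting $s_{tail} \notin S_1$; subtracting $s_{tail}$ instead gives $s_1 + s_{tail} \notin S_i$, since $s_1 \notin S_{tail} \supseteq S_i$.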

However, unless $k' = 2$, we cannot take the final step of the proof. Indeed, we want to show that $s_1 + s_{tail} \notin S_{tail}$, as we will then have identified an element of $V$ (namely $s_1 + s_{tail}$) that is not in $\bigcup_{i=1}^{k'} S_i$, the contradiction that completes our proof.

However, since $S_{tail}$ is simply a union of subspaces of $V$ and not itself a subspace, our previous argument cannot be applied symmetrically to finish the proof, and we are left at an impasse (e.g., it could be that $s_1 + s_{tail} \in S_j$ for some $j \in \{2, \ldots, i-1, i+1, \ldots, k'\}$).

Hence, this proof falls short at the final step. I believe that if this proof by contradiction is to be completed, we will have to exploit the fact that $F = GF(p)$, which we have not touched upon at all so far.

Perhaps we have to exploit the fact that all finite subspaces have orders of the form $n(p-1) + 1$ for some $n \in \mathbb{N}$, or that every nonzero element has additive order $p$. Perhaps we have to split into the cases where $V$ is finitely generated and where it is not... It is worth noting that our argument can be completed if $k' = 2$, as we can then in fact apply it symmetrically. Maybe we can then proceed by induction up to $p$?
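
As a quick sanity check (not a proof), the smallest case can be brute-forced. The script below is purely illustrative (the parameters $p = 2$, $n = 2$ and all names are my own choices, not from the exercise): it confirms that $GF(2)^2$ is not the union of $2$ proper subspaces, although it is the union of $3$.

```python
from itertools import combinations, product

# Smallest case: V = GF(2)^2, i.e. p = 2 (purely illustrative).
p, n = 2, 2
V = set(product(range(p), repeat=n))

def is_subspace(S):
    """Check closure under addition and scalar multiplication mod p."""
    if (0,) * n not in S:
        return False
    for u in S:
        if any(tuple(c * a % p for a in u) not in S for c in range(p)):
            return False
        for v in S:
            if tuple((a + b) % p for a, b in zip(u, v)) not in S:
                return False
    return True

# Enumerate all proper subspaces of V by brute force (tiny cases only).
proper = [set(S)
          for r in range(1, len(V))
          for S in combinations(sorted(V), r)
          if is_subspace(set(S))]

def covers_V(subspaces):
    """True if the union of the given subspaces is all of V."""
    return set().union(*subspaces) == V

print(any(covers_V(c) for c in combinations(proper, p)))      # False
print(any(covers_V(c) for c in combinations(proper, p + 1)))  # True
```

Here any two proper subspaces of $GF(2)^2$ cover at most three of the four vectors, while the three one-dimensional subspaces together cover all of them.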

Could you help me out if you have any ideas or clues (or an alternative way of deriving the result)? Thank you!

Just_a_fool

1 Answer


First, consider the case where $V$ is finite dimensional.

This is actually a simple cardinality argument.

Let $k$ be the finite field in question (it can be any finite field); let its cardinality be $p$. Then we are dealing with a vector space of the form $V = k^n$.

Now a proper subspace of $V$ will be of dimension $< n$, so its cardinality will be at most $p^{n-1}$ (note that we may need to argue separately for $n = 0$, but that case is trivial). So the union of $p$ such proper subspaces will have cardinality at most $p \cdot p^{n-1} = p^n$, and we can only achieve $p^n$ when all these subspaces are pairwise disjoint. But the problem is that $p \geq 2$, and $0$ is an element of every subspace, so equality is not achieved.
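
Counting $0$ only once makes the gap explicit: for $n \geq 1$ and proper subspaces $S_1, \ldots, S_p$ of $V$,

$$\bigl|S_1 \cup \cdots \cup S_p\bigr| \;\leq\; 1 + p\,(p^{n-1} - 1) \;=\; p^n - (p - 1) \;<\; p^n = |V|,$$

so the union always misses at least $p - 1$ vectors.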

Now, we drop the restriction that $V$ is finite dimensional. Consider proper subspaces $S_1, \ldots, S_p \subsetneq V$. For each $i$, choose $a_i \in V \setminus S_i$.

Let $U$ be the subspace of $V$ generated by the $a_i$; note that $U$ is finite dimensional. Then let $R_i = S_i \cap U$ for all $i$. Now each $R_i$ is a proper subspace of $U$ (it misses $a_i$), so by the finite-dimensional case, $U \neq R_1 \cup \cdots \cup R_p$. Then we see that $V \neq S_1 \cup \cdots \cup S_p$.
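
Spelling out the last step: take any $u \in U$ outside $R_1 \cup \cdots \cup R_p$. Then

$$u \in U \quad\text{and}\quad u \notin R_i = S_i \cap U \;\Longrightarrow\; u \notin S_i \quad \text{for each } i,$$

so $u \in V \setminus (S_1 \cup \cdots \cup S_p)$.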

Mark Saving
    This is the finite dim case only. – Arturo Magidin Apr 30 '22 at 17:24
  • @ArturoMagidin Good point. I will edit in a fix to cover the general case momentarily. For some reason, I thought the problem only dealt with finite dimensional spaces. – Mark Saving Apr 30 '22 at 17:32
  • To be fair, so did I, though the OP indicated otherwise in comments. – Arturo Magidin Apr 30 '22 at 19:02
  • Thank you for this answer! I realise I was missing this idea in my thinking: If you have a finitely generated vector space (generated by $n$ vectors) over a finite field of order $p$, then the cardinality of the space is $n^p$. That is correct right? Because I thought some linear combinations of generators might yield some other previously counted terms (leading to double counting) but if that happens, then the generators were dependent and we can assume, from the start, that $n$ is low enough that we have no dependence left! Also - very nice extension of your argument to the infinite case! – Just_a_fool May 01 '22 at 08:28
  • @Just_a_fool cardinality is $p^n$, not $n^p$. Think about coordinate vectors: how many entries, and how many possible values per entry? – Arturo Magidin May 01 '22 at 21:22
  • @ArturoMagidin yup I had it the wrong way around - that makes sense! the total number of different possible linear combinations in this space will be exactly $p^n$ – Just_a_fool May 02 '22 at 08:10