How does one go about showing that the dimension of the vector space of infinite sequences is uncountable? My approach was to try to exhibit an uncountable, linearly independent set of sequences, which would force any basis to be uncountable. I have tried the following: 1) the sequences $a_n = n^r$, one for each real number $r$; 2) the set of convergent sequences. However, I can't seem to show that either of these gives an uncountable linearly independent set. Any tips? P.S. Other solutions are welcome, but I have read other similar posts and do not understand the solutions.
-
Maybe by contradiction? Suppose there is a countable basis; then, with a diagonal argument, produce an element that cannot be a linear combination of the elements of the basis – edgar alonso Feb 12 '17 at 03:42
-
@edgaralonso that works well once you have a topology with which to make sense of sums like $\sum_{n=1}^\infty \frac{1}{2^n}v_n$ – Ben Grossmann Feb 12 '17 at 03:58
-
I agree the questions are duplicates but is the best approach to close the one that was well received, as a duplicate of something that was closed and downvoted (and not improved)? – Jonas Meyer Feb 12 '17 at 05:37
-
@JonasMeyer Hi, I was browsing through other forum posts and I stumbled across the uncountability of bases of the vector space of all functions $f:\mathbb N\to\mathbb R$. Are both these cases similar? – Jhon Doe Feb 12 '17 at 09:44
-
@JhonDoe: functions from $\mathbb N$ to $\mathbb R$ are the same thing as real valued sequences. They are not only similar, they are identical. Could you share the link? – Jonas Meyer Feb 12 '17 at 16:04
-
@JonasMeyer http://math.stackexchange.com/questions/2140824/zorns-lemma-proof-of-uncountable-basis?noredirect=1#comment4403385_2140824 I asked – Jhon Doe Feb 12 '17 at 16:10
2 Answers
The set of sequences $\{(1,t,t^2,t^3,\ldots):t\in\mathbb R\}$ is a linearly independent set with the cardinality of $\mathbb R.$
One way to show this set is linearly independent is to show that each element is an eigenvector of the "backward shift," with distinct elements having distinct eigenvalues; eigenvectors corresponding to distinct eigenvalues of a linear transformation are always linearly independent.
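Concretely, if $S$ denotes the backward shift, $S(a_0,a_1,a_2,a_3,\ldots)=(a_1,a_2,a_3,a_4,\ldots)$, then
$$S(1,t,t^2,t^3,\ldots)=(t,t^2,t^3,t^4,\ldots)=t\,(1,t,t^2,t^3,\ldots),$$
so each of these sequences is an eigenvector of $S$ with eigenvalue $t$, and distinct values of $t$ give distinct eigenvalues.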
Because the set of all real sequences itself has the same cardinality as $\mathbb R$, and a basis is a subset of the space, this lower bound is in fact the exact dimension.
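If a concrete finite-dimensional check helps, here is a short sketch (assuming Python with NumPy; the particular values of $t$ are an arbitrary illustrative choice) that truncates a few of these sequences to their first $n$ terms and confirms the resulting $n\times n$ Vandermonde matrix has nonzero determinant, i.e. the truncations are already linearly independent in $\mathbb R^n$:

```python
import numpy as np

# A few distinct values of t (any distinct reals would do; these are illustrative).
ts = [0.5, 1.0, 2.0, 3.0, -1.5]
n = len(ts)

# Row i holds the first n terms of the sequence (1, t_i, t_i^2, ...),
# so V is exactly an n x n Vandermonde matrix.
V = np.array([[t**k for k in range(n)] for t in ts])

det = np.linalg.det(V)
print("determinant:", det)  # nonzero because the t's are pairwise distinct
assert abs(det) > 1e-9, "truncated sequences should be linearly independent"
```

Since a Vandermonde determinant with distinct nodes is never zero, every finite subset of $\{(1,t,t^2,\ldots):t\in\mathbb R\}$ is linearly independent, and hence so is the whole family.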

-
One could note that the Vandermonde matrix associated with non-repeating $\alpha_i$ is necessarily invertible – Ben Grossmann Feb 12 '17 at 03:59
-
Yes. If you take any $n$ of these sequences, and even just look at the first $n$ terms in the sequence, you get linearly independent vectors in $\mathbb R^n$. They form a Vandermonde matrix, for which the determinant has a nice formula, and you can read more at Omnomnomnom's link. – Jonas Meyer Feb 12 '17 at 03:59
-
What does the backward shift do with the first element of the sequence? Could you say it takes the inverse of the second element and puts it in the first position? Because that way $(1,t,t^2,t^3,\dots)$ will indeed be an eigenvector with eigenvalue $t$. – Sha Vuklia Dec 16 '18 at 15:57
-
@ShaVuklia: It moves everything to the left one space except the first, which is removed. $(a_0,a_1,a_2,a_3,\ldots)$ is sent to $(a_1, a_2,a_3,a_4,\ldots)$. – Jonas Meyer Jan 08 '19 at 01:06
If I were you, I'd consider the characteristic functions of subsets of $\mathbb{N}$ - that is, sequences of zeroes and ones. Because a basis is only permitted to "cover" other elements using finite linear combinations, you can't reach (for example) a sequence with infinitely many $1$'s using only sequences with finitely many $1$'s.
I think you can probably use a variant of Cantor's diagonalization argument to argue that no countable collection of these characteristic functions spans the rest.
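To spell out the first of these claims: if $v_1,\ldots,v_k$ are sequences of zeroes and ones, each with only finitely many $1$'s, then any finite linear combination
$$c_1v_1+c_2v_2+\cdots+c_kv_k$$
is zero beyond the largest index at which some $v_j$ has a $1$, so it can never equal a characteristic function with infinitely many $1$'s, such as $(1,1,1,\ldots)$.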

-
In the solution to Problem 7 in Halmos's Hilbert space problem book, a closely related idea is used in showing that every infinite dimensional Hilbert space has dimension at least $2^{\aleph_0}$. It is outlined how one can construct a $2^{\aleph_0}$-sized collection of subsets of $\mathbb N$, any two of which have finite intersection. – Jonas Meyer Feb 12 '17 at 04:41
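For reference, one standard way to build such an almost disjoint family (a sketch of a well-known construction; it need not be the one Halmos uses): fix a bijection between $\mathbb N$ and $\mathbb Q$, and for each real number $r$ pick a sequence of distinct rationals converging to $r$; let $A_r\subseteq\mathbb N$ be the set of indices of the rationals in that sequence. If $r\neq s$, then $A_r\cap A_s$ must be finite, since an infinite common part would give a sequence of rationals converging to both $r$ and $s$. This yields $2^{\aleph_0}$ subsets of $\mathbb N$, any two of which have finite intersection.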