6

Consider the following statement:

$x_n \to x$ if and only if every subsequence of $x_n$ has a subsequence that converges to $x$.

The $\implies$ direction is clear. A proof of the other direction is given here; it is a proof by contrapositive.

My question is: can it be proved directly? (I obviously tried but couldn't do it)

  • 2
    Contradiction and contrapositive are two different things. – Asaf Karagila Aug 08 '14 at 08:37
  • 1
    If you are talking about sequences in $\mathbb{R}$, then the other part can be done directly if you can show that the subsequence being generated is a Cauchy sequence. And you know that in $\mathbb{R}$ every Cauchy sequence converges. – creative Aug 08 '14 at 08:39
  • @AsafKaragila Right, that was a (stupid) typo. The two words are so similar. –  Aug 08 '14 at 09:58
  • @H.D. I believe I would have to show the original sequence is a Cauchy sequence? Not the subsequence. –  Aug 08 '14 at 10:01
  • Showing that the subsequence is Cauchy is sufficient. – creative Aug 08 '14 at 10:04

2 Answers

1

As Asaf Karagila noted in the comments, the referenced proof is rather indirect. Also note that these proof types are not formally defined, so it is not clear what you will accept as a direct proof and what you won't.

The core argument of the given proof is as follows: a sequence $S$ which does not converge to a point $x$ has a subsequence which completely misses some neighborhood of $x$, and so none of its subsequences can converge to $x$.
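
For concreteness, here is that argument written out (a sketch of mine, assuming a metric space with metric $d$; in a general topological space one replaces the ball $B(x, \varepsilon)$ by a neighborhood): if $x_n \not\to x$, then

$$\exists\, \varepsilon > 0\ \forall N\ \exists n \ge N:\ d(x_n, x) \ge \varepsilon,$$

so choosing such indices $n_1 < n_2 < \cdots$ yields a subsequence $(x_{n_k})$ with $d(x_{n_k}, x) \ge \varepsilon$ for all $k$. Every subsequence of $(x_{n_k})$ also stays outside $B(x, \varepsilon)$, so none of them converges to $x$.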

I'll put down something which looks more like a direct proof. First some definitions. We'll say that a sequence $S$ is eventually in a set $U$ iff $U$ contains a final segment of $S$. And we'll say that $S$ is frequently in $U$ iff it contains a subsequence which is eventually in $U$.
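
In symbols, writing $S = (s_n)$ (this quantifier form is an equivalent rephrasing, not part of the original answer):

$$S \text{ is eventually in } U \iff \exists N\ \forall n \ge N:\ s_n \in U,$$

$$S \text{ is frequently in } U \iff \forall N\ \exists n \ge N:\ s_n \in U.$$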

Note that for each $S$ and $U$ exactly one of the following cases holds:

  • $S$ is eventually in $U$, so it's frequently in $U$ but it's not frequently in $U^c$.
  • $S$ is eventually in $U^c$, so it's frequently in $U^c$ but not in $U$.
  • $S$ is frequently both in $U$ and $U^c$.
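
A quick justification of this (my wording, not part of the original answer): $S$ cannot be eventually in both $U$ and $U^c$, since a common final segment would have to lie in $U \cap U^c = \emptyset$. And if $S$ is not frequently in $U$, then only finitely many terms of $S$ lie in $U$, so $S$ is eventually in $U^c$; symmetrically with $U$ and $U^c$ swapped. These two observations give both the exhaustiveness and the mutual exclusivity of the three cases.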

Also note that $S \to x \iff S$ is eventually in every neighborhood of $x$.

Now observe that each of the following statements implies the next (the two steps that use the trichotomy are justified right after the list):

  1. Every subsequence of $S$ contains a subsequence converging to $x$.
  2. Every subsequence of $S$ contains a subsequence which is eventually in every neighborhood of $x$.
  3. Every subsequence of $S$ is frequently in every neighborhood of $x$.
  4. $S$ is not frequently in the complement of any neighborhood of $x$.
  5. $S$ is eventually in every neighborhood of $x$.
  6. $S$ converges to $x$.
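
Here is one way to fill in the two steps that rely on the trichotomy (my phrasing, not part of the original answer). Fix a neighborhood $U$ of $x$. For $3 \Rightarrow 4$: by step 3 every subsequence of $S$ is frequently in $U$, so by the trichotomy no subsequence of $S$ is eventually in $U^c$, which by the definition of "frequently" means that $S$ is not frequently in $U^c$. For $4 \Rightarrow 5$: since $S$ is not frequently in $U^c$, the trichotomy leaves only its first case, so $S$ is eventually in $U$.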

Is this a direct proof for you?

user87690
  • 9,133
0

It is unclear whether you mean a metric or a topological space. I will assume (from the link) you mean metric.
I think I have proved this before in my analysis course, but not directly.

Suppose every convergent subsequence we can construct in this manner belongs to the set $A$. Take the union $B = \bigcup_{(x_{n_k}) \in A} \{x_{n_k} : k \in \mathbb{N}\}$, i.e. the set of all elements which belong to a subsequence of $x_n$ that converges to $x$. Let $X$ be the set of all elements of $x_n$. Clearly $X \setminus B$ is finite: if we suppose not, then we can find another subsequence of $x_n$ with elements in $X \setminus B$, which has a subsequence that converges to $x$. That subsequence belongs to $A$, so its elements belong to $B$, so they don't belong to $X \setminus B$, a contradiction. It is now sufficient to show that the subsequence of $x_n$ consisting of all elements in $B$ converges to $x$ (as there are only finitely many elements of $x_n$ not in this subsequence). Call it $x_{n_k}$. It satisfies all the conditions that $x_n$ satisfied, so we can repeat this process to get $x_{n_{k_2}}$, etc.

This is the subtle part. If, repeating this process, we never get a subsequence of $x_n$, say $x_{n_{k_r}}$, with $x_{n_{k_r}} = x_{n_{k_{r-1}}}$ (as sequences), then, ad infinitum, we get an infinite set of points in $X$ which must be members of $X \setminus B$. Contradiction. So for some $r$ we do have $x_{n_{k_r}} = x_{n_{k_{r-1}}}$; for this $r$, $x_{n_{k_r}}$ is a sequence with the following properties: everything that $x_n$ had, plus the fact that the subsequences of its subsequences which converge to $x$ cover $x_{n_{k_r}}$ (interpreted intuitively, as with the sets presented before). So we can find a finite subcover (OK, I'm assuming compactness too). Hence $\forall \epsilon > 0\ \exists N_1, \dots, N_k$ after which these subsequences converging to $x$ are within $\epsilon$ of $x$ (the notation gets out of hand here, so I'm just being wordy; you know what I mean). Then for $n \geq \max(N_1, \dots, N_k)$, the $n$th element onwards of $x_{n_{k_r}}$ is within $\epsilon$ of $x$. As there are finitely many elements of $x_n$ not in $x_{n_{k_r}}$, the same holds true for $x_n$, i.e. it converges to $x$.
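
To make the final estimate precise under the answer's assumptions (this is my sketch, not part of the original answer, writing $d$ for the metric): suppose a tail of $x_{n_{k_r}}$ is covered by finitely many subsequences $y^{(1)}, \dots, y^{(k)}$ of $x_n$, each converging to $x$. Given $\epsilon > 0$, for each $j$ choose $N_j$ so large that every term of $y^{(j)}$ whose index in the original sequence is at least $N_j$ lies within $\epsilon$ of $x$ (possible since $y^{(j)} \to x$ and its original indices are strictly increasing). Then every term $x_n$ of that tail with $n \ge \max(N_1, \dots, N_k)$ lies in some $y^{(j)}$ and hence satisfies $d(x_n, x) < \epsilon$; since only finitely many terms of $(x_n)$ are missing from $x_{n_{k_r}}$, enlarging the threshold if necessary gives $d(x_n, x) < \epsilon$ for all sufficiently large $n$, i.e. $x_n \to x$.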

Note: I am out of time so if someone could help with the notation and explanation I'd be hugely grateful :).

ShakesBeer
  • 3,641