4

In calculus textbooks we read that a sequence is a function whose domain is the set of positive integers, while in French textbooks we read that it is a function whose domain is the set of non-negative integers. My question is about excluding zero from the definition of a sequence: what is the convention or reason behind that? Thank you for your help!

palio
  • 11,064
  • Maybe relevant: http://math.stackexchange.com/questions/283/is-0-a-natural-number – Matthew Conroy Jan 08 '16 at 19:41
  • 2
    If you intend to count the sequence in any way, start at $1$; that way $x_1, x_2, \ldots, x_n$ consists of exactly $n$ elements. If the sequence is defined by a recurrence $x_{n+1} = f(x_n)$, it's often nicer to start at $0$; that way an element's index tells you how many times it's had $f$ applied to it. – Jack M Jan 08 '16 at 19:53
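A minimal Python sketch of the trade-off described in the comment above; the step function `f` and the seed value are hypothetical placeholders, not anything from the thread:

```python
def f(x):
    # Hypothetical step function, just for illustration.
    return 2 * x + 1

n = 5

# Convention 1: index from 1, so x_1, ..., x_n has exactly n elements.
counted = {i: i * i for i in range(1, n + 1)}
assert len(counted) == n

# Convention 2: index a recurrence from 0, so x[k] is the result of
# applying f to the seed exactly k times.
x = {0: 1}  # hypothetical seed x_0
for k in range(n):
    x[k + 1] = f(x[k])
assert x[3] == f(f(f(x[0])))
```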

2 Answers

5

It doesn't matter; you simply have to choose one definition and use it consistently. Sequences don't even have to start at $0$ or $1$: they can start at any number and can even be indexed by all integers $\mathbb{Z}$. There is also the more general notion of an indexed family.
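For instance, $(2^n)_{n \in \mathbb{Z}}$ is a perfectly good doubly infinite sequence, and an indexed family $(a_i)_{i \in I}$ allows an arbitrary index set $I$.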

Dominik
  • 19,963
5

Both are conventions, less a matter of reason than of custom. Logicians and set theorists count from $0$, as do many programming languages by default (C, Python, ...); other mathematicians prefer to count from $1$, and some programming languages do too by default (e.g. Basic, once upon a time).
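A quick Python sketch of the zero-based default mentioned above (the list here is a made-up example, not anything from the answer):

```python
seq = ["a", "b", "c"]

# Python counts from 0: the first element has index 0.
assert seq[0] == "a"
assert seq[len(seq) - 1] == "c"
```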

Anyone can justify their preference for one convention over the other. I prefer to count from $0$: it makes for a smoother set-theoretic development of the integers, rationals and reals; and counting from $1$ seems an accommodation of lingering medieval suspicions that $0$ isn't really a natural number.

As @Dominik says in his answer, ultimately the choice doesn't matter (the resulting theorems are the same); what matters is to be clear and consistent about which convention you adhere to.

BrianO
  • 16,579