I discovered and proved the following theorem back in high school, and have waited patiently to hear something about it throughout my college career (which is nearing its end; I hope to have finished my doctorate within the year), to no avail. I am hoping that some of you may recognize the theorem and be able to shed some light on its history, or even give me its proper name. Having nothing better to call it, I have referred to it as "the fundamental theorem of discrete calculus" due to its similarity to the fundamental theorem of calculus. Anyway, here is the theorem (presented much more clearly than I did back in high school).
Let $p(x)$ be a polynomial of degree $d$ with the following representation in the binomial basis: $$p(x) = \sum_{i=0}^d a_i \binom{x}{i}$$
Also let $C$ be an arbitrary constant and let $$P(x) = \sum_{i=0}^d a_i \binom{x}{i+1} + C$$
Then $$\sum_{i=a}^b p(i) = P(b+1) - P(a)$$
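(In case it helps, the proof is just Pascal's rule, $\binom{x+1}{i+1} = \binom{x}{i+1} + \binom{x}{i}$, which gives $P(x+1) - P(x) = p(x)$, so the sum telescopes: $$\sum_{i=a}^b p(i) = \sum_{i=a}^b \bigl(P(i+1) - P(i)\bigr) = P(b+1) - P(a)$$)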
The theorem is clearly analogous to the fundamental theorem of calculus, with $P$ playing the role of a "discrete antiderivative" of $p$.
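To make the analogy concrete, here is a quick numerical check in Python (a throwaway sketch of my own; the names `binomial_coeffs` and `P`, and the test case, are just for illustration):

```python
from math import comb

def binomial_coeffs(p, d):
    # Coefficients a_i of p in the binomial basis, computed via iterated
    # forward differences evaluated at 0: a_i = (Delta^i p)(0).
    vals = [p(x) for x in range(d + 1)]
    coeffs = []
    for _ in range(d + 1):
        coeffs.append(vals[0])
        vals = [vals[j + 1] - vals[j] for j in range(len(vals) - 1)]
    return coeffs

def P(x, coeffs):
    # The "discrete antiderivative", taking C = 0.
    return sum(a * comb(x, i + 1) for i, a in enumerate(coeffs))

# Verify the theorem for p(x) = x^2 on [a, b] = [3, 10].
p = lambda x: x * x
coeffs = binomial_coeffs(p, 2)  # [0, 1, 2], i.e. x^2 = C(x,1) + 2*C(x,2)
a, b = 3, 10
assert sum(p(i) for i in range(a, b + 1)) == P(b + 1, coeffs) - P(a, coeffs)
```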
Also of note is that this trivializes many of the induction proofs we go through when first learning induction, such as proving that $$\sum_{i=1}^n i = \binom{n+1}{2}$$
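Indeed, with $p(x) = x = \binom{x}{1}$ the theorem gives $P(x) = \binom{x}{2} + C$, and hence $$\sum_{i=1}^n i = P(n+1) - P(1) = \binom{n+1}{2} - \binom{1}{2} = \binom{n+1}{2}$$ with no induction needed.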
So anything you can tell me about the history of the theorem, or about the subject of "discrete calculus" (if that is even the proper name), would be greatly appreciated. Thanks in advance!
Also, according to Wikipedia, a major application is the numerical approximation of differential equations. But over the years I have used this (or related ideas) several times while working out problems in real analysis, combinatorics, and number theory. So is there a reason this is not taught anywhere except, presumably, in numerical analysis?
– Elliot Feb 22 '14 at 02:17