
What is the proof of, or where might I find a proof of, the following theorem?

Let $c_1, c_2,..., c_k$ be real numbers. Suppose that the characteristic equation

$$r^k-c_1 r^{k-1}-...-c_k=0$$ has $k$ distinct roots $r_1, r_2,..., r_k$. Then a sequence $\{a_n\}$ is a solution of the recurrence relation

$$a_n=c_1a_{n-1}+c_2a_{n-2}+...+c_ka_{n-k}$$

if and only if

$$a_n=\alpha_1r_1^n+\alpha_2r_2^n+...+\alpha_kr_k^n$$

for $n = 0,1,2...$, where $\alpha_1,\alpha_2,...,\alpha_k$ are constants.
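As a concrete sanity check of the theorem (my example, not part of the question), the Fibonacci recurrence $a_n = a_{n-1} + a_{n-2}$ has characteristic equation $r^2 - r - 1 = 0$ with distinct roots $(1 \pm \sqrt 5)/2$, and solving for $\alpha_1, \alpha_2$ from the initial values recovers Binet's formula:

```python
# Sanity check of the theorem on the Fibonacci recurrence a_n = a_{n-1} + a_{n-2}:
# characteristic equation r^2 - r - 1 = 0, distinct roots r1, r2, and the
# constants alpha_1, alpha_2 are fixed by the initial values a_0 = 0, a_1 = 1.
from math import sqrt, isclose

r1 = (1 + sqrt(5)) / 2
r2 = (1 - sqrt(5)) / 2
# alpha_1 + alpha_2 = a_0 = 0 and alpha_1 r1 + alpha_2 r2 = a_1 = 1
alpha1, alpha2 = 1 / sqrt(5), -1 / sqrt(5)

fib = [0, 1]
for n in range(2, 30):
    fib.append(fib[-1] + fib[-2])

for n, f in enumerate(fib):
    assert isclose(alpha1 * r1**n + alpha2 * r2**n, f, abs_tol=1e-6)
print("closed form matches the recurrence")
```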

2 Answers


A nice proof uses generating functions. Say your recurrence is:

$\begin{align*} c_k a_{n + k} + c_{k - 1} a_{n + k - 1} + \dotsb + c_0 a_n = 0 \end{align*}$

with given values for $a_0, \dotsc, a_{k - 1}$. Define the generating function $A(z) = \sum_{n \ge 0} a_n z^n$, multiply the recurrence by $z^n$, and sum over $n \ge 0$. Recognizing the resulting sums in terms of $A(z)$, you have:

$\begin{align*} c_k \frac{A(z) - a_0 - a_1 z - \dotsb - a_{k - 1} z^{k - 1}}{z^k} + c_{k - 1} \frac{A(z) - a_0 - a_1 z - \dotsb - a_{k - 2} z^{k - 2}}{z^{k - 1}} + \dotsb + c_0 A(z) &= 0 \end{align*}$

Multiply through by $z^k$ and solve for $A(z)$ to get:

$\begin{align*} A(z) &= \frac{p(z)}{c_0 z^k + c_1 z^{k - 1} + \dotsb + c_k} \end{align*}$

for some polynomial $p(z)$ of degree at most $k - 1$ that depends on the initial values. Note that the denominator is the characteristic polynomial of the recurrence with its coefficients reversed, so its zeros are the reciprocals of the roots of the characteristic equation. From calculus you know that this can be written in partial fractions:

$\begin{align*} A(z) &= \sum_r \frac{A_r}{(1 - \alpha_r z)^{m_r}} \end{align*}$

where $\alpha_r$ is a root of the characteristic equation of multiplicity $m_r$ (so $1 / \alpha_r$ is a zero of the denominator of multiplicity $m_r$). Now, by the extended binomial theorem:

$\begin{align*} (1 - \alpha z)^{-m} &= \sum_{n \ge 0} (-1)^n \binom{-m}{n} (\alpha z)^n \\ &= \sum_{n \ge 0} \binom{m + n - 1}{m - 1} \alpha^n z^n \end{align*}$

But the binomial coefficient $\binom{m + n - 1}{m - 1}$ is just a polynomial in $n$ of degree $m - 1$. Thus the solution is a sum of terms of the form $\alpha^n$ multiplied by a polynomial in $n$ of degree at most $m - 1$ if $\alpha$ is a zero of multiplicity $m$.
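The multiplicity statement can be checked numerically on a small example of my own: $a_n = 4a_{n-1} - 4a_{n-2}$ has characteristic equation $r^2 - 4r + 4 = 0$ with the double root $r = 2$, so the solution is $(A + Bn)\,2^n$, a polynomial of degree $m - 1 = 1$ in $n$ times $2^n$:

```python
# Repeated-root case: a_n = 4 a_{n-1} - 4 a_{n-2}, characteristic equation
# r^2 - 4r + 4 = (r - 2)^2, double root r = 2.  With a_0 = 1, a_1 = 6 the
# constants are A = 1, B = 2 (from a_0 = A and a_1 = (A + B) * 2).
a = [1, 6]
for n in range(2, 20):
    a.append(4 * a[-1] - 4 * a[-2])

A, B = 1, 2
for n, an in enumerate(a):
    assert (A + B * n) * 2**n == an
print("degree-1 polynomial times 2^n matches")
```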

If any root turns out to be complex then, since the coefficients are real, the roots come in complex conjugate pairs. Writing such a root in polar form as $\alpha = \rho \exp(i \omega)$ you get $\alpha^n = \rho^n \exp(i n \omega)$, giving rise to real terms of the form $\rho^n \cos(\omega n)$ and $\rho^n \sin(\omega n)$: the contributions of a conjugate pair combine so that the imaginary parts cancel out.
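A small illustration of the complex-root case (again my own example): $a_n = 2a_{n-1} - 2a_{n-2}$ has characteristic roots $1 \pm i = \sqrt{2}\,e^{\pm i\pi/4}$, so the real solution has the form $\rho^n\bigl(A\cos(n\omega) + B\sin(n\omega)\bigr)$ with $\rho = \sqrt 2$, $\omega = \pi/4$:

```python
# Complex-root case: a_n = 2 a_{n-1} - 2 a_{n-2}, characteristic equation
# r^2 - 2r + 2 = 0, roots 1 ± i, i.e. rho = sqrt(2), omega = pi/4.
# With a_0 = 1, a_1 = 2: A = 1 (from a_0) and B = 1 (from a_1 = A + B).
from math import sqrt, cos, sin, pi, isclose

a = [1, 2]
for n in range(2, 25):
    a.append(2 * a[-1] - 2 * a[-2])

rho, omega = sqrt(2), pi / 4
A, B = 1, 1
for n, an in enumerate(a):
    assert isclose(rho**n * (A * cos(n * omega) + B * sin(n * omega)), an, abs_tol=1e-6)
print("rho^n (A cos + B sin) matches the recurrence")
```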

vonbrand

This can be shown with matrices. Stack $k$ consecutive terms into a state vector $\overrightarrow v_n = (a_{n+k-1}, \dotsc, a_n)^T$; the recurrence then becomes the first-order system

$$\overrightarrow v_{n+1}= \mathbf M \cdot \overrightarrow v_n$$

where $\mathbf M$ is the companion matrix of the recurrence. We know the solution of this system:

$$\overrightarrow v_n=\mathbf M^n \cdot \overrightarrow v_0$$

The power $\mathbf M^n$ can be evaluated by diagonalizing $\mathbf M$, and the eigenvalues of $\mathbf M$ are exactly the roots of the characteristic equation; this is why the roots appear in the solution.
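A hedged sketch of this matrix argument, using Fibonacci as the example (the names and the choice of example are mine, not part of the answer): the state vector $\overrightarrow v_n = (a_{n+1}, a_n)^T$ satisfies $\overrightarrow v_{n+1} = \mathbf M \overrightarrow v_n$ with the companion matrix $\mathbf M$, and diagonalizing $\mathbf M$ evaluates $\mathbf M^n$:

```python
# Matrix form of the Fibonacci recurrence: v_n = (a_{n+1}, a_n), v_{n+1} = M v_n.
# Diagonalizing M gives M^n = P diag(lambda_i^n) P^{-1}; the eigenvalues of the
# companion matrix are the characteristic roots (1 ± sqrt(5))/2.
import numpy as np

M = np.array([[1.0, 1.0],   # a_{n+2} = 1*a_{n+1} + 1*a_n
              [1.0, 0.0]])
v0 = np.array([1.0, 0.0])   # (a_1, a_0) = (1, 0)

eigvals, P = np.linalg.eig(M)             # M = P diag(eigvals) P^{-1}
n = 10
Mn = P @ np.diag(eigvals**n) @ np.linalg.inv(P)
vn = Mn @ v0                              # = M^n v_0 = (a_{n+1}, a_n)

fib = [0, 1]
for _ in range(12):
    fib.append(fib[-1] + fib[-2])
assert round(vn[1]) == fib[n]   # a_10 = 55
```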

Zach466920