Write your formulas in matrix form:
$$\begin{bmatrix}a_n\\b_n\end{bmatrix} = \begin{bmatrix}1&1\\r&1\end{bmatrix}\begin{bmatrix}a_{n-1}\\b_{n-1}\end{bmatrix}$$
i.e.
$$w_n=Aw_{n-1}$$
Notice that if you normalize every vector after applying $A$, that is if you consider
$$v_n=\frac{Av_{n-1}}{||Av_{n-1}||}$$
instead of $w_n$, then it generates the same $\{c_n\}$. This is because
- normalization changes the magnitude of the given vector, not its direction
- $A$ is a linear transformation; changing the magnitude of its input doesn't change the direction of its output
- and finally, $c_n$ depends only on the direction of the $n$-th vector.
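As a quick numerical sanity check of the three points above, here is a minimal sketch comparing the unnormalized iteration $w_n$ with the normalized one $v_n$ (assuming $c_n=a_n/b_n$, a hypothetical sample value $r=2$, and the starting vector $w_1=(1,1)^T$, so that $c_1=1$):

```python
import numpy as np

r = 2.0  # hypothetical sample value with r > 1
A = np.array([[1.0, 1.0], [r, 1.0]])

w = np.array([1.0, 1.0])          # w_1; assumed start with c_1 = 1
v = w / np.linalg.norm(w)         # v_1, same direction as w_1

cs_w, cs_v = [], []
for _ in range(20):
    w = A @ w                     # unnormalized iteration
    v = A @ v
    v = v / np.linalg.norm(v)     # normalized (power) iteration
    cs_w.append(w[0] / w[1])      # c_n from w_n
    cs_v.append(v[0] / v[1])      # c_n from v_n

# the two ratio sequences agree to machine precision
print(max(abs(cw - cv) for cw, cv in zip(cs_w, cs_v)))
```

The printed maximum discrepancy is on the order of floating-point round-off, which is the numerical face of "normalization only rescales, and $c_n$ only sees direction."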
This formula for $v_n$ above is exactly what power iteration does. The following quotation is useful (it has been translated to match our chosen names and starting index):

> If $A$ has an eigenvalue that is strictly greater in magnitude than its other eigenvalues, and the starting vector $v_{1}$ has a nonzero component in the direction of an eigenvector associated with the dominant eigenvalue, then a subsequence $\{v_n\}$ converges to an eigenvector associated with the dominant eigenvalue.
Think about the eigenvalues (say $\lambda$) and how they depend on $r$. You will have to consider a quadratic equation in $\lambda$, whose discriminant is of first degree in $r$. By analyzing the two you will find that for $r>1$ there are always two distinct real eigenvalues, and neither of them is $0$ (this will be important in a moment). Apply the appropriate Vieta's formula to show that they cannot sum to $0$. They cannot be equal and they cannot be opposite, so one eigenvalue is strictly greater in magnitude for sure.
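For reference, the computation the hint points to (kept short of solving for the roots, so the limit is still never written down): the characteristic polynomial of $A$ is
$$\det(A-\lambda I)=(1-\lambda)^2-r=\lambda^2-2\lambda+(1-r),$$
with discriminant $\Delta=4-4(1-r)=4r$, indeed of first degree in $r$ and positive for $r>1$, so the two real roots are distinct. By Vieta's formulas, $\lambda_1\lambda_2=1-r\neq0$ and $\lambda_1+\lambda_2=2\neq0$ for $r>1$.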
If the starting non-zero vector had a zero component in the direction of one of the two eigenvectors, then it would itself be (a multiple of) the other eigenvector. Applying $A$ to it would act as multiplication by the corresponding eigenvalue (which is nonzero), so $w_1, w_2, w_3, \dots$ would all have the same direction, $c_1, c_2, c_3, \dots$ would all be equal, and $\{c_n\}$ would trivially converge. In our case, however, $c_1=1$ and $c_2\neq1$. This means $v_1$ has a nonzero component in the direction of each of the two eigenvectors; in particular, in the direction of the eigenvector associated with the dominant eigenvalue.
So the conditions are met and $\{v_n\}$ converges to some eigenvector, whose direction corresponds to the limit of $\{c_n\}$.
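A small numerical experiment illustrates this convergence (a sketch with a hypothetical sample value $r=3$ and starting vector $(1,1)^T$; in the spirit of the argument, the limit value itself is never hard-coded, only the eigenvector property is checked):

```python
import numpy as np

r = 3.0  # hypothetical sample value with r > 1
A = np.array([[1.0, 1.0], [r, 1.0]])

v = np.array([1.0, 1.0])               # v_1, so c_1 = 1
v = v / np.linalg.norm(v)
cs = []
for _ in range(60):
    v = A @ v / np.linalg.norm(A @ v)  # one power-iteration step
    cs.append(v[0] / v[1])             # c_n encodes the direction as a ratio

# consecutive ratios settle down...
print(abs(cs[-1] - cs[-2]))
# ...and the limiting direction is an eigen-direction: A v is parallel to v
print(np.allclose(A @ v, np.linalg.norm(A @ v) * v))
```

The ratio sequence stabilizes and the final $v$ satisfies $Av=\lambda v$ up to round-off, exactly as the quoted power-iteration theorem promises.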
Q: But wait! If $\begin{bmatrix}0\\1\end{bmatrix}$ were the eigenvector associated with the dominant eigenvalue, wouldn't the limit be $\infty$ or $-\infty$?
A: I think it would, but it doesn't matter. Calculate $A\begin{bmatrix}0\\1\end{bmatrix}$ and you will see that it is not an eigenvector.
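Explicitly:
$$A\begin{bmatrix}0\\1\end{bmatrix}=\begin{bmatrix}1&1\\r&1\end{bmatrix}\begin{bmatrix}0\\1\end{bmatrix}=\begin{bmatrix}1\\1\end{bmatrix},$$
which is not a scalar multiple of $\begin{bmatrix}0\\1\end{bmatrix}$, so $\begin{bmatrix}0\\1\end{bmatrix}$ is not an eigenvector of $A$.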
> the only proof that doesn't at some point have the value of the limit in it

My answer proves the limit exists without giving it, too. (Note that $A_2, B_2$ are never calculated there; they are just known to exist, very much like the above doesn't actually calculate the eigenvectors.) – dxiv Sep 01 '17 at 19:57