
Let $a_1, b_1 \in \mathbb{Z}$ be arbitrary integers with $a_1 \neq 0$, and define sequences $\{a_n \}_{n \geq 1}$ and $\{b_n \}_{n \geq 1}$ as solutions of the following linear system of recurrences:

$$\begin{cases} a_{n+1} = (b_1 - a_1)a_n + (a_1)b_n \\ b_{n+1} = (-2a_1)a_n + (b_1)b_n \end{cases}$$

I have done the following manipulations, which give a recursive formula for $a_n$. First, observe that the first equation gives us $b_{n} = \dfrac{a_{n+1} - (b_1 - a_1)a_n}{a_1}$. Substituting this into the second equation, we see $$b_{n+1} = (-2a_1)a_n + b_1 \Big(\dfrac{a_{n+1} - (b_1 - a_1)a_n}{a_1}\Big).$$ Reindexing $n + 1 \rightarrow n$, we obtain $$b_n = (-2a_1)a_{n-1} + b_1 \Big(\dfrac{a_{n} - (b_1 - a_1)a_{n-1}}{a_1}\Big).$$ Finally, substituting this back into the first equation, we get $$ a_{n+1} = (b_1 - a_1)a_n + a_1 \Big[(-2a_1)a_{n-1} + b_1 \Big(\dfrac{a_{n} - (b_1 - a_1)a_{n-1}}{a_1}\Big) \Big].$$ Simplifying, we find $$a_{n+1} = (2b_1 - a_1)a_n + (a_1b_1 - b_1^2 - 2a_1^2)a_{n-1}.$$
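As a sanity check, here is a minimal Python sketch (the sample values $a_1 = 1$, $b_1 = 3$ are an arbitrary assumption; any integers with $a_1 \neq 0$ would do) that iterates the original coupled system alongside the derived second-order recurrence and confirms that the $a_n$ values agree:

```python
a1, b1 = 1, 3            # assumed sample values; any integers with a1 != 0 work

# Iterate the original coupled system for (a_n, b_n).
a, b = [a1], [b1]
for _ in range(20):
    a_next = (b1 - a1) * a[-1] + a1 * b[-1]
    b_next = -2 * a1 * a[-1] + b1 * b[-1]
    a.append(a_next)
    b.append(b_next)

# Iterate the derived second-order recurrence for a_n alone,
# seeded with the first two terms a_1, a_2.
c = [a[0], a[1]]
for _ in range(19):
    c.append((2 * b1 - a1) * c[-1] + (a1 * b1 - b1**2 - 2 * a1**2) * c[-2])

print(a == c)            # expected output: True
```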

I suspect, and would like to prove, that $a_n \neq 0$ for all $n \geq 1$ and for all choices of $a_1, b_1 \in \mathbb{Z}$ with $a_1 \neq 0$.

As corrected in the comments, $\lim\limits_{n \rightarrow \infty} \frac{a_{n+1}}{a_n}$ does not appear to be $-3$, so my claim of asymptotic well-behavedness isn't even true. Any suggestions or hints for moving forward are appreciated.

2 Answers


Set $a_0=0$ and $b_0=1$. Then the original recurrence works for $n\ge0$. Write it in matrix form: $$ \begin{pmatrix} a_{n+1} \\ b_{n+1} \end{pmatrix} = \begin{pmatrix} b_1 - a_1 & a_1 \\ -2a_1 & b_1 \end{pmatrix} \begin{pmatrix} a_{n} \\ b_{n} \end{pmatrix} = A \begin{pmatrix} a_{n} \\ b_{n} \end{pmatrix} $$ Then $$ v_{n}= \begin{pmatrix} a_{n} \\ b_{n} \end{pmatrix} = A^n \begin{pmatrix} a_{0} \\ b_{0} \end{pmatrix} = A^n \begin{pmatrix} 0 \\ 1 \end{pmatrix} = A^n v_0 $$ Therefore $$ A^n = \begin{pmatrix} * & a_n \\ * & b_n \end{pmatrix} $$ If $a_n=0$, then $v_0$ is an eigenvector of $A^n$ with eigenvalue $b_n$. The eigenvectors of $A^n$ are the same as the eigenvectors of $A$, and these have nonzero first coordinates, unlike $v_0$.

The calculations were made with WA.
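As a concrete check of this argument (a minimal NumPy sketch; the sample values $a_1 = 1$, $b_1 = 3$ are an assumption, and any integers with $a_1 \neq 0$ should behave the same way), one can verify that the second column of $A^n$ is $(a_n, b_n)$ and inspect the eigenvectors of $A$:

```python
import numpy as np

a1, b1 = 1, 3                         # assumed sample values, with a1 != 0
A = np.array([[b1 - a1, a1],
              [-2 * a1, b1]])

v = np.array([0, 1])                  # v_0 = (a_0, b_0) = (0, 1)
An = np.eye(2, dtype=int)             # A^0
for n in range(1, 11):
    v = A @ v                         # v is now (a_n, b_n)
    An = An @ A                       # An is now A^n
    # The second column of A^n should be exactly (a_n, b_n).
    assert An[0, 1] == v[0] and An[1, 1] == v[1]

# The eigenvectors of A have nonzero first coordinates, unlike v_0 = (0, 1).
eigvals, eigvecs = np.linalg.eig(A.astype(float))
print(eigvals)
print(eigvecs)
```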

lhf

  • Where does this argument break when $a_1=2, b_1=1$? Oh, the eigenvalues for powers may become the same! https://math.stackexchange.com/questions/241764/eigenvalues-and-power-of-a-matrix – lhf Feb 23 '22 at 12:58
  • It seems that the argument breaks exactly when $A$ has zero trace, that is, $a_1=2b_1$. – lhf Feb 23 '22 at 13:04

Write the sequence in matrix form, i.e. let $A_n = \langle a_n, b_n\rangle$ and let $A$ be the matrix with entries $A[1,1] = b_1-a_1$, $A[1,2] = a_1$, $A[2,1] = -2a_1$, and $A[2,2] = b_1$. Then the equations defining the sequence can be written as $A_{n+1} = A A_n$, and by induction one gets $A_{n+1}= A^n A_1$. Now, if $A$ can be diagonalized (which is the case if it has two distinct eigenvalues), say $A = C D C^{-1}$, then we get $A_{n+1} = C D^n C^{-1} A_1$. From this formula one can see that if $\lambda_1$ and $\lambda_2$ are the eigenvalues of $A$, then $a_{n+1}$ is a combination of their powers, i.e. $$a_{n+1} = M\lambda_1^{n} + N\lambda_2^{n}$$ where $M,N$ are some constants. A similar formula holds for $b_n$ (with different constants). So the problem reduces to finding the eigenvalues of $A$, that is, the roots of $(b_1-a_1 - t)(b_1-t) + 2a_1^2 = 0$, i.e. $$t^2 - (2b_1 - a_1)t + (b_1^2 - a_1b_1 + 2a_1^2) = 0.$$ One finds the discriminant of this equation to be $\Delta = (2b_1 - a_1)^2 - 4(b_1^2 - a_1b_1 + 2a_1^2) = -7a_1^2$. Since $a_1 \neq 0$, $\Delta$ is nonzero (in fact negative), so $A$ always has two distinct, complex-conjugate eigenvalues and the formula above applies. The case $a_1 = 2b_1$ (for example $a_1=2$, $b_1=1$) remarked in one of the comments is the case where the trace $2b_1 - a_1$ vanishes, so the eigenvalues are purely imaginary; note that there $a_2 = a_1(2b_1 - a_1) = 0$.
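For a symbolic double-check of the characteristic polynomial and the discriminant, here is a short SymPy sketch (a verification aid only, with $a_1$ and $b_1$ kept as symbols):

```python
import sympy as sp

a1, b1, t = sp.symbols('a1 b1 t')
A = sp.Matrix([[b1 - a1, a1],
               [-2 * a1, b1]])

# Characteristic polynomial det(A - t*I), expanded, and its discriminant in t.
p = sp.expand((A - t * sp.eye(2)).det())
print(p)                        # t**2 - (2*b1 - a1)*t + b1**2 - a1*b1 + 2*a1**2 (up to term order)
print(sp.discriminant(p, t))    # -7*a1**2
```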

Salcio