
I have been struggling for weeks trying to prove convergence of a recursive sequence of vectors. I can prove the one-dimensional case using upper and lower bounds, but I can't figure out how to prove convergence in all dimensions. I wonder if I should use the Banach fixed-point theorem, or Cauchy sequences, or something clever like that. But I can only find fairly simple examples on the internet, and as I am not really a mathematician, I am struggling to generalize those simple examples to this more complicated problem.

Here is the problem:

We are given a vector $\vec{a} = [a_1, a_2, \ldots, a_N]$ of constants in the interval $[0, 1]$. We are also given a constant $N \times N$ matrix $C$ whose elements also lie in $[0, 1]$, with unit diagonal, $C_{i,i} = 1$, and symmetric off-diagonal elements, $C_{i,j} = C_{j,i}$.

We now want to find a vector $\vec{x}$ such that another vector $\vec{y}$, defined below, equals $\vec{a}$. I don't think a closed-form solution exists, so we instead define sequences $\vec{x_n}$ and $\vec{y_n}$ and update $\vec{x_n}$ recursively, in the hope that this causes $\vec{y_n}$ to converge to $\vec{a}$.

Let $\vec{x_n}$ be a sequence of vectors whose $N$ elements are written as $\vec{x_n} = [x_{1,n}, x_{2,n}, \ldots, x_{N,n}]$, and let the elements be updated recursively as follows: $$ x_{i,n+1} = a_i \cdot \frac{x_{i,n}}{y_{i,n}} $$ The other sequence $\vec{y_n} = [y_{1,n}, y_{2,n}, \ldots, y_{N,n}]$ is defined from $\vec{x_n}$ as follows: $$ y_{i,n} = \sqrt{ \sum_{j=1}^N x_{i,n} \cdot x_{j,n} \cdot C_{i,j}} $$

I have implemented it in a computer program and tested it on lots of data and it converges very quickly to the correct solution. But I cannot formally prove that it converges.
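For concreteness, here is a minimal sketch of the iteration in Python with NumPy (a simplified stand-in, not my actual program; the starting point $\vec{x_0} = \vec{a}$ and the iteration count are arbitrary choices):

```python
import numpy as np

def iterate(a, C, num_iter=200):
    """Run the update x_{i,n+1} = a_i * x_{i,n} / y_{i,n},
    where y_{i,n} = sqrt(sum_j x_{i,n} * x_{j,n} * C_{i,j})."""
    x = a.astype(float).copy()        # arbitrary positive starting point
    for _ in range(num_iter):
        y = np.sqrt(x * (C @ x))      # y_i = sqrt(x_i * sum_j C_ij * x_j)
        x = a * x / y
    y = np.sqrt(x * (C @ x))          # recompute y for the final x
    return x, y

# Small example instance satisfying the stated assumptions:
a = np.array([0.5, 0.8, 0.3])
C = np.array([[1.0, 0.2, 0.4],
              [0.2, 1.0, 0.1],
              [0.4, 0.1, 1.0]])
x, y = iterate(a, C)
print(np.max(np.abs(y - a)))          # residual; in my tests this is tiny
```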

We can note a few things that may be helpful.

First rewrite $y_{i,n}$ as follows: $$ y_{i,n} = \sqrt{ \sum_{j=1}^N x_{i,n} \cdot x_{j,n} \cdot C_{i,j}} = \sqrt{x_{i,n}^2 + \sum_{j \neq i} x_{i,n} \cdot x_{j,n} \cdot C_{i,j}} \geq \sqrt{x_{i,n}^2 } = x_{i,n} $$ (the inequality holds because all the discarded terms are nonnegative, and the last step uses $x_{i,n} \geq 0$). So we have $y_{i,n} \geq x_{i,n}$, therefore $x_{i,n}/y_{i,n} \leq 1$, and hence: $$ x_{i,n+1} = a_i \cdot \frac{x_{i,n}}{y_{i,n}} \leq a_i $$ So the recursive sequence $\vec{x_n}$ is bounded elementwise between zero and the constant vector $\vec{a}$.
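This elementwise bound is also easy to sanity-check numerically on a random instance (Python/NumPy; the instance is randomly generated under the stated assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 6
x = rng.uniform(0.0, 1.0, N)      # any nonnegative x
C = rng.uniform(0.0, 1.0, (N, N))
C = (C + C.T) / 2                 # symmetric, entries in [0, 1]
np.fill_diagonal(C, 1.0)          # unit diagonal

y = np.sqrt(x * (C @ x))          # y_i = sqrt(sum_j x_i * x_j * C_ij)
print(np.all(y >= x))             # → True: the bound y_i >= x_i holds elementwise
```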

Also note the fixed points of the update: $x_{i,n+1} = x_{i,n}$ holds exactly when $x_{i,n} = 0$ or $y_{i,n} = a_i$, since $a_i \cdot x_{i,n}/y_{i,n} = x_{i,n}$ forces one of the two.

Further note that $y_{i,n}$ is continuous in $\vec{x_n}$, and strictly increasing in $x_{i,n}$ and in each $x_{j,n}$ for $j \neq i$ (provided $x_{i,n} > 0$ and $C_{i,j} > 0$).

How do we prove that $\vec{y_n}$ converges to $\vec{a}$ when $\vec{x_n}$ is updated as above?

Ideally I would like a convergence proof that also works for other definitions of $\vec{y_n}$, so the proof isn't tied to the exact definition used here, but instead uses certain assumptions about the sequence $\vec{y_n}$. But if that is too difficult, then any convergence proof would do.

As I am not really a mathematician, I would greatly appreciate if you carefully explain the steps in your convergence proof.

Furthermore, can we say anything about the convergence time? I have observed that it is extremely fast and only requires a very small number of iterations (e.g. fewer than 10), depending on how close we want $\vec{y_n}$ to be to $\vec{a}$. The number of iterations appears to depend mainly on the required precision rather than on the number of elements $N$, so in practice the running time appears to scale roughly linearly with $N$, since the method approaches the correct solution in just a few iterations.
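To make the iteration-count observation concrete, here is the kind of small experiment I have in mind (Python/NumPy; the problem sizes, tolerance, and random instances are arbitrary choices of mine):

```python
import numpy as np

def iterations_needed(a, C, tol=1e-8, max_iter=1000):
    """Count updates until max_i |y_i - a_i| < tol."""
    x = a.copy()
    for n in range(max_iter):
        y = np.sqrt(x * (C @ x))          # y_i = sqrt(x_i * sum_j C_ij * x_j)
        if np.max(np.abs(y - a)) < tol:
            return n                      # converged after n updates
        x = a * x / y
    return max_iter                       # did not converge within max_iter

rng = np.random.default_rng(1)
counts = []
for N in (10, 100, 1000):
    a = rng.uniform(0.1, 1.0, N)          # keep a_i away from 0
    C = rng.uniform(0.0, 1.0, (N, N))
    C = (C + C.T) / 2                     # symmetric, entries in [0, 1]
    np.fill_diagonal(C, 1.0)              # unit diagonal
    counts.append(iterations_needed(a, C))
    print(N, counts[-1])                  # iteration count per problem size
```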

Thanks very much for your help!
