Let $(b_n)_{n=1}^{\infty}$ be a sequence of real numbers in the interval $[0,1]$. Assume that for every $n,m\geq 1$ it satisfies $$ b_{n+m} \leq \frac{n}{n+m} b_n + \frac{m}{n+m} b_m. \tag{1}\label{eq1} $$ That is, $b_{n+m}$ lies below the convex combination of $b_n$ and $b_m$ with weights $\frac{n}{n+m}$ and $\frac{m}{n+m}$.
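Equivalently, multiplying both sides of \eqref{eq1} by $n+m$, the condition says that $$ (n+m)\, b_{n+m} \;\leq\; n\, b_n + m\, b_m. $$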
Exercise: Show that the sequence $(b_n)_{n=1}^{\infty}$ converges. Moreover, it converges to $b := \inf_{n\geq 1} b_n$.
Ideas: My first attempt was to show that the sequence is Cauchy. For that I take $n > m$ and apply \eqref{eq1} to the split $n = m + (n-m)$, which gives $$ b_n \leq \frac{m}{n} b_m + \frac{n-m}{n} b_{n-m}, $$ but this only yields an upper bound for $b_n - b_m$ (not a very useful one, I think) and says nothing about $b_m - b_n$, whereas I need to control the absolute value $|b_n - b_m|$.
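Explicitly, subtracting $b_m$ from both sides of that inequality gives $$ b_n - b_m \;\leq\; \frac{n-m}{n}\,\bigl(b_{n-m} - b_m\bigr), $$ which bounds $b_n - b_m$ from above but gives no control from below.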
In a second attempt I noticed that a particular case of \eqref{eq1} (taking $n = m = 2^k$) is that for every $k\geq 0$ we have $b_{2^{k+1}} = b_{2^k + 2^k} \leq b_{2^k}$, so the subsequence $(b_{2^k})_{k=0}^{\infty}$ is decreasing and hence converges to its infimum, call it $b'$ (a priori I only know $b' \geq b$). Then, given an arbitrary $n\geq 1$, I tried to write $n = 2^k + m$ with $0\leq m < 2^k$ and estimate $b_n$ from above and below in terms of $b_{2^k}$. However, every inequality I have tried, such as $b_n\leq 2b_{2^k}$, fails: even when I recursively generate random sequences in Python that satisfy \eqref{eq1}, the bounds I try are sometimes violated (see the sketch below).
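For what it's worth, here is a minimal sketch of the kind of random experiment I mean (the function names and the numerical tolerance are my own choices): it recursively generates a sequence in $[0,1]$ satisfying \eqref{eq1} and reports the indices where the guessed bound $b_n \leq 2 b_{2^k}$, with $2^k \leq n < 2^{k+1}$, fails.

```python
import random

def random_sequence(N, seed=None):
    """Generate b_1, ..., b_N in [0, 1] satisfying
    b_{n+m} <= (n/(n+m)) b_n + (m/(n+m)) b_m for all valid n, m."""
    rng = random.Random(seed)
    b = [None, rng.random()]            # b[0] is a placeholder so indices match subscripts
    for n in range(2, N + 1):
        # b_n may be anything in [0, U], where U is the tightest
        # convex-combination bound over all splits n = m + (n - m).
        U = min(m / n * b[m] + (n - m) / n * b[n - m] for m in range(1, n))
        b.append(rng.uniform(0, U))
    return b

def check_doubling_bound(b):
    """Return the indices n where the guessed bound b_n <= 2 * b_{2^k},
    with 2^k <= n < 2^{k+1}, fails."""
    failures = []
    for n in range(1, len(b)):
        k = n.bit_length() - 1          # largest k with 2^k <= n
        if b[n] > 2 * b[2 ** k] + 1e-12:
            failures.append(n)
    return failures

if __name__ == "__main__":
    for trial in range(20):
        b = random_sequence(2000, seed=trial)
        bad = check_doubling_bound(b)
        if bad:
            print(f"trial {trial}: bound fails at n = {bad[:5]} ...")
```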
Any idea on how to prove this with elementary analysis? Maybe the exercise is wrong and there are counterexamples? Any help is appreciated.