Let $\{a_{jk}\}$ be an infinite matrix such that the corresponding mapping $$A:(x_j) \mapsto \Big(\sum_{j=1}^\infty a_{ij}x_j\Big)_i$$ is a well-defined linear operator $A:l^2\to l^2$. I need help showing that this operator is bounded. I guess this means I need to check that the unit sphere maps to a bounded set, so I need to find some inequality on the matrix coefficients that lets me write a chain of inequalities and obtain the desired bound. But I don't understand how to get a bound from the operator merely being well defined.
-
I guess you could substitute $1$ for finitely many coordinates, and this shows that $\sum_{i=1}^\infty \big| \sum_{j=1}^N a_{ij}\big|^2 < \infty$, i.e. the sequence $\big(\sum_{j=1}^N a_{ij}\big)_i$ is in $l^2$. I don't have a clue beyond that, though – Jakobian Aug 11 '20 at 11:28
3 Answers
This follows from the fact that a pointwise limit of bounded operators is bounded, which in turn follows from the uniform boundedness principle:
If a sequence of bounded operators converges pointwise, then it is bounded in norm
(Apply it twice: first to the partial-sum functionals $x \mapsto \sum_{j=1}^N a_{ij} x_j$ of each row, which are bounded and converge pointwise by well-definedness, so each row of $A$ is a bounded functional; then to the row-truncations $A_N x := ((Ax)_1, \ldots, (Ax)_N, 0, 0, \ldots)$, which are bounded and converge to $Ax$ in $\ell^2$.)
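As a numerical illustration of the finite-section idea (not part of the proof), here is a sketch using the Hilbert matrix $a_{ij} = 1/(i+j-1)$, a standard example of a matrix that is bounded on $\ell^2$ with norm at most $\pi$ (Hilbert's inequality); the matrix choice and cutoffs are just illustrative assumptions:

```python
import numpy as np

def hilbert_section(N):
    """Upper-left N x N section of the Hilbert matrix a_ij = 1/(i+j-1)."""
    i = np.arange(1, N + 1)
    return 1.0 / (i[:, None] + i[None, :] - 1)

def section_image(x, N):
    """A_N x: apply the N x N finite section to the first N coordinates of x."""
    return hilbert_section(N) @ x[:N]

# x_j = 1/j is in l^2
x = 1.0 / np.arange(1, 2001)

# The finite sections converge pointwise: the first 100 coordinates of A_N x
# stabilize as N grows
v100 = section_image(x, 100)
v1000 = section_image(x, 1000)
diff = np.linalg.norm(v1000[:100] - v100)
print(diff)  # small

# Each section is a bounded operator; their spectral norms stay below pi
norms = [np.linalg.norm(hilbert_section(N), 2) for N in (10, 100, 500)]
print(norms)
```

Uniform boundedness of the sections (here visible as $\Vert A_N \Vert < \pi$ for all $N$) is exactly what the principle supplies in general.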

It is helpful to consider the 'finite rank' case first:
Lemma. Let $(a_n)$ be a sequence of real numbers such that for every $x = (x_n) \in \ell^2$ we have $Ax := \sum_{n = 1}^\infty a_n x_n$ convergent. Then $A : \ell^2 \to \mathbb R$ is a bounded linear map.
Proof: Suppose $A$ is not bounded. Take any sequence of positive real numbers $(M_n)$. We may find $x_1 \in \ell^2$ with $\Vert x_1 \Vert = 1$ and $Ax_1 > M_1$. We may find a finite subsequence $x_1'$ of $x_1$ (i.e. a truncation to finitely many coordinates, the rest set to zero) satisfying $Ax_1' \geq M_1/2$. It is no restriction to assume all $a_n x'_{1, n} \geq 0$. Because $x_1'$ is a subsequence, $\Vert x_1' \Vert \leq \Vert x_1 \Vert \leq 1$. Let $A_1$ be the corresponding subsequence of $(a_n)$. Then $A_1$ is finitely supported, so it defines a bounded functional.
Because $A - A_1$ is also unbounded, we may find $x_2 \in \ell^2$ with $\Vert x_2 \Vert = 1$ and $(A-A_1)x_2 > M_2$. We may find a finite subsequence $x_2'$ of $x_2$ with $(A-A_1) x_2' > M_2/2$ and all $a_n x_{2,n} \geq 0$, and a corresponding subsequence $A_2$ of $A - A_1$. By construction, we may assume that $x_1'$ and $x_2'$ have disjoint supports, so that $(A-A_1) x_2' = A x_2'$.
Continuing this procedure, we find a sequence $(x_n')$ with disjoint supports, $\Vert x_n' \Vert \leq 1$, $A x_n' > M_n/2$ and $a_m x'_{n,m} \geq 0$. By taking the $M_n$ to grow sufficiently fast (e.g. $M_n = n^2$), the element $x := \sum_{n=1}^\infty \frac1{M_n} x_n'$ is in $\ell^2$. But because the $x_n'$ have disjoint supports, and because $Ax$ is a sum of nonnegative terms, it is clear that $$Ax \geq \sum_{n=1}^\infty \frac1{M_n} A x_n' > \sum_{n=1}^\infty \frac12 = \infty \,,$$ so that $A$ is not well-defined. $\square$
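To see what the lemma rules out, take $a_n \equiv 1$: the functional $x \mapsto \sum_n x_n$ is unbounded on the finitely supported sequences, and accordingly there is an $x \in \ell^2$ (e.g. $x_n = 1/n$) on which the sum diverges, so this $A$ is not defined on all of $\ell^2$. A quick numerical sanity check (the cutoffs are arbitrary):

```python
import math

# a_n = 1 for all n; A x = sum_n x_n is linear on finitely supported sequences,
# but unbounded: the normalized vectors (e_1 + ... + e_N)/sqrt(N) give A-values sqrt(N).
def normalized_block_value(N):
    x = [1.0 / math.sqrt(N)] * N      # ||x|| = 1
    return sum(x)                      # = sqrt(N), unbounded in N

# Consistently with the lemma, A fails to be everywhere defined:
# for x_n = 1/n (which is in l^2), the partial sums of A x diverge like log(N).
def partial_sum(N):
    return sum(1.0 / n for n in range(1, N + 1))

print(normalized_block_value(100))   # ≈ 10
print(partial_sum(10**6))            # ≈ log(10^6) + gamma ≈ 14.39
```

So for this $a$, unboundedness and failure of well-definedness go hand in hand, exactly as the lemma predicts.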
Corollary. Let $(a_{ij})$ be a matrix of real numbers with $N$ rows, such that for every $x = (x_j) \in \ell^2$ each of the sums in $Ax := \left(\sum_{j = 1}^\infty a_{ij} x_j \right)_{i=1}^N \in \mathbb R^N$ converges. Then $A : \ell^2 \to \mathbb R^N$ is a bounded linear map.
Proof: Each of the projections of $A$ is bounded by the previous lemma, and this suffices because $\mathbb R^N$ is finite-dimensional. $\square$
Now, for the case of $A : \ell^2 \to \ell^2$.
Proof: Suppose $A$ is not bounded. Take any sequence of positive real numbers $(M_n)$. We may find $x_1 \in \ell^2$ with $\Vert x_1 \Vert = 1$ and $\Vert A x_1 \Vert > M_1$. There then exists a submatrix $A_1$ of $A$ with finitely many nonzero rows, and with $\Vert A_1 x_1 \Vert$ 'very close' to $\Vert A x_1 \Vert$.
By the corollary above, $A_1$ is bounded, so that $A - A_1$ is not bounded. Thus we may find $x_2 \in \ell^2$ with $\Vert x_2 \Vert = 1$ and $\Vert (A-A_1) x_2 \Vert > M_2$, and subsequently a submatrix $A_2$ of $A - A_1$ with finitely many rows and with $\Vert A_2 x_2 \Vert$ 'very close' to $\Vert (A-A_1) x_2 \Vert$.
Continuing this process, we find a sequence $(x_n)$ in $\ell^2$, with $\Vert x_n \Vert = 1$, with $\Vert A_n x_n \Vert$ 'very close' to $\Vert (A - A_1 - A_2 - \ldots - A_{n-1}) x_n \Vert$, and with $\Vert (A - A_1 - A_2 - \ldots - A_{n-1}) x_n \Vert > M_n$. Note how the $A_n$ occupy pairwise disjoint sets of rows by construction, so that the vectors $A_n v$ have disjoint supports for any $v$.
Consider the element $x = \sum_{n=1}^\infty \frac1{M_n} x_n$. When the $M_n$ grow sufficiently fast, this lies in $\ell^2$. Because the $A_n$ occupy disjoint sets of rows, the vectors $A_n x$ are pairwise orthogonal and $\Vert Ax \Vert^2 \geq \sum_{n=1}^\infty \Vert A_n x \Vert^2$, so it suffices to bound each $\Vert A_n x \Vert$ below by a fixed constant. By the triangle inequality, $$\begin{align*}\Vert A_n x \Vert &\geq \frac1{M_n}\Vert A_n x_n \Vert - \left \Vert A_n \sum_{j \neq n} \frac1{M_j} x_j \right\Vert \\ &\geq \frac1{M_n}\Vert A_n x_n \Vert - \sum_{j < n} \frac1{M_j} \Vert A_n x_j \Vert - \Vert A_n \Vert \left \Vert \sum_{j> n} \frac1{M_j} x_j \right\Vert \\ &\geq \frac1{M_n}\Vert A_n x_n \Vert - \sum_{j < n} \frac1{M_j} (\Vert (A - A_1 - A_2 - \ldots - A_{j-1}) x_j \Vert - \Vert A_j x_j \Vert) - \Vert A_n \Vert \left( \sum_{j > n} \frac1{M_j^2}\right)^{1/2} \,. \end{align*}$$ Now we choose the $M_n$ and $A_n$ so that each of these lower bounds is at least $\frac12$: at every step, take $M_{n}$ very large compared to the norms $\Vert A_j \Vert$ for $j < n$. Given this $M_n$, choose $A_n$ so that $\Vert A_n x_n\Vert$ is extremely close to $\Vert (A - A_1 - A_2 - \ldots - A_{n-1}) x_n \Vert$. The negative terms above can then be made arbitrarily small, to get (for example) $$\Vert A_n x \Vert \geq \frac1{2M_n}\Vert A_n x_n \Vert \geq \frac12 \quad\text{for every } n, \qquad\text{so}\qquad \Vert Ax \Vert^2 \geq \sum_{n=1}^\infty \frac14 = \infty \,.$$
More concretely, you could take $M_1 = 1$, $M_n> \max(n, 10n \max_{j < n}\Vert A_j \Vert)$ and $A_n$ such that $\Vert (A - A_1 - A_2 - \ldots - A_{n-1}) x_n\Vert - \Vert A_n x_n \Vert \leq \frac1{100n^2}$. Then $$\begin{align*} &\frac1{M_n}\Vert A_n x_n \Vert - \sum_{j < n} \frac1{M_j} (\Vert (A - A_1 - A_2 - \ldots - A_{j-1}) x_j \Vert - \Vert A_j x_j \Vert) - \Vert A_n \Vert \left( \sum_{j > n} \frac1{M_j^2}\right)^{1/2} \\ & \geq \frac9{10} - \sum_{j < n} \frac1{100j^2} - \left( \sum_{j > n} \frac1{100j^2}\right)^{1/2} \geq \frac12 \end{align*}$$ for every $n$, and this is fine.
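The structural fact driving the construction — submatrices supported on disjoint sets of rows contribute orthogonally, so their squared norms add — can be checked numerically. Below, an arbitrary (randomly generated, purely illustrative) matrix is split into two row blocks $A_1, A_2$, and $\Vert Ax\Vert^2 = \Vert A_1 x\Vert^2 + \Vert A_2 x\Vert^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))   # any matrix; split its rows into two blocks
x = rng.standard_normal(4)

# A1 keeps rows 0-2, A2 keeps rows 3-5 (zero elsewhere): disjoint row supports
A1 = np.zeros_like(A); A1[:3] = A[:3]
A2 = np.zeros_like(A); A2[3:] = A[3:]

lhs = np.linalg.norm(A @ x) ** 2
rhs = np.linalg.norm(A1 @ x) ** 2 + np.linalg.norm(A2 @ x) ** 2
print(lhs, rhs)   # equal: A1 x and A2 x are orthogonal
```

This is the 'row trick' mentioned in the comments: however badly behaved $A$ is, blocks of rows never interfere with each other.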

-
That's quite complicated, and it also doesn't apply to the general case: the proof only works if $l^2$ is over $\mathbb R$. I remember that in my seminars we always worked with $\mathbb C$, so that's quite a drawback. – crush3dice Aug 12 '20 at 07:32
-
Aha. If you're over $\mathbb C$, the proof of the first Lemma needs to be modified a bit. Wherever there's an inequality, take real parts on both sides. You can always assume the real parts are large and positive, by multiplying by $-1$, $i$ or $-i$. This seems to do the job. Do you agree? – Bart Michels Aug 12 '20 at 08:20
-
Yeah, I think that does it. I had just skimmed over it before. I like the row trick to make the $A_nx$ orthogonal; it didn't come to my mind yesterday. I'll definitely remember that one. Though I still think this is too complicated for the question asked. – crush3dice Aug 12 '20 at 09:45
The basic idea is that $A(l^2)$ would not lie in $l^2$ if $A$ were not bounded. The proof:
If $A$ were not bounded, then for every $C>0$ there would exist an $x\in l^2$ with $\|Ax\|>C\|x\|$. This would also hold for all vectors in $\operatorname{span}(x)$, so we can assume $x$ is normalized. Now let us choose
$C_n = n^2$ and find the corresponding normalized $x_n$ so that we have
$$\|Ax_n\| > n^2$$
Let us define a new element $y\in l^2$ like so:
$$ y = \sum_{n=1}^{\infty} \frac{1}{n} x_n $$
Then we have
$$ \|y\|_2^2 \le \sum_n \frac{1}{n^2} < 2 \implies y\in l^2 $$
Let us define $\pi_k$ as the orthogonal projection onto $Ax_k$. Then we get with Pythagoras
$$ \|Ay\| = \Big\|\frac{1}{k}Ax_k + \sum_{n\ne k} \frac{1}{n}Ax_n\Big\| \ge \Big\|\frac{1}{k}Ax_k + \pi_k\Big(\sum_{n\ne k} \frac{1}{n}Ax_n\Big)\Big\| $$
We can then assume without loss of generality, by reversing the sign of $x_k$, that
$$ \Big\|\frac{1}{k}Ax_k + \pi_k\Big(\sum_{n\ne k} \frac{1}{n}Ax_n\Big)\Big\| \ge \frac{1}{k}\|Ax_k\| \ge k $$
for every $k\in \mathbb N$, so $\|Ay\| = \infty$. Therefore $Ay \not\in l^2$ and $A(l^2)\not\subset l^2$, contradicting the assumption.
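Note that the estimate $\|y\|_2^2 \le \sum_n \frac1{n^2}$ implicitly treats the $x_n$ as orthogonal (see the comments below for related objections). A small numerical sketch of both extremes, using standard basis vectors as a simplifying assumption for the orthonormal case:

```python
import numpy as np

N = 1000
coeffs = 1.0 / np.arange(1, N + 1)

# Orthonormal x_n (take x_n = e_n): the bound holds with equality,
# ||sum_n (1/n) e_n||^2 = sum_n 1/n^2  ->  pi^2/6
norm_sq_orth = np.sum(coeffs ** 2)

# Without orthogonality it can fail badly: take every x_n equal to e_1, so
# y = (sum_n 1/n) e_1 and ||y||^2 = (harmonic sum)^2, which is unbounded in N
norm_sq_same = np.sum(coeffs) ** 2

print(norm_sq_orth)   # below pi^2/6 ≈ 1.6449
print(norm_sq_same)   # grows like (log N)^2
```

So the argument does need some control on the mutual positions of the $x_n$ (or of the $Ax_k$), which is what the discussion below is about.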

-
It seems that you are assuming that the $A x_k$ are pairwise orthogonal, which is certainly not always the case. – Bart Michels Aug 11 '20 at 13:57
-
Also, where in your answer are you using that you are working with $\ell^2$? It seems that the 'proof' you have given would apply to any linear map on any Hilbert space. – Bart Michels Aug 11 '20 at 13:58
-
@BartMichels no, I am not assuming that. The only thing I assume is that $\big(Ax_k + \pi_k(\sum_{n\ne k} Ax_n)\big)\perp (\mathrm{Id} - \pi_k) (\sum_{n\ne k}Ax_n)$, which is true. – crush3dice Aug 11 '20 at 14:02
-
@BartMichels it uses that $y\in l^2$ and $A(l^2)\not\subset l^2$ – crush3dice Aug 11 '20 at 14:02