2

The map $f:M_n(\Bbb R)\to M_n(\Bbb R)$, $f(X)=XX^t$, has appeared several times on this site, such as here, here and here. I thought of writing out a proof of the continuity of this map. Identify $M_n(\Bbb R)$ with $\Bbb R^{n^2}$.

I tried to prove the continuity sequentially.

So let $(X_n)$ be a sequence in $M_n(\Bbb R)$ converging to $X$, and let $||\cdot||$ be the max norm on $M_n(\Bbb R)$, i.e. $||A||=\max_{i,j} |a_{ij}|.$ Then $\forall \; \delta \gt 0 \; \exists N \in \Bbb N$ such that $||X_n - X|| \lt \delta \; \forall \; n \ge N.$ This also means that $\forall \; \delta \gt 0 \; \exists N \in \Bbb N$ such that $||X_n^t - X^t|| \lt \delta \; \forall \; n \ge N.$

Consider $$||f(X_n)-f(X)||=||X_n X_n^t-XX^t||=||X_n X_n^t - X_n X^t + X_nX^t - XX^t|| \le ||X_n X_n^t - X_n X^t||+ ||X_nX^t-XX^t||= ||X_n(X_n^t-X^t)||+||(X_n-X)X^t||.$$

But I am stuck at the last step. What can I do from here?

I am also interested in knowing other ways to show this.

EDIT: I have found that, for the norm I have used, $||AB|| \not \le ||A||\,||B||$ in general. Take $$A = \left[ \begin{matrix} 1 & 2 \\ 3 & 4 \\ \end{matrix}\right]$$ and $$B = \left[ \begin{matrix} 4 & 5 \\ 6 & 7 \\ \end{matrix}\right].$$ So my approach will not work.
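Indeed, writing out the product for these two matrices shows that submultiplicativity fails for the max norm:

$$AB = \left[ \begin{matrix} 1 & 2 \\ 3 & 4 \\ \end{matrix}\right]\left[ \begin{matrix} 4 & 5 \\ 6 & 7 \\ \end{matrix}\right] = \left[ \begin{matrix} 16 & 19 \\ 36 & 43 \\ \end{matrix}\right], \qquad ||AB|| = 43 \gt 28 = 4 \cdot 7 = ||A||\,||B||.$$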

I mention this because I was hoping to use $||X_n(X_n^t-X^t)|| \le ||X_n||\,||X_n^t-X^t||$, but with this norm that estimate is not available, so my plan is defeated.

Error 404
  • 6,006

2 Answers

6

Here's a tricky solution. Consider the function

$$F:\mathbb{M}_n(\mathbb{R})\otimes \mathbb{M}_n(\mathbb{R})\to \mathbb{M}_n(\mathbb{R})$$ $$F(A\otimes B)=AB^{t}$$

where $\otimes$ stands for the tensor product of vector spaces. This function is well defined by the universal property of the tensor product, since $(A,B)\mapsto AB^t$ is bilinear, and it is linear, hence continuous (the spaces are finite-dimensional).
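In diagram form, $F$ is the linear map through which the bilinear map $(A,B)\mapsto AB^{t}$ factors:

$$\mathbb{M}_n(\mathbb{R})\times \mathbb{M}_n(\mathbb{R})\;\xrightarrow{\;\otimes\;}\;\mathbb{M}_n(\mathbb{R})\otimes \mathbb{M}_n(\mathbb{R})\;\xrightarrow{\;F\;}\;\mathbb{M}_n(\mathbb{R}),\qquad (A,B)\;\longmapsto\;A\otimes B\;\longmapsto\;AB^{t}.$$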

Now consider

$$G:\mathbb{M}_n(\mathbb{R})\to\mathbb{M}_n(\mathbb{R})\otimes \mathbb{M}_n(\mathbb{R})$$ $$G(A)=A\otimes A$$

This function, although not linear, is also continuous:

Proof. Fix a basis $\{e_1,\ldots, e_m\}$ for $\mathbb{M}_n(\mathbb{R})$ (here $m=n^2$). It follows that

$$G(v)=G\big(\sum_i\lambda_i e_i\big)=\Big(\sum_i\lambda_i e_i\Big)\otimes \Big(\sum_j\lambda_j e_j\Big)=\sum_{i, j}\lambda_i\lambda_j (e_i\otimes e_j)$$

Define $\pi_{i,j}:\mathbb{M}_n(\mathbb{R})\otimes \mathbb{M}_n(\mathbb{R})\to\mathbb{R}$ to be the $(i,j)$ coordinate projection, i.e. the linear map with $\pi_{i,j}(e_i\otimes e_j)=1$ and $\pi_{i,j}(e_k\otimes e_l)=0$ for $(k,l)\neq(i,j)$. It follows that

$$\pi_{i,j}\circ G:\mathbb{M}_n(\mathbb{R})\to\mathbb{R}$$ $$\pi_{i,j}\circ G\big(\sum_k\lambda_k e_k\big)=\lambda_i\lambda_j$$ In other words, $\pi_{i,j}\circ G$ is just the multiplication of two chosen coordinates. This is continuous because multiplication on $\mathbb{R}$ is. Hence, since every coordinate of $G$ is continuous, $G$ is continuous by the definition of the product topology. $\Box$
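To illustrate the coordinate argument, in the toy case of a two-dimensional space spanned by $e_1,e_2$ the expansion reads

$$G(\lambda_1 e_1+\lambda_2 e_2)=\lambda_1^2\,(e_1\otimes e_1)+\lambda_1\lambda_2\,(e_1\otimes e_2)+\lambda_2\lambda_1\,(e_2\otimes e_1)+\lambda_2^2\,(e_2\otimes e_2),$$

so each coordinate of $G$ is one of the continuous maps $(\lambda_1,\lambda_2)\mapsto\lambda_i\lambda_j$.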

Now it is easy to see that $f=F\circ G$: indeed $F(G(A))=F(A\otimes A)=AA^{t}=f(A)$. Hence $f$ is continuous as a composition of continuous functions.

freakish
  • 42,851
  • I am upvoting this answer because it uses something different that I am not aware of. Once I learn tensor products, I will come back to this answer. +1 – Error 404 Sep 27 '17 at 14:53
  • @GEdgar who says it is superior? What does it even mean? – freakish Sep 27 '17 at 14:55
  • Is it more than just a disguised way of saying, "the entries are polynomials"? – GEdgar Sep 27 '17 at 14:59
  • @GEdgar with a proof. I'm not sure what the point of your comments is. – freakish Sep 27 '17 at 15:03
  • @freakish there are many ways to topologize the tensor product of two topological spaces. You should make explicit which one you are using. If you define it to be minimal so that $V \times W \to V\otimes W$ is continuous, then proof is overkill since the diagonal map is always continuous, and so $G$ can be identified with the composition of two continuous functions. – Andres Mejia Sep 27 '17 at 15:04
  • @AndresMejia Obviously any norm topology (the tensor product is finite-dimensional) so that multiplication is continuous. I'm not sure if they coincide. – freakish Sep 27 '17 at 15:08
4

Unfortunately your norm $||A||=\max |a_{ij}|$ is not submultiplicative. Since all norms on $M_n(\Bbb R)$ are equivalent, we can choose a norm which is submultiplicative. Let $||\cdot||$ be such a norm.
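For instance, the operator norm or the Frobenius norm will do; both are submultiplicative:

$$||A||_{op}=\sup\{||Ax|| : ||x||=1\},\qquad ||A||_F=\Big(\sum_{i,j} a_{ij}^2\Big)^{1/2},\qquad ||AB||\le ||A||\,||B|| \ \text{ in either case}.$$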

Then we have

$||X_n(X_n^t-X^t)||+||(X_n-X)X^t|| \le ||X_n|| \cdot||X_n^t-X^t||+||X_n-X|| \cdot ||X^t||$.

Can you take it from here? Observe that $(||X_n||)$ is bounded.
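For completeness, one way to finish using only this observation and the equivalence of norms: since $(X_n)$ converges it is bounded, say $||X_n|| \le C$ for all $n$, and $||X_n^t - X^t|| \to 0$ as noted in the question, so

$$||f(X_n)-f(X)|| \le ||X_n||\cdot||X_n^t-X^t||+||X_n-X||\cdot||X^t|| \le C\,||X_n^t-X^t||+||X^t||\cdot||X_n-X|| \longrightarrow 0.$$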

Fred
  • 77,394
  • 1
    Okay. I would like to think about a different norm for a while. Thanks! And yes, as $(||X_n||)$ is convergent, it is bounded. – Error 404 Sep 27 '17 at 13:32
  • 2
    An advantage of this proof is: it will also work for operators on infinite-dimensional normed spaces. – GEdgar Sep 27 '17 at 15:05
  • I used this norm: $||A||=\sup \{||Ax|| : ||x||=1\}$. It is sub-multiplicative! As $(||X_n||)$ is bounded, $(X_n)$ is essentially a bounded sequence of linear operators. Also, I found that what I am showing is just the "joint continuity" of the multiplication operation. – Error 404 Sep 29 '17 at 16:03
  • @GEdgar Thanks for this generalized comment! +1 – Error 404 Sep 29 '17 at 16:04