
Let $(X, ||\cdot||)$ be a $d$-dimensional normed linear space over $K$.

Let $\beta = \{e_1, e_2, e_3, \ldots, e_d\}$ be a basis of $X$.

Every $x \in X$ has a unique representation of the form

$x = x_1 e_1 + x_2 e_2 + \cdots + x_d e_d \qquad (x_j \in K,\ \forall\, j \in \mathbb{N}_d).$

Then $(x_1, x_2, x_3, \ldots, x_d)$ is called the coordinate vector of $x$ with respect to $\beta$.
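For a quick illustration of the definition (a toy example of my own, not part of the problem): in $X = \mathbb{R}^2$ with $\beta = \{(1,0), (1,1)\}$, the vector $x = (3,5)$ decomposes uniquely as $$x = -2\,(1,0) + 5\,(1,1),$$ so its coordinate vector with respect to $\beta$ is $(-2, 5)$, which differs from its standard coordinates $(3,5)$.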

Question: Given any sequence $(x^{(n)})_{n\in \mathbb{N}}$ in $X$ and a point $x \in X$, show that

$(x^{(n)})$ converges to $x$ (in the norm of $X$) if and only if it converges coordinatewise, i.e. $x_{j}^{(n)} \to x_{j}$ for all $j \in \mathbb{N}_d$.

I have already shown my attempt here

Sourav Ghosh
  • That result would be immediate if your norm were the sup-norm for the chosen basis. The reason it works for other norms too is that all norms on a finite-dimensional real vector space define the same notion of convergence. See Definition 1.3, Theorem 2.1, Lemma 3.1, and Theorem 3.2 in https://kconrad.math.uconn.edu/blurbs/gradnumthy/equivnorms.pdf with $K = \mathbf R$. Note: many references turn Theorem 2.1 into the definition of equivalence of norms, and prove Theorem 3.2 using local compactness of $\mathbf R$ rather than completeness. – KCd Nov 23 '21 at 15:40

1 Answer


Suppose $x^{(n)}_{j} \to x_{j}$ for $j=1,\ldots,d$. Let $x := (x_{1},\ldots,x_{d})$ and notice that: $$x^{(n)} - x = (x_{1}^{(n)}-x_{1})e_{1}+\cdots+(x_{d}^{(n)}-x_{d})e_{d}.$$ Now, we have: $$||x^{(n)}-x|| \le \sum_{j=1}^{d}||(x_{j}^{(n)}-x_{j})e_{j}|| = \sum_{j=1}^{d}|x_{j}^{(n)}-x_{j}|,$$ where I'm assuming the basis is normalized, so the implication follows.
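In more detail (just spelling out the last step): since each $|x_{j}^{(n)}-x_{j}| \to 0$ and the sum has only finitely many terms, $$||x^{(n)}-x|| \le \sum_{j=1}^{d}|x_{j}^{(n)}-x_{j}| \to 0 \quad (n \to \infty),$$ so $x^{(n)} \to x$ in $(X, ||\cdot||)$.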

Conversely, because all norms on a finite dimensional space are equivalent, we can use the Euclidean norm: $$||x||_{2}^{2} := |x_{1}|^{2}+\cdots+|x_{d}|^{2}.$$ Suppose $x^{(n)} \to x := (x_{1},\ldots,x_{d})$. Then, for each fixed $j$: $$|x_{j}^{(n)}-x_{j}|^{2} \le \sum_{k=1}^{d}|x_{k}^{(n)}-x_{k}|^{2} = ||x^{(n)}-x||_{2}^{2},$$ so $x_{j}^{(n)} \to x_{j}$ for every $j$.
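To make the equivalence step explicit (a sketch relying on the standard equivalence-of-norms theorem, e.g. Theorem 3.2 in the notes linked in the comments, not a proof of it): since $\dim X = d < \infty$, there exist constants $c, C > 0$ with $$c\,||y||_{2} \le ||y|| \le C\,||y||_{2} \quad \text{for all } y \in X.$$ Hence $||x^{(n)}-x|| \to 0$ implies $||x^{(n)}-x||_{2} \to 0$, and the displayed inequality then gives $x_{j}^{(n)} \to x_{j}$ for every $j$, which settles the converse for an arbitrary norm on $X$.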

IamWill
  • Thank you, sir. I have already proved the first part, and it matches your proof. For the converse, I had missed the fact that every finite dimensional linear space admits a Euclidean norm (i.e. is an inner product space), after which the rest follows easily. But how does one prove the converse for an arbitrary norm? Any suggestion? – Sourav Ghosh Nov 23 '21 at 16:03