
Currently I'm studying Tensors applied to Multilinear Forms. To understand the principles, let's work through an example of tensors applied to Bilinear Forms using Einstein notation.

Let $B(x,y)$ be a symmetric Bilinear Form $B: V \times V \to K$, where $\dim(V)=n$ and $x,y$ are expressed in the canonical basis.

Recall that $B(x,y)=x^T M_B y$, so the components of $x$ appear as subscripts in the notation. Now, bilinearity gives us the following reduction: $$B(x,y)=B(\sum_{i=1}^n x^i e_i, \sum_{j=1}^n y^j e_j)=\sum_{ij}^n x_i y^j B(e_i, e_j) = \sum_{ij}^n x_i y^j B^i_j$$
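As a quick numerical sanity check of this reduction, here is a minimal NumPy sketch (the names `M_B` and `bilinear_form` are mine, purely for illustration and not part of the notation under discussion): the matrix entries are $(M_B)_{ij} = B(e_i, e_j)$, and evaluating $x^T M_B y$ agrees with the double sum over components.

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)

# Symmetric matrix representing B in the canonical basis: (M_B)_{ij} = B(e_i, e_j)
A = rng.standard_normal((n, n))
M_B = A + A.T

def bilinear_form(x, y):
    """Evaluate B(x, y) as x^T M_B y."""
    return x @ M_B @ y

x = rng.standard_normal(n)
y = rng.standard_normal(n)

# Expand by bilinearity: sum over i, j of x_i * y_j * B(e_i, e_j)
expanded = sum(x[i] * y[j] * M_B[i, j] for i in range(n) for j in range(n))

assert np.isclose(bilinear_form(x, y), expanded)
```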

What if $x,y$ are taken in another basis $P=\{p_1,\ldots,p_n\}$?

$$B(x,y) = \sum_{ij}^n x_i y^j B(p_i,p_j) = \sum_{ij}^n x_i y^j B(\sum_{k=1}^n p^k_i e_k, \sum_{l=1}^n p^l_j e_l)$$ $$=\sum_{ij}^n \sum_{kl}^n x_i y^j p_{i_k} p^l_j B(e_k,e_l) = \sum_{ij}^n\sum_{kl}^n x_i y^j p_{i_k} p^l_j B^k_l$$

Here $p^l_j$ is the $l$-th component of the $j$-th vector of $P$. Moreover, $p_{i_k}$ is the $k$-th component of the $i$-th vector of $P$ taken as a covector (row vector).
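As before, a small numerical sketch may help check the change-of-basis step (again, the variable names are mine and only illustrative): if the columns of a matrix $P$ hold the new basis vectors written in the canonical basis, then $B(p_i, p_j) = \sum_{kl} p^k_i \, p^l_j \, B(e_k, e_l)$, i.e. the matrix of $B$ in the basis $P$ is $P^T M_B P$.

```python
import numpy as np

n = 3
rng = np.random.default_rng(1)
A = rng.standard_normal((n, n))
M_B = A + A.T                     # B(e_k, e_l) in the canonical basis

# Columns of P are the new basis vectors p_j written in the canonical basis,
# so P[k, j] is the k-th component of p_j.
P = rng.standard_normal((n, n))

# Matrix of B in the new basis: B(p_i, p_j) = sum_{k,l} P[k,i] P[l,j] B(e_k, e_l)
M_B_new = P.T @ M_B @ P

i, j = 0, 2
double_sum = sum(P[k, i] * P[l, j] * M_B[k, l] for k in range(n) for l in range(n))
assert np.isclose(M_B_new[i, j], double_sum)

# Coordinates transform the other way: if x = P @ x_new, then
# x^T M_B y equals x_new^T (P^T M_B P) y_new
x_new = rng.standard_normal(n)
y_new = rng.standard_normal(n)
x, y = P @ x_new, P @ y_new
assert np.isclose(x @ M_B @ y, x_new @ M_B_new @ y_new)
```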

Do you find the description along with the notation correct? Is there any problem with the example itself?

kub0x
  • Note that $M_B$ will be symmetric only if $B$ is symmetric. – Berci Jan 11 '21 at 13:18
  • @Berci Thanks, forgot about that. I defined $B$ as a symmetric Bilinear Form without stating it. – kub0x Jan 11 '21 at 13:28
  • 1
    I strongly suggest you begin using Einstein notation. – K.defaoite Jan 11 '21 at 15:10
  • @K.defaoite I considered your suggestion and added the notation for the example. Please comment on whether it is correct. – kub0x Jan 11 '21 at 20:45
  • @kub0x $x$ is a vector and should have indices up. You can also do away with the summation symbols if it is understood that we sum over repeated indices. – K.defaoite Jan 11 '21 at 21:13
  • @K.defaoite isn't $x$ a co-vector so the component indices are subscript? I mean that $B(x,y)=x^T M_B y$ so we take $x$ as a co-vector there. Thanks for your support! – kub0x Jan 11 '21 at 21:21
  • 1
    You said yourself in your post $x\in V$. It would be correct to write $(x^{\mathrm{T}})i$. As it is written, you are performing an index lowering, i.e $$x_i=g{ij}x^{j}$$ – K.defaoite Jan 12 '21 at 01:00
  • 1
    The sort of matrix multiplication you're talking about in your question I have talked about here – K.defaoite Jan 12 '21 at 01:03
  • @K.defaoite Moreover, your post is very clear and helpful. However, I think there's a typo: the tensor product of vector spaces should be $T: V^r \times (V^*)^s$, so the notation is $T^r_s$. Please check https://en.wikipedia.org/wiki/Tensor_(intrinsic_definition) . You seem to be exchanging both $r,s$. I'd like to hear your opinion :) – kub0x Jan 12 '21 at 10:14
  • Check your use of the indexing on $j$; you have used it in the upper position multiple times. –  Jan 12 '21 at 10:33
  • 1
    How can a bilinear form on $V \times V^*$ be symmetric? The equation $B(x, y) = B(y,x)$ no longer makes sense. – Joppy Jan 12 '21 at 10:40
  • @Joppy I tend to get easily confused when learning notation. I always write $B: V \times V \to K$, but as $x$ is a covector in the equation $B(x,y)=x^T M_B y$ I wanted to state that somehow. Thanks for the correction. – kub0x Jan 12 '21 at 11:45
  • 1
    @kub0x Unfortunately that section on Wikipedia is just wrong. You can check their other article on tensors, or Wolfram Mathworld, or almost any other mathematics source. The convention is to write the dual space first. – K.defaoite Jan 12 '21 at 12:57

0 Answers