
If we consider a rank 1 tensor $T \in (\mathbb{R}^n)^\ast \otimes \mathbb{R}^n$, then there exist a linear functional $f \in (\mathbb{R}^n)^\ast$ and a vector $u \in \mathbb{R}^n$ such that $T = f \otimes u$. There are two canonical ways to interpret $T$:

1) As a bilinear map $T:\mathbb{R}^n \times (\mathbb{R}^n)^\ast \to \mathbb{R}$ such that $T(x,g) = f(x)\cdot g(u)$.

2) As a linear map $T:\mathbb{R}^n \to \mathbb{R}^n$ such that $T(x) = f(x) \cdot u$.

With respect to the second interpretation, we know it is possible to realize $T$ as an $n \times n$ matrix. In fact, there is a unique vector $v \in \mathbb{R}^n$ associated with $f$, such that $f(x) = v^Tx$ for all $x \in \mathbb{R}^n$. Then the matrix associated to $T$ is $uv^T$, so $T(x) = uv^T \cdot x$ for all $x \in \mathbb{R}^n$. All this is standard in tensor theory.
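For concreteness, here is a small NumPy sketch of this identity (the vectors $u$, $v$, $x$ are chosen arbitrarily for illustration): applying $T$ as $f(x)\cdot u$ agrees with multiplying by the matrix $uv^T$.

```python
import numpy as np

# Illustrative sketch: the functional f is represented by v via
# f(x) = v^T x, and T(x) = f(x) * u equals (u v^T) x.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
x = np.array([7.0, 8.0, 9.0])

fx = v @ x              # f(x) = v^T x
Tx_direct = fx * u      # T(x) = f(x) * u
M = np.outer(u, v)      # the n x n matrix u v^T
Tx_matrix = M @ x       # same result via matrix-vector product

assert np.allclose(Tx_direct, Tx_matrix)
```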

It is also possible to write a generic $T \in V_1 \otimes \cdots \otimes V_k$ in coordinates (each $V_i$ is a real vector space of dimension $\dim V_i = n_i$). Given a basis $e_{i1}, \ldots, e_{in_i}$ of $V_i$ for each $i = 1, \ldots, k$, we define $t_{i_1 i_2\ldots i_k} = T(e_{1i_1}, e_{2i_2}, \ldots, e_{ki_k})$. The $k$-dimensional array formed by the values $t_{i_1 i_2 \ldots i_k}$ is the tensor $T$ in coordinates.
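For a rank 1 tensor, this coordinate array is just an outer product of the coordinate vectors of the factors. A NumPy sketch (dimensions and vectors chosen arbitrarily for illustration, with $k = 3$):

```python
import numpy as np

# Coordinate array of the rank 1 tensor a ⊗ b ⊗ c in V1 ⊗ V2 ⊗ V3
# (dims 2, 3, 4); the vectors are arbitrary illustrative choices.
a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0, 5.0])
c = np.array([6.0, 7.0, 8.0, 9.0])

# t_{i1 i2 i3} = a_{i1} * b_{i2} * c_{i3}: a k-dimensional array (k = 3)
t = np.einsum('i,j,k->ijk', a, b, c)

assert t.shape == (2, 3, 4)
assert np.isclose(t[1, 2, 3], a[1] * b[2] * c[3])  # 2 * 5 * 9 = 90
```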

Now, back to the case $T \in (\mathbb{R}^n)^\ast \otimes \mathbb{R}^n$ with rank 1: the representation $T = uv^T$ as a matrix is clearly the coordinate representation of $T$ in the canonical basis and its corresponding dual basis. The problem is that I tried to construct this explicitly and I always get the transpose of $T$. Let me clarify with an example.

Let $T \in (\mathbb{R}^2)^\ast \otimes \mathbb{R}^2$ such that $$T = \left[\begin{array}{cc} 1 & 2 \end{array}\right] \otimes \left[ \begin{array}{c} 3\\ 4\\ \end{array}\right], $$ where $[1 \ 2]$ stands for a linear functional in $(\mathbb{R}^2)^\ast$ (considered as a row vector). Since $e_1=(1,0), e_2=(0,1)$ is the canonical basis of $\mathbb{R}^2$, its dual basis is given by $dx_1, dx_2$, where $dx_1(x,y) = x$ and $dx_2(x,y) = y$.

Supposedly the coordinates of $T$ are $$t_{11} = T(e_1, dx_1) = 3$$ $$t_{12} = T(e_1, dx_2) = 4$$ $$t_{21} = T(e_2, dx_1) = 6$$ $$t_{22} = T(e_2, dx_2) = 8.$$

Then we conclude the matrix representation of $T$ is

$$\left[\begin{array}{cc} 3 & 4\\ 6 & 8\\ \end{array}\right],$$ which is not correct since

$$T = \left[ \begin{array}{c} 3\\ 4\\ \end{array}\right] \left[\begin{array}{cc} 1 & 2 \end{array}\right] = \left[ \begin{array}{cc} 3 & 6\\ 4 & 8\\ \end{array}\right].$$
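Indeed, a quick numerical check (using NumPy purely for illustration) confirms that the coordinate array I computed is exactly the transpose of $uv^T$:

```python
import numpy as np

v = np.array([1.0, 2.0])  # represents the functional f = [1 2]
u = np.array([3.0, 4.0])  # the vector u = (3, 4)^T

# Coordinates t_{ij} = T(e_i, dx_j) = f(e_i) * dx_j(u) = v[i] * u[j]
t = np.array([[v[i] * u[j] for j in range(2)] for i in range(2)])
# t == [[3, 4], [6, 8]]

# Matrix of the linear map T(x) = f(x) * u is u v^T
M = np.outer(u, v)
# M == [[3, 6], [4, 8]]

assert np.allclose(t, M.T)  # the coordinate array is the transpose of M
```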

Why is this happening, and how can I fix it? Thank you very much.

Integral
  • taking $T\in({\Bbb R}^n)^\ast\otimes{\Bbb R}^n$ is a rank two (mixed) tensor. Only elements from $({\Bbb R}^n)^\ast$ or ${\Bbb R}^n$ are rank one tensors. – janmarqz Jun 19 '17 at 15:51
  • Can't see how this is helpful. – Integral Jun 19 '17 at 16:38
  • see my comment at https://math.stackexchange.com/questions/2327650/examples-of-studying-multilinear-maps-via-linear-maps/2328693#2328693 to clear doubts about the rank of a tensor – janmarqz Jun 19 '17 at 17:46
  • @janmarqz I'm not concerned about the rank, but the representation of the tensor as a matrix. Did you read my question? I really fail to see how your comments address my doubts. There is no mention at all about rank two tensor, everything here is rank one. – Integral Jun 19 '17 at 22:05
  • well, what definition on tensor's rank are you using? – janmarqz Jun 19 '17 at 23:46
  • Let $V,W$ be vector spaces and let $v \in V, w \in W$ be two vectors. Then $v \otimes w \in V \otimes W$ is a rank 1 tensor by definition. Now let $T \in V \otimes W$ be a generic tensor, the rank of $T$ is the minimum $r \in \mathbb{N}$ such that $T$ can be written as a sum of $r$ rank 1 tensors. – Integral Jun 19 '17 at 23:51
  • i use http://mathworld.wolfram.com/TensorRank.html – janmarqz Jun 19 '17 at 23:57

1 Answer


You need to see $T$ as an element of $\mathbb{R}^2 \otimes (\mathbb{R}^2)^\ast$ to get the right order of indices. In the matrix of a linear operator, the first index corresponds to the target space (column vectors) and the second to the dual of the source space (row vectors).
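Concretely, with the vector factor first the coordinates become $t_{ij} = dx_i(u)\,f(e_j) = u_i v_j$, which reproduces $uv^T$. A NumPy sketch with the question's $u$ and $v$ (the index convention spelled out here is my reading of the answer):

```python
import numpy as np

v = np.array([1.0, 2.0])  # represents the functional f = [1 2]
u = np.array([3.0, 4.0])  # the vector u = (3, 4)^T

# With T viewed in R^2 ⊗ (R^2)^*, the first index runs over the vector
# factor (target space) and the second over the dual factor (source):
# t_{ij} = dx_i(u) * f(e_j) = u[i] * v[j]
t = np.array([[u[i] * v[j] for j in range(2)] for i in range(2)])

assert np.allclose(t, np.outer(u, v))  # recovers u v^T
```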