3

There is an answer to the question "Why is the tensor product important when we already have direct and semidirect products?" which states that

they allow you to study certain non-linear maps (bilinear maps) by transforming them first into linear ones, to which you can apply linear algebra; – Mariano Suárez-Álvarez♦

Could anyone give me some concrete (and as basic as possible, without using category theory) examples to show this application?

Please don't just show that there is a linear map induced by the multilinear map. Please show how we can use the linearity of that linear map to obtain information about, or properties of, the multilinear map.

bfhaha
  • 3,721

2 Answers

2

In Cartesian plane geometry, how do you compute the area spanned by two vectors? The answer is via the determinant, which is a bilinear map.

Something similar happens with the calculation of the volume spanned by three vectors in 3D Cartesian space.
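
To make the bilinearity concrete, here is a minimal worked identity (the vectors $u, v, w \in \mathbb{R}^2$ and the scalar $\lambda$ are arbitrary, chosen only for illustration): the signed area satisfies

$$\det(u + \lambda w,\, v) = \det(u, v) + \lambda \det(w, v), \qquad \det(u,\, v + \lambda w) = \det(u, v) + \lambda \det(u, w).$$

By the universal property of the tensor product, this bilinear area map factors through a single linear map $\mathbb{R}^2 \otimes \mathbb{R}^2 \to \mathbb{R}$, so statements about the area become statements of plain linear algebra.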

janmarqz
  • 10,538
  • You mean that we study the determinant (multilinear) by treating it as a linear map? https://math.stackexchange.com/questions/2327143/prove-some-properties-of-determinant-by-the-universal-property-of-the-tensor-pro – bfhaha Jun 19 '17 at 16:16
  • the determinant is linear in each of its entries and has geometric interpretations, but one falls short by studying the determinant only for this reason. – janmarqz Jun 19 '17 at 16:27
  • It seems this doesn't answer the question, but thank you. The example I want is a theorem about a multilinear map which we prove by using the linear map induced by the multilinear map. – bfhaha Jun 19 '17 at 16:37
2

Consider Proposition 2.1.7.1 from Geometry and Complexity Theory by Landsberg.

Proposition 2.1.7.1: $\underline{R}(T) \geq \text{rank}(T_A)$

$T$ is a tensor in $A \otimes B \otimes C$, where $A, B, C$ are finite-dimensional complex vector spaces. $\underline{R}(T)$ is the border rank of $T$ and $\text{rank}(T_A)$ is the rank of the linear map $T_A : A^\ast \to B \otimes C$ induced by the tensor. You can find all these definitions in the text, which is available on the internet.
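
As a quick computational illustration (a sketch in NumPy; the toy tensor and the flattening convention below are my own choices, not notation from the book), the flattening $T_A$ is just the tensor reshaped into a $\dim A \times (\dim B \cdot \dim C)$ matrix, and its rank gives an easy lower bound:

```python
import numpy as np

# Toy tensor T in A ⊗ B ⊗ C with dim A = dim B = dim C = 2,
# stored as a 2x2x2 array T[i, j, k] of coordinates in fixed bases.
T = np.zeros((2, 2, 2))
T[0, 0, 0] = 1.0   # contributes a1 ⊗ b1 ⊗ c1
T[1, 1, 1] = 1.0   # contributes a2 ⊗ b2 ⊗ c2

# The flattening T_A : A* -> B ⊗ C sends the i-th dual basis vector of A*
# to the slice T[i, :, :]; as a matrix it is T reshaped to (dim A, dim B * dim C).
T_A = T.reshape(2, 4)

# Proposition 2.1.7.1: the border rank of T is at least the rank of this matrix.
print(np.linalg.matrix_rank(T_A))  # prints 2, so the border rank of T is >= 2
```

So a hard question about a trilinear object (its border rank) is bounded from below by an ordinary linear algebra computation on the induced linear map.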

Now you might ask why we bother studying border rank at all, but the thing is, border rank is crucial for understanding the complexity of matrix multiplication, which is an important open problem in mathematics. You can read something about the subject here.

Integral
  • 6,554
  • border rank and rank are the same for rank-two tensors, but not for tensors of rank $\ge 3$: https://en.wikipedia.org/wiki/Tensor_rank_decomposition#Border_rank – janmarqz Jun 19 '17 at 17:40
  • 1
    Thanks. I am reading the material. http://www.math.tamu.edu/~jml/simonsclass.pdf – bfhaha Jun 19 '17 at 17:47
  • @janmarqz The tensor of $2 \times 2$ matrix multiplication has rank 7, and this value is higher for $n \times n$ matrix multiplication when $n \geq 3$. – Integral Jun 19 '17 at 22:09
  • any square matrix represents a rank 2 tensor @Integral – janmarqz Jun 19 '17 at 23:49
  • For anyone reading this in the future, the rank definition used by janmarqz is not the same one I'm using. The definition I'm using is the same as in linear algebra, taking the SVD as a starting point. To be precise, let $T:A \to B$ be a linear map and consider $T$ as a matrix. The SVD of $T$ gives us a decomposition $T = U \Sigma V^\ast = \sum_{i=1}^R \sigma_i u_i v_i^\ast$, where $u_i, v_i$ are the $i$th columns of $U$ and $V$, respectively. The number $R$ of factors in the decomposition is unique and is the rank of $T$. This definition of rank agrees with all the other definitions. – Integral Dec 18 '18 at 17:34
  • Given vectors $a \in A$, $b \in B$, $c \in C$, we say the tensor product $a \otimes b \otimes c \in A \otimes B \otimes C$ has rank 1. Let $T$ be a tensor in the same space and suppose we can write $T = \sum_{i=1}^R a_i \otimes b_i \otimes c_i$ for some vectors. If $R$ is the least number of terms for which such a decomposition into rank-1 terms exists, we say $T$ has rank $R$. – Integral Dec 18 '18 at 17:38
  • Finally, I want to add that the other definition is just a bad choice of words. The word rank in the world of linear algebra and matrices already means one exact thing: the definition I gave with the SVD, or the dimension of the range of the linear map, or the number of linearly independent columns of the matrix. These three definitions agree (a small numerical sketch of both notions follows these comments).

    Saying that a square matrix always represents a rank 2 tensor is not healthy. If your matrix happens to have 5 linearly independent columns, are you going to say it is a rank-2 tensor with rank 5? Just use another word.

    – Integral Dec 18 '18 at 17:44
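
To make the two notions in the comments above concrete, here is a small numerical sketch (the matrix $M$ and the vectors below are arbitrary toy choices, not taken from any reference):

```python
import numpy as np

# Matrix rank via the SVD: the rank is the number of nonzero singular values.
M = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 1.0]])
singular_values = np.linalg.svd(M, compute_uv=False)
print(int(np.sum(singular_values > 1e-10)))  # 2, agrees with np.linalg.matrix_rank(M)

# Tensor (CP) rank: T = a1 ⊗ b1 ⊗ c1 + a2 ⊗ b2 ⊗ c2 is a sum of two rank-1
# tensors, so its rank is at most 2 by construction.
a1, b1, c1 = np.array([1.0, 0.0]), np.array([1.0, 0.0]), np.array([1.0, 0.0])
a2, b2, c2 = np.array([0.0, 1.0]), np.array([1.0, 1.0]), np.array([0.0, 1.0])
T = np.einsum('i,j,k->ijk', a1, b1, c1) + np.einsum('i,j,k->ijk', a2, b2, c2)
print(T.shape)  # (2, 2, 2)
```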