
While I was reading some posts (Definition of a tensor for a manifold, and Tensors as matrices vs. Tensors as multi-linear maps), I encountered the following explanation:

"To give a linear map $V \rightarrow V$ is the same as to give a linear map $V^* \otimes V\rightarrow \mathbb{R}$, assuming we're looking at real vector spaces."

Could anybody kindly explain the above sentence in detail with an example? I am not a math major, but I am very interested in tensor analysis. Thank you in advance.

4 Answers


Not as detailed as you'd like, but I'll give a hint.

It means you have a vector space isomorphism $\mathcal{L}(V^*\otimes V, \mathbb R)\simeq \mathcal{L}(V)$. To see this you have to define a linear bijective map $$\Phi:\mathcal{L}(V)\longrightarrow \mathcal{L}(V^*\otimes V, \mathbb R).$$ To each $T\in \mathcal{L}(V)$ you assign $\Phi_T:V^*\otimes V\longrightarrow \mathbb R$ given by: $$\Phi_T(f\otimes v)=f(Tv).$$ So far you have defined $\Phi_T$ only on pure tensors (those of the form $f\otimes v$, with $f\in V^*$ and $v\in V$), so you must extend the definition to all of $V^*\otimes V$. Since you want $\Phi_T$ to be linear, you simply extend it by linearity.

It is an exercise to show that $\Phi$ is linear and bijective.
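As a concrete sanity check, here is a short NumPy sketch (an illustration not in the original answer) that identifies the pure tensor $f\otimes v$ with the rank-one matrix $vf$, so that the linear extension of $\Phi_T$ becomes the trace pairing $A \mapsto \operatorname{tr}(AT)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
T = rng.standard_normal((n, n))   # a linear map T: V -> V
f = rng.standard_normal((1, n))   # a covector f in V* (row vector)
v = rng.standard_normal((n, 1))   # a vector v in V (column vector)

# Phi_T on the pure tensor f (x) v is f(Tv):
lhs = (f @ T @ v).item()

# Identifying f (x) v with the rank-one matrix v f, the linear
# extension of Phi_T is the trace pairing A |-> tr(A T):
A = v @ f
rhs = np.trace(A @ T)

assert np.isclose(lhs, rhs)
```

The agreement of the two values reflects the cyclic identity $\operatorname{tr}(vfT) = fTv = f(Tv)$.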

PtF
  • Thank you very much, PtF and Joe. Well, considering that I am not a math major, the question that I asked seems to be much more complex than I initially thought. In solid mechanics, most tensor quantities (stress, deformation gradient, etc.) are defined using a linear transformation $V \rightarrow V$. However, the most popular definition of a tensor that I find on the web uses a linear transformation $V \otimes V^* \rightarrow \textbf{R}$. Is there any simpler way to get the answer, using only the concepts of vector space and tensor product? – planetmath Mar 03 '14 at 23:48

The other answers have already given an "abstract" answer, so I will just make sure you understand what all of this means for the basic case $V = \mathbb{R}^n$ (and in finite-dimensional linear algebra, that is really all there is anyway!).

Let $L: \mathbb{R}^n \to \mathbb{R}^n$ be a linear map. Let $M$ be the matrix of $L$ with respect to the standard basis. This matrix can act on a column vector by multiplication on the left, $v \mapsto Mv = L(v)$, or it can act on row vectors by multiplication on the right, $w \mapsto wM$. (We can convert a row vector into a column vector, or vice versa, by transposing.) The right-multiplication action on row vectors is called the adjoint of $L$.

Row vectors represent maps $\mathbb{R}^n \to \mathbb{R}$, and so really represent elements of the dual space $V^*$. So the adjoint map really is $L^*: V^* \to V^*$.

We get a bilinear map $V^* \times V \to \mathbb{R}$ by the rule $(w,v) \mapsto w(L(v)) = wMv$. In other words, the bilinear map associated to $L$ is given by just taking a row vector and a column vector, and sandwiching the matrix of $L$ in between them.

To be perfectly explicit about this, if $L:\mathbb{R}^2 \to \mathbb{R}^2$ has the matrix $ \begin{bmatrix} a_{11} &a_{21}\\a_{12}&a_{22} \end{bmatrix} $ then $$L\left(\begin{bmatrix} x_1\\ x_2\end{bmatrix}\right) = \begin{bmatrix} a_{11} &a_{21}\\a_{12}&a_{22} \end{bmatrix}\begin{bmatrix} x_1\\ x_2\end{bmatrix}$$

and

$$ L^*\left(\begin{bmatrix} y_1& y_2\end{bmatrix}\right) = \begin{bmatrix} y_1& y_2\end{bmatrix} \begin{bmatrix} a_{11} &a_{21}\\a_{12}&a_{22} \end{bmatrix} $$

The bilinear map $B$ is given by

$$ B\left(\begin{bmatrix} y_1& y_2\end{bmatrix},\begin{bmatrix} x_1\\ x_2\end{bmatrix}\right) = \begin{bmatrix} y_1& y_2\end{bmatrix} \begin{bmatrix} a_{11} &a_{21}\\a_{12}&a_{22} \end{bmatrix}\begin{bmatrix} x_1\\ x_2\end{bmatrix} $$

Observe that I can figure out the linear map (i.e. reconstruct the matrix) just by knowing the action of the bilinear map, since $a_{ij} = B(e_j^\top,e_i)$.

This observation motivates the following inverse construction:

Given a bilinear map $B : V^* \times V \to \mathbb{R}$, define a matrix $M$ by $a_{ij} = B(e_j^\top,e_i)$. Since the $e_i$ and $e_j^\top$ span their respective spaces, we see that these values determine the action of $B$, and moreover produce a linear map $L: V \to V$ whose matrix represents the bilinear form.
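This sandwich-and-reconstruct recipe can be checked numerically. The NumPy sketch below is an illustration not in the original answer; it uses the standard convention that entry $(k, l)$ of $M$ is $B(e_k^\top, e_l)$, rather than the transposed labelling used in the $2 \times 2$ example above:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
M = rng.standard_normal((n, n))   # matrix of L in the standard basis

def B(w, v):
    """Bilinear map: sandwich M between a row vector w and a column vector v."""
    return (w @ M @ v).item()

e = np.eye(n)

# Reconstruct the matrix entry-by-entry from the values of B on basis
# (co)vectors: the (k, l) entry is B(e_k^T, e_l).
M_rec = np.array([[B(e[k], e[:, [l]]) for l in range(n)]
                  for k in range(n)])

assert np.allclose(M, M_rec)
```

Since $B$ is determined by its values on basis pairs, the reconstructed matrix agrees with the original, realizing both directions of the correspondence.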

Note that my answer implicitly makes use of the standard inner product on $\mathbb{R}^n$: the inner product allows me to construct the natural isomorphism $V \to V^*$ given by $v \mapsto \langle v, \cdot \rangle$, which is the "row vector" associated with $v$.

Hopefully this makes things seem a bit less abstract!

You should also note that a similar story does NOT play out for higher-order multilinear maps: a trilinear form, for example, cannot be encoded as a single matrix sandwiched between two vectors.

    p.s. I am writing a multivariable calculus course which will cover a lot of these concepts (the second derivative is really a bilinear form, the third derivative is really a trilinear form, etc.). This will debut on Coursera on Pi Day (March 14th). We will develop a lot of linear algebra and multilinear algebra, and apply all of this to understanding multivariable calculus. https://www.coursera.org/course/m2o2c2 – Steven Gubkin Mar 04 '14 at 00:12

Hint:

In slightly more generality, which actually makes it a bit easier to tell what's going on, suppose that $V$ and $W$ are finite dimensional $\Bbb{R}$-vector spaces.

The map $$ \begin{align} V^* \otimes W &\to \hom(V, W) \\ f \otimes w &\mapsto \bigg( v \mapsto f(v)w \bigg) \end{align} $$ is a natural isomorphism. (It doesn't depend on choice of bases.)
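In coordinates this natural isomorphism is just the outer product: if $f$ is a row vector and $w$ a column vector, the matrix of $v \mapsto f(v)w$ is the rank-one matrix $wf$. A quick NumPy check (an illustration, not part of the original hint):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 3, 2                      # dim V = n, dim W = m
f = rng.standard_normal((1, n))  # covector f in V* (row vector)
w = rng.standard_normal((m, 1))  # vector w in W (column vector)
v = rng.standard_normal((n, 1))  # test vector in V

# The map v |-> f(v) w is rank one; its matrix is the outer product w f.
Mat = w @ f                      # shape (m, n)

assert np.allclose(Mat @ v, (f @ v).item() * w)
```

General elements of $V^* \otimes W$ are sums of such pure tensors, which is why every matrix is a sum of rank-one matrices.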

We can precompose with other isomorphisms, but the duality map in the $\color{red}{\text{middle}}$ is not natural so we have to pick a basis and corresponding dual basis. $$ \left( V^* \otimes W \right)^* = \hom(V^* \otimes W, \Bbb{R}) \color{red}{\to} V^* \otimes W \to \hom(V, W) $$

Sammy Black

Tensor product and $\operatorname{Hom}$, the set of linear maps, are adjoint: $$ \operatorname{Hom}(V^*\otimes V,\mathbb{R})\cong \operatorname{Hom}(V,\operatorname{Hom}(V^*,\mathbb{R})). $$ Now, $\operatorname{Hom}(V^*,\mathbb{R})$ is just $V^{**}$. For finite-dimensional $V$, $V^{**}\cong V$ naturally. So the adjoint relationship becomes $$ \operatorname{Hom}(V^*\otimes V,\mathbb{R})\cong \operatorname{Hom}(V,\operatorname{Hom}(V^*,\mathbb{R}))\cong \operatorname{Hom}(V,V). $$
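The currying in this adjunction can be made concrete. In the NumPy sketch below (an illustration under the usual row/column identifications, not part of the original answer), a functional $\varphi$ on $V^*\otimes V$ given by $\varphi(f\otimes v)=f(Tv)$ is curried at a fixed $v$ to a functional on $V^*$, whose values on the dual basis recover the coordinates of $Tv$, hence the map $T \in \operatorname{Hom}(V,V)$:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
T = rng.standard_normal((n, n))      # an element of Hom(V, V)

# The corresponding functional on V* (x) V, defined on pure tensors:
def phi(f, v):
    return (f @ T @ v).item()

# Currying phi at a fixed v gives a functional on V*, i.e. an element
# of V**.  Evaluating it on the dual basis covectors e_k^T reads off
# the coordinates of a vector in V (using V** ~ V for finite dimensions):
v = rng.standard_normal((n, 1))
e = np.eye(n)
coords = np.array([phi(e[k], v) for k in range(n)])

# That vector is exactly T v, recovering T in Hom(V, V):
assert np.allclose(coords, (T @ v).ravel())
```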

J126