I am curious about tensors and tensor notation and how they translate to the common linear algebra I already know. For instance, we can express $AA^\top$ as a sum of outer products like so, where $a_j$ denotes the $j$-th column of $A$:
$$ AA^\top = \sum_j a_ja_j^\top $$
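As a sanity check, here is a minimal NumPy sketch of that identity on a small random matrix (the shapes are arbitrary, chosen just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))  # columns a_1, a_2, a_3

# Left side: the matrix product AA^T.
lhs = A @ A.T

# Right side: sum of outer products a_j a_j^T over the columns.
rhs = sum(np.outer(A[:, j], A[:, j]) for j in range(A.shape[1]))

print(np.allclose(lhs, rhs))  # the two sides agree up to floating point
```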
How would this be expressed in tensor notation? It seems we would need to stack all of the outer products along a new dimension $i$, forming a third-order tensor
$$ B_{ijk} = (a_i \otimes a_i)_{jk}, $$
and then contract with a covector of ones so that the $i$ dimension is summed out:
$$ \mathbb{1}^i B_{ijk} = (AA^\top)_{jk}. $$
Questions
- Is this correct?
- Is there a better way to express this?
- Are there any good books or resources to recommend to go deeper into learning this sort of thing?
Thanks