I am not sure I'm asking the question correctly, but it's bugging me too much to let it go.
In this answer to a prior post of mine, I got stuck on what initially seemed to be a dot product of vectors, but turned out to be a different operation.
My knowledge of linear algebra leads me to understand a covariance matrix (a prototypical case of a symmetric positive semidefinite matrix, or of the $\bf A^\top A$ form in my prior post) as follows: take a long list of returns for different stock ticker symbols over many days (as an example). Each symbol's returns form a vector which, after being scaled, is dotted with itself and with the return vectors of the other stocks in the portfolio; the results are arranged in a symmetric matrix with the variances on the diagonal and the covariances off the diagonal.
Now, the operation in the answer refers to block matrices, with each element $a_i$ corresponding to a row vector of norm $1$, such that
\begin{align} A^\top A & = \begin{bmatrix} \vdots & \vdots & \vdots & \cdots & \vdots \\ a_1^\top & a_2^\top & a_3^\top & \cdots & a_n^\top\\ \vdots & \vdots & \vdots & \cdots & \vdots\end{bmatrix} \begin{bmatrix} \cdots & a_1 & \cdots\\ \cdots & a_2 & \cdots \\ \cdots & a_3 & \cdots \\ & \vdots&\\ \cdots & a_n & \cdots \end{bmatrix}\\ &= a_1^\top a_1 + a_2^\top a_2 + a_3^\top a_3 + \cdots+a_n^\top a_n. \end{align}
I don't understand how this block-matrix product works out to the sum of outer products of vectors with matching indices.
I suspect that a worked example (much like stepping through a loop while debugging) would clarify the overall structure of the resulting matrix and the operations involved. Alternatively, even the name of the operation, ideally with a link or reference, would help.
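For concreteness, this is the kind of numerical check I have in mind (a NumPy sketch with a made-up $3 \times 2$ matrix whose rows play the role of the $a_i$; the specific values are arbitrary):

```python
import numpy as np

# A made-up matrix A whose rows are the (row) vectors a_1, a_2, a_3.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Direct computation: A^T A is a 2x2 matrix.
direct = A.T @ A

# Loop over the rows, accumulating the outer products a_i^T a_i.
outer_sum = np.zeros((2, 2))
for row in A:
    outer_sum += np.outer(row, row)

print(direct)
print(outer_sum)
print(np.allclose(direct, outer_sum))  # True
```

Both computations give the same $2 \times 2$ matrix, which seems to confirm the identity numerically, but I still don't see the general argument.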
Please note that the author explicitly clarifies in the comments, "I'm no longer saying they are orthogonal [the $a_i$ column vectors], only that the projections are orthogonal."