Can we always have $(r*a^{T})*b = (b^{T}*a)*r$, where $r$, $a$, and $b$ are column vectors for which both of the above products are defined? Note that the length of $r$ may differ from the lengths of $a$ and $b$.
2 Answers
Yes, this is always true... sort of, provided that $a$ and $b$ have the same length, but you must be careful!
In $(b^T * a) * r$, the last multiplication is that of a scalar by a matrix (a vector in this case), and it's not at all the same operation as the one indicated by the first star, which is multiplication of a matrix by a matrix. So, in some sense this is almost a coincidence.
Here's the trick. Matrix multiplication is associative, so
$$(r * a^T) * b = r * (a^T * b).$$
Notice that $a^Tb$ is a $1 \times 1$ matrix. It's tempting to think that it is a scalar, but a $1 \times 1$ matrix and a scalar are, strictly speaking, different objects.
Since $r$ is a column vector, which is an $N \times 1$ matrix, we can multiply it by the $1 \times 1$ matrix obtained from $a^Tb$, and it's perfectly legal. It just so happens that for a column vector (and nothing else, mind you!), multiplying it by a $1 \times 1$ matrix on the right is both legal and equivalent to multiplying it by the scalar which is the single element of that matrix.
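A quick NumPy sketch of this point (the vectors and numbers here are illustrative, not from the question):

```python
import numpy as np

r = np.array([[1.0], [2.0], [3.0]])   # N x 1 column vector (N = 3)
a = np.array([[4.0], [5.0], [6.0]])   # n x 1 column vector (n = 3)
b = np.array([[7.0], [8.0], [9.0]])   # n x 1 column vector

s = a.T @ b              # 1 x 1 matrix: [[a . b]]
lhs = (r @ a.T) @ b      # (N x n)(n x 1) -> N x 1
rhs = r @ s              # (N x 1)(1 x 1) -> N x 1: legal for a column vector
scalar = s[0, 0] * r     # the same result via genuine scalar multiplication

assert np.allclose(lhs, rhs)
assert np.allclose(rhs, scalar)
```

Here right-multiplying the column vector `r` by the $1 \times 1$ matrix `s` is a valid matrix product, and it agrees with scaling `r` by the single entry of `s`.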
Now, to the second expression. Again, if we interpret all $*$ operators as matrix multiplication (as we did for the first expression), this is not even admissible! You would be multiplying a $1 \times 1$ matrix by an $N \times 1$ matrix, and this operation is undefined, so we may as well stop here. However, if you treat $a^T * b$ as a scalar (which most software like MATLAB would do automatically), then multiplying it by $r$ is equivalent to what happened above.
So, the theoretical answer is that the second expression is invalid. But the more sloppy practical answer is that they indeed will always be equivalent, provided that you treat $a^T * b$ as a scalar in the second expression.
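Both halves of this conclusion can be checked in NumPy (again with illustrative vectors): the strict matrix product fails, while the scalar interpretation recovers the first expression.

```python
import numpy as np

r = np.array([[1.0], [2.0], [3.0]])   # N x 1
a = np.array([[4.0], [5.0], [6.0]])   # n x 1
b = np.array([[7.0], [8.0], [9.0]])   # n x 1

s = b.T @ a                           # 1 x 1 matrix

# As a strict matrix product, (1 x 1)(N x 1) is undefined:
try:
    s @ r
    strict_product_defined = True
except ValueError:
    strict_product_defined = False

# Treating b^T b as a scalar instead recovers the first expression:
ok = s.item() * r
assert not strict_product_defined
assert np.allclose(ok, (r @ a.T) @ b)
```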

Let $$ \vec{r} = \begin{bmatrix} r_1 \\ r_2 \\ \vdots \\ r_n \end{bmatrix}, \; \vec{a} = \begin{bmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix}, \; \vec{b} = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix} $$ then we have: $$ \begin{eqnarray*} \left( \vec{r} \cdot \vec{a}^T \right) \cdot \vec{b} & = & \left( \begin{bmatrix} r_1 \\ r_2 \\ \vdots \\ r_n \end{bmatrix} \cdot \begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix} \right) \cdot \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix} \\ & = & \begin{bmatrix} r_1a_1 & r_1a_2 & \cdots & r_1a_n \\ r_2a_1 & r_2a_2 & \cdots & r_2a_n \\ \vdots & \vdots & \ddots & \vdots \\ r_na_1 & r_na_2 & \cdots & r_na_n \end{bmatrix} \cdot \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{bmatrix} \\ & = & \begin{bmatrix} r_1a_1b_1 + r_1a_2b_2 + \cdots + r_1a_nb_n \\ r_2a_1b_1 + r_2a_2b_2 + \cdots + r_2a_nb_n \\ \vdots \\ r_na_1b_1 + r_na_2b_2 + \cdots + r_na_nb_n \end{bmatrix} \end{eqnarray*} $$ whereas $$ \begin{eqnarray*} (\vec{b}^T \cdot \vec{a}) \cdot \vec{r} & = & ( b_1a_1 + b_2a_2 + \cdots + b_na_n ) \cdot \begin{bmatrix} r_1 \\ r_2 \\ \vdots \\ r_n \end{bmatrix} \\ & = & \begin{bmatrix} r_1a_1b_1 + r_1a_2b_2 + \cdots + r_1a_nb_n \\ r_2a_1b_1 + r_2a_2b_2 + \cdots + r_2a_nb_n \\ \vdots \\ r_na_1b_1 + r_na_2b_2 + \cdots + r_na_nb_n \end{bmatrix} \end{eqnarray*} $$ So naturally the two sides are equal. As @Phonon mentioned above, this follows from combining the fact that for column vectors $a, b$ we have $a^Tb = b^Ta$ with the associativity of matrix multiplication. However, one must treat $a^Tb$ as a scalar: rather than a $1\times1$ matrix, we regard it as a scalar number and then use scalar multiplication.
So, using a simple symbolic argument, we can see that $$ (r * a^T) * b = r * (a^T * b) = r * (b^T * a) = (b^T * a) * r. $$ If you were instead to consider $(a^T * b)$ to be a $1\times1$ matrix, we would not have your identity, but we would have $$ (r * a^T) * b = r * (b^T * a). $$
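The identity also holds when $r$ has a different length from $a$ and $b$, as the question allows. A small NumPy sketch with arbitrary sizes (chosen here for illustration):

```python
import numpy as np

# len(r) != len(a) == len(b); the sizes are arbitrary.
rng = np.random.default_rng(0)
m, n = 5, 3
r = rng.standard_normal((m, 1))   # m x 1
a = rng.standard_normal((n, 1))   # n x 1
b = rng.standard_normal((n, 1))   # n x 1

lhs = (r @ a.T) @ b               # (m x n)(n x 1) -> m x 1
rhs = (b.T @ a).item() * r        # scalar b^T a times the column vector r

assert np.allclose(lhs, rhs)
```

Only the inner dimensions must match: $r a^T$ is $m \times n$, so multiplying by the $n \times 1$ vector $b$ is defined regardless of $m$.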
