I'm trying to rewrite a projection $P_{\vec{v}}(\vec{w})$ expressed with dot products, $\frac{\vec{v} \cdot \vec{w}}{\vec{v} \cdot \vec{v}}\vec{v}$, as one expressed with a projection matrix, $\frac{\vec{v}\vec{v}^T}{\vec{v}^T\vec{v}}\vec{w}$, without using the constituent elements as an intermediate step, reasoning only in the abstract (an exercise posed in a YouTube video about linear algebra).
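As a quick sanity check (my own numerical illustration in numpy, with arbitrary vectors; not part of the exercise itself), the two forms do agree:

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])   # arbitrary nonzero v
w = np.array([4.0, -1.0, 0.5])  # arbitrary w to project onto v

# Dot-product form: (v·w / v·v) v
p_dot = (np.dot(v, w) / np.dot(v, v)) * v

# Projection-matrix form: (v v^T / v^T v) w
P = np.outer(v, v) / np.dot(v, v)
p_mat = P @ w

print(np.allclose(p_dot, p_mat))  # True
```

The question is how to get from one form to the other purely by symbolic manipulation.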

It's clear to me that since the denominator is just the dot product written as a matrix multiplication, $\vec{v} \cdot \vec{v} = \vec{v}^T\vec{v}$, we can extract the denominators as a scalar $\lambda = \frac{1}{\vec{v} \cdot \vec{v}} = \frac{1}{\vec{v}^T\vec{v}}$, yielding $\lambda(\vec{v} \cdot \vec{w})\vec{v}$ and $\lambda(\vec{v}\vec{v}^T)\vec{w}$.

Next, I'm tempted to rewrite this as $\lambda\vec{v}(\vec{v}^T\vec{w})$ by associativity of matrix multiplication. However, by the definition of the dot product, that results in $\lambda\vec{v}(\vec{v} \cdot \vec{w})$, a scalar times a vector times a scalar. That seems awkward to me, as scaling a vector is usually defined only from the left, as in $\lambda\vec{v}$. On the other hand, matrix multiplication is not commutative, so it's not like I can just change the order willy-nilly. I'd like to just call it a day and rewrite it as $\lambda(\vec{v} \cdot \vec{w})\vec{v}$, but is this, formally speaking, nonsense? Is there some trick I'm missing that allows me to rewrite this while abiding by the rules?

  • Associativity is typically how you prove this, so you have the right idea. Note that scalars commute, so you can move it to the other side after moving the brackets. – CyclotomicField Feb 08 '24 at 13:20
  • If $\vec{a}$ is a column vector, it's also an $n\times 1$ matrix and you can multiply it by a $1\times 1$ matrix $(b)$ where $b$ is scalar and show that $\vec{a}\cdot (b)=b\vec{a}$. – Chad K Feb 08 '24 at 13:24
  • @ChadK By the very definition of a vector space, any vector can be multiplied by a scalar to give a vector. This has nothing to do with matrix multiplication, and it would be absurd to allow the scalar only on the left. – Kurt G. Feb 08 '24 at 14:29
  • @KurtG. - ChadK is correct. What needs to be shown here is that scalar multiplication (on either side) by $b$ is the same as matrix multiplication on the right by $[b]$. – Paul Sinclair Feb 09 '24 at 18:24
  • @PaulSinclair I think I missed the context: matrix multiplication and associativity. Then I agree. In that comment I had in mind abstract vectors and scalars from $\mathbb R$ or $\mathbb C$ or any other field. – Kurt G. Feb 09 '24 at 19:42

1 Answer

ChadK is right.

This statement that you made is - in the strictest sense - incorrect: $$\vec v\cdot \vec v = \vec v^T\vec v$$ The problem? $\vec v\cdot \vec v$ is a scalar, while $\vec v^T\vec v$ is a $1 \times 1$ matrix. Of course, there is an obvious natural identification between a $1\times 1$ matrix $\begin{bmatrix}b\end{bmatrix}$ and the scalar $b$. This is so natural we tend to forget that the two concepts are different.
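To see the distinction concretely, here is a small numpy sketch (my own illustration, not from the original argument); numpy keeps the two notions separate:

```python
import numpy as np

v = np.array([[1.0], [2.0], [3.0]])  # a 3x1 column vector

dot = np.dot(v.ravel(), v.ravel())   # the scalar v·v
mat = v.T @ v                        # the 1x1 matrix [v·v]

print(dot)        # 14.0 -- a plain number
print(mat.shape)  # (1, 1) -- a matrix, not a scalar
```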

Where this becomes important in your calculation is here: $$(\vec v \cdot \vec w)\vec v = (\vec v^T\vec w)\vec v$$ If we consider $\vec v^T\vec w$ to be a $1 \times 1$ matrix, this multiplication of $\vec v$ on the left makes no sense. If $\vec v$ is an $n \times 1$ column vector, then you can only multiply it on the left by a matrix with $n$ columns. If $n \ne 1$, then you cannot multiply it on the left by a $1\times 1$ matrix. The right side of the equation is only sensible if you think of $\vec v^T\vec w$ as a scalar instead of a matrix.
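numpy enforces exactly this dimension rule, so it makes the failure visible (again my own sketch, not part of the original answer):

```python
import numpy as np

v = np.array([[1.0], [2.0], [3.0]])   # 3x1 column vector
w = np.array([[4.0], [-1.0], [0.5]])  # 3x1 column vector

s = v.T @ w   # the 1x1 matrix [v·w], shape (1, 1)

p = v @ s     # fine: (3,1) @ (1,1) -> (3,1), i.e. v (v^T w)
# p = s @ v   # ValueError: (1,1) @ (3,1) is not a valid matrix product
```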

On the other hand, your desired goal of $\vec v(\vec v^T\vec w)$ does make sense as matrix multiplication. And since the next step is to apply associativity of matrix multiplication, that is exactly how you want to think of it.
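Written out, the whole abstract chain is then (with $\lambda = \frac{1}{\vec v^T\vec v}$ as in the question): $$\frac{\vec v\cdot\vec w}{\vec v\cdot\vec v}\vec v = \lambda(\vec v\cdot\vec w)\vec v = \lambda\,\vec v\begin{bmatrix}\vec v^T\vec w\end{bmatrix} = \lambda(\vec v\vec v^T)\vec w = \frac{\vec v\vec v^T}{\vec v^T\vec v}\vec w,$$ where the second equality is the lemma discussed next, and the third is associativity of matrix multiplication.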

So the thing you need to show is that scalar multiplication of a column vector $\vec v$ by $b$ gives the same result as multiplication of $\vec v$ on the right by $\begin{bmatrix}b\end{bmatrix}$. This is not shown by simply writing the scalar multiplication with the scalar on the right instead of the left. To prove it properly, you need to show that scalar multiplication of column vectors is the same as matrix multiplication on the right by the associated $1\times 1$ matrix.
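One way to finish is a direct component check of that lemma (this does use entries, but only to establish the basic scalar/matrix identification, not the projection identity itself): for an $n\times 1$ column vector $\vec a$ and a scalar $b$, $$\left(\vec a\begin{bmatrix}b\end{bmatrix}\right)_{i} = a_{i}\,b = b\,a_{i} = (b\,\vec a)_{i}, \qquad i = 1,\dots,n,$$ so $\vec a\begin{bmatrix}b\end{bmatrix} = b\,\vec a$, which is exactly the identification needed in the chain above.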

Paul Sinclair