
We can motivate the cross product by considering a 3D vector perpendicular to two others. This gives two equations in three unknowns, i.e. a line of solutions, and...

$\lambda(u_2 v_3 - u_3 v_2,\ u_3 v_1 - u_1 v_3,\ u_1 v_2 - u_2 v_1)$

...emerges naturally.
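As a sanity check on that derivation, here is a small sketch using sympy (assuming the generic case $u_1 v_2 - u_2 v_1 \neq 0$) that solves the two perpendicularity equations and recovers a line of solutions proportional to the cross product:

```python
import sympy as sp

u1, u2, u3, v1, v2, v3, w1, w2, w3 = sp.symbols('u1 u2 u3 v1 v2 v3 w1 w2 w3')

# w must be perpendicular to both u and v: two equations, three unknowns.
eqs = [u1*w1 + u2*w2 + u3*w3,
       v1*w1 + v2*w2 + v3*w3]

# Solve for w1, w2 with w3 left as the free parameter (the lambda above).
sol = sp.solve(eqs, [w1, w2], dict=True)[0]
print(sp.simplify(sol[w1]))  # w3*(u2*v3 - u3*v2)/(u1*v2 - u2*v1), up to rearrangement
print(sp.simplify(sol[w2]))  # w3*(u3*v1 - u1*v3)/(u1*v2 - u2*v1), up to rearrangement

# Choosing w3 = lambda*(u1*v2 - u2*v1) clears the denominators and
# yields exactly the cross-product components.
```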

An analogous consideration for the dot product would be the projection of one vector onto another: given unit vectors $\mathbf{\hat{u}}, \mathbf{\hat{v}}$ with angle $\theta$ between them, we seek a coordinate representation for $\cos(\theta)\,\mathbf{\hat{u}}$ (the projection of $\mathbf{\hat{v}}$ onto $\mathbf{\hat{u}}$).

However, I can't see an analogous method that plops out $u_1 v_1 + \cdots + u_k v_k$.

Every method I have seen involves conjuring that expression out of the blue, observing it is a linear operator, etc.

That doesn't seem satisfying to me.

Is there some way to summon $\mathbf{\hat{u}} \cdot \mathbf{\hat{v}}$ into existence without conjuring it out of thin air and post-justifying the construction?

PS No dot-product tag?!?

EDIT: I found a solution here -- bizarrely downvoted to -1 while the (poorly phrased) question stands at +28(!)

P i
  • 2,136
  • Geometrically, the magnitude of the cross product also gives the area of the parallelogram spanned by the two vectors, and the triple product gives the (signed) volume of the spanned parallelepiped. The usual definition of the cross product is the one which makes the latter work. – dxiv Jul 10 '16 at 22:06

4 Answers

8

The dot product is "the one that gives the length of vectors".

Suppose that we want the (squared) length of a vector. The Pythagoras theorem in 3 dimensions gives: $$ |{\bf{v}}|^2 = {v_1}^2 + {v_2}^2 + {v_3}^2\;. $$

If you admit that this is some kind of product of $\bf{v}$ with itself, then you get the scalar product (or the Clifford product). If instead you are not yet completely convinced, look then at: $$ |{\bf{v}+\bf{w}}|^2 = |{\bf{v}}|^2 + 2\,(v_1w_1+v_2w_2+v_3w_3) + |{\bf{w}}|^2\;. $$ This looks like the identity $(x+y)^2=x^2+2xy+y^2$, and the mixed term in the middle corresponds exactly to twice the scalar product.
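A quick numerical sanity check of that expansion, as a minimal numpy sketch with random vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
v, w = rng.standard_normal(3), rng.standard_normal(3)

lhs = np.linalg.norm(v + w)**2
rhs = np.linalg.norm(v)**2 + 2*np.dot(v, w) + np.linalg.norm(w)**2
print(np.isclose(lhs, rhs))  # True: the cross term is exactly twice the dot product
```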

By the way, there is a tag "inner product space".

geodude
  • 8,065
2

The dot product is a special case of the more general inner product. Generally speaking, if we have a scalar field $\mathbb{F}$ and a vector space $V$ over $\mathbb{F}$, an inner product is a mapping that assigns to each pair of vectors from $V$ a scalar from $\mathbb{F}$; it is denoted $\langle .,.\rangle$, where you replace the dots with vectors. More concisely, we write $$\langle .,. \rangle: V\times V \rightarrow \mathbb{F}.$$ Here $\mathbb{F}$ is either $\mathbb{R}$, the field of real numbers, or $\mathbb{C}$, the field of complex numbers.

It has some specific properties, which I will list. Let $\mathbb{F}=\mathbb{R}$ and $V$ be a vector space over $\mathbb{R}$. Let $u, u_1, u_2, v, v_1, v_2 \in V$ and $\alpha_1,\alpha_2,\beta_1,\beta_2 \in \mathbb{R}$.

  1. Bilinearity (linearity in each argument): $\langle \alpha_1 u_1+ \alpha_2 u_2 ,v_1 \rangle=\alpha_1 \langle u_1,v_1 \rangle + \alpha_2 \langle u_2,v_1 \rangle$ and $\langle u_1,\beta_1v_1+\beta_2v_2 \rangle=\beta_1\langle u_1,v_1 \rangle + \beta_2 \langle u_1,v_2 \rangle.$
  2. Symmetry: $\langle u,v \rangle= \langle v,u \rangle$
  3. Positive definiteness: $\langle u,u \rangle \geq 0$ and $\langle u,u \rangle=0$ iff $u=0$

Show that for the vector space $\mathbb{R}^n$ over the field $\mathbb{R}$, for $v=(v_1,...,v_n), w=(w_1,...,w_n) \in \mathbb{R}^n$, $\langle v,w \rangle= \sum_{i=1}^{n}v_iw_i$ is a valid inner product. (This is the usual dot product).
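For a concrete (non-proof) check of those three properties, here is a short numpy sketch that tests them on random vectors in $\mathbb{R}^5$:

```python
import numpy as np

rng = np.random.default_rng(1)
u, v, w = rng.standard_normal((3, 5))  # three random vectors in R^5
a, b = 2.0, -3.0

inner = lambda x, y: np.sum(x * y)     # <x, y> = sum_i x_i * y_i

print(np.isclose(inner(a*u + b*v, w), a*inner(u, w) + b*inner(v, w)))  # linearity
print(np.isclose(inner(u, v), inner(v, u)))                            # symmetry
print(inner(u, u) >= 0 and np.isclose(inner(np.zeros(5), np.zeros(5)), 0))  # positive definiteness
```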

If we let $\mathbb{F}=\mathbb{C}$, things change a little bit. Let $u, u_1, u_2, v, v_1, v_2 \in V$ and $\alpha_1,\alpha_2,\beta_1,\beta_2 \in \mathbb{C}$.

  1. Sesquilinearity (linear in the first argument, conjugate-linear in the second): $\langle \alpha_1 u_1+ \alpha_2 u_2 ,v_1 \rangle=\alpha_1 \langle u_1,v_1 \rangle + \alpha_2 \langle u_2,v_1 \rangle$ and $\langle u_1,\beta_1v_1+\beta_2v_2 \rangle=\overline{\beta_1}\langle u_1,v_1 \rangle + \overline{\beta_2} \langle u_1,v_2 \rangle.$
  2. Conjugate Symmetry: $\langle u,v \rangle= \overline{\langle v,u \rangle}$
  3. Positive definiteness: $\langle u,u \rangle \geq 0$ and $\langle u,u \rangle=0$ iff $u=0$

Here $\overline{\beta}$ means you take the complex conjugate of $\beta$.

Show that for the vector space $\mathbb{C}^n$ over the field $\mathbb{C}$, for $v=(v_1,\dots,v_n), w=(w_1,\dots,w_n) \in \mathbb{C}^n$, $\langle v,w \rangle= \sum_{i=1}^{n}v_i\overline{w_i}$ is a valid inner product.
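Again as a non-proof sanity check, a numpy sketch of the complex case; note that numpy's built-in `np.vdot` conjugates its *first* argument, so it matches this convention as `np.vdot(w, v)`:

```python
import numpy as np

rng = np.random.default_rng(2)
v = rng.standard_normal(4) + 1j * rng.standard_normal(4)
w = rng.standard_normal(4) + 1j * rng.standard_normal(4)

inner = lambda x, y: np.sum(x * np.conj(y))   # <x, y> = sum_i x_i * conj(y_i)

print(np.isclose(inner(v, w), np.conj(inner(w, v))))             # conjugate symmetry
print(np.isclose(inner(v, w), np.vdot(w, v)))                    # matches numpy's vdot convention
print(np.isclose(inner(v, v).imag, 0) and inner(v, v).real > 0)  # <v, v> is real and positive
```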

Now the unanswered question is why we bother with this general definition of an inner product. The answer lies in the fact that we want to define what concepts like "orthogonality" and "projection" mean in vector spaces other than $\mathbb{R}^n$. One application of the inner product is that you can obtain the Fourier series of $f(x)$, which approximates $f(x)$ by sine and cosine waves. For more on that, check out my answer to this question: Origin of coefficients of fourier series?
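To make the Fourier connection concrete, here is a minimal sketch (my own illustration, not from the linked answer) that computes a Fourier coefficient as a ratio of inner products, with the integral approximated by a Riemann sum on a grid:

```python
import numpy as np

# Inner product on functions over [-pi, pi]: <f, g> = integral of f(x) g(x) dx,
# approximated by a Riemann sum on a fine grid.
x = np.linspace(-np.pi, np.pi, 100001)
dx = x[1] - x[0]
inner = lambda f, g: np.sum(f(x) * g(x)) * dx

f = lambda t: np.sign(t)                       # an odd step function (square wave)
b1 = inner(f, np.sin) / inner(np.sin, np.sin)  # coefficient of sin(x): <f, sin>/<sin, sin>
print(b1)                                      # ~ 4/pi = 1.2732..., the known first coefficient
```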

1

From "projection of one vector onto another", when you admit that it is distributive vs. the sum (projection of the sum = sum of the projections), then it does plop out $$ u \cdot v = \left( {u_1 + \cdots + u_k } \right) \cdot v = u_1 \cdot v + \cdots + u_k \cdot v = u_1 \cdot v_1 + \cdots + u_k \cdot v_k $$

G Cab
  • 35,272
0

I suspect that the original motivation for the dot product came, in the days before computers, as a quick way to test whether two vectors are orthogonal in higher dimensions. The motivation for summing the products might have come from extending the 2D picture to higher dimensions. For example, if you diagram $a = (1,1)$ and $b = (1,-1)$ in Cartesian coordinates, you will see the vectors are clearly perpendicular, and $a \cdot b = (1 \cdot 1) + (1 \cdot (-1)) = 1 - 1 = 0$. Generalizing: $a \cdot b = \sum_{i=1}^{n} a_{i}b_{i} = 0$ implies the vectors are orthogonal.
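As a footnote to this answer, the orthogonality test it describes is a one-liner in any dimension; a minimal numpy sketch with the example above plus a 4D pair:

```python
import numpy as np

a, b = np.array([1.0, 1.0]), np.array([1.0, -1.0])
print(np.dot(a, b) == 0)  # True: the 2D example above

# The same test works unchanged in higher dimensions:
c, d = np.array([1.0, 2.0, 0.0, -1.0]), np.array([2.0, 0.0, 5.0, 2.0])
print(np.dot(c, d) == 0)  # True: 2 + 0 + 0 - 2 = 0
```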