12

Can we invert a vector like we do with matrices, and why?

I never saw the concept of a "vector inverse" in any linear algebra course, and I was wondering whether there is such a thing, and if not, why.

Dorgham
  • To ask about an inverse you first need to tell us what operation you have in mind that you want to find the inverse to. – rschwieb Jan 24 '15 at 02:01
  • If you are looking for a multiplicative inverse, you should start thinking about vector multiplication and what the multiplicative identity is (or would be). – The Chaz 2.0 Jan 24 '15 at 02:01
  • Vectors can do linear transformation, right? Then why can't we inverse that transformation just like we inverse any function? – Dorgham Jan 24 '15 at 02:07
  • @MohammadDorgham If you are referring to the linear transformation induced by the scalar product, that transformation is not injective, so it can't be inverted. – Ian Jan 24 '15 at 02:13
  • But we do dot product with matrices too however we are able to inverse them. And what does injective mean – Dorgham Jan 24 '15 at 02:16
  • @MohammadDorgham, what do you mean by matrix's dot product inverses? – janmarqz Jan 24 '15 at 02:22
  • I mean matrix multiplication is like dot product – Dorgham Jan 24 '15 at 02:25
  • Well, any nonzero column vector has a "left inverse" (a vector giving $b^Ta=1$ for nonzero $a$) if that's what you mean. – Algebraic Pavel Jan 24 '15 at 02:28
  • Ok, then maybe you should consider the vector space plus another operation... this kind of algebraic structure is called an algebra; there, some vectors have inverses but some do not, for example square matrices with determinant equal to zero – janmarqz Jan 24 '15 at 02:29
  • So, what do you regard as the "vector identity element", what is the binary operation on pairs of vectors, and why do you think a given vector might have an inverse with respect to this operation? – MPW Jan 24 '15 at 02:57

4 Answers

9

The inverse of an object $a$ under some binary operation $@ : \mathbb S \times \mathbb S \to \mathbb S$ with identity $e$ is the unique object $a^{-1}$ such that $a\ @\ a^{-1} = a^{-1}\ @\ a= e$. $e$ itself must be such that given any object $b$, $b\ @\ e=e\ @\ b=b$.

Vector addition has an obvious inverse: since adding vectors is just adding their components in whatever basis you like, the additive inverse of $v$ negates those components. But that's simply $-v$.
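A quick numpy sketch of the additive inverse (the example values are mine, not from the thread):

```python
import numpy as np

v = np.array([3.0, -1.0, 2.0])
# The additive inverse is just componentwise negation: v + (-v) = 0.
assert np.allclose(v + (-v), np.zeros(3))
```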

Scalar multiplication has no sensible vector inverse, because the two inputs necessarily come from different sets: a scalar and a vector.

Dot product doesn't provide inverse elements, because the result of a dot product isn't a vector. While it is possible to find a vector $b$ such that $a\cdot b=1$, there are infinitely many such vectors: for every $c$ with $a\cdot c=0$, the vector $b+c$ works just as well.
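The non-uniqueness is easy to demonstrate numerically; here is a small numpy sketch with made-up values:

```python
import numpy as np

a = np.array([2.0, 0.0, 0.0])
b = np.array([0.5, 0.0, 0.0])        # a . b = 1
c = np.array([0.0, 7.0, -3.0])       # a . c = 0; any such c works
assert np.isclose(a @ b, 1.0)
assert np.isclose(a @ (b + c), 1.0)  # b + c is a different "inverse" of a
```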

Cross product doesn't provide inverse elements either: for any non-zero vector $a$, $a\times b = c$ has many possible values for $b$ that give the same $c$.
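The same ambiguity can be checked directly: adding any multiple of $a$ to $b$ leaves $a\times b$ unchanged. A numpy sketch (example values are mine):

```python
import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
t = 5.0
# a x (b + t*a) = a x b + t*(a x a) = a x b, since a x a = 0.
assert np.allclose(np.cross(a, b), np.cross(a, b + t * a))
```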

Linear transformation doesn't provide inverses for the vector: again, the inputs necessarily come from two different sets, a vector and a matrix.

Componentwise multiplication $(a,b,c)(d,e,f)=(ad,be,cf)$, while it does provide an inverse for many values (reciprocal of each of the components), has huge hunks of non-invertible values (anywhere one of the components is $0$), and the inverse changes based on your choice of basis.
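A numpy sketch of the componentwise inverse (values chosen by me; note it fails as soon as any component is zero):

```python
import numpy as np

u = np.array([2.0, -4.0, 0.5])
u_inv = 1.0 / u                      # reciprocal of each component
assert np.allclose(u * u_inv, np.ones(3))
# A vector with a zero component, e.g. [2, 0, 0.5], has no such inverse.
```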

Wedge product produces a result in a different group: the wedge of an $m$-vector $u$ and an $n$-vector $v$ is an $(m+n)$-vector.

Outer product results in a matrix, which is in a different group.
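As the answerer notes in a comment below, the outer product of two (nonzero) vectors always has rank $1$, which is one way to see it can never produce the $n\times n$ identity for $n>1$. A numpy check with example values of my choosing:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
M = np.outer(a, b)
# Every column of M is a scalar multiple of a, so rank(M) = 1,
# while the 3x3 identity has rank 3.
assert np.linalg.matrix_rank(M) == 1
```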

Tensor product results in a tensor, which is, you guessed it, a different group.

Clifford product: this one's interesting. It works like the cross and dot products combined, so it looks like it will run into the same problems, but it's actually better than both: $vv$ gives just a scalar part (because the dot-product part is nonzero and the wedge part is zero), and if you divide by that to get $v^{-1} = \frac{v}{v\cdot v}$, you get a vector in the same direction as $v$ whose magnitude is the reciprocal, so $v^{-1}v = 1$. Since we're in a land where scalars and vectors combine, this works perfectly. Also, unlike the infinite freedom the cross and dot products gave us, there is no ambiguity: any adjustment to the vector changes the vector part of the product. I don't know Clifford algebra terribly well, though, so I don't know how well this works on things that are already scalar and vector combined...
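For a plain vector the Clifford inverse reduces to ordinary numpy operations, since $v\cdot v$ is just a scalar; a minimal sketch (example vector is mine):

```python
import numpy as np

v = np.array([3.0, 4.0])
v_inv = v / (v @ v)                  # Clifford inverse: v / (v . v)
assert np.isclose(v_inv @ v, 1.0)    # acts as a multiplicative inverse
# Same direction as v, reciprocal magnitude:
assert np.isclose(np.linalg.norm(v_inv), 1.0 / np.linalg.norm(v))
```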

Those are the ones I can think of. I'm sure others know more operations than I do.

Dan Uznanski
  • 11,025
  • How about an $n\times 1$ column vector $a$ and a $1\times n$ row vector $a^{-1}$ such that $a@a^{-1} = I$ where $I$ is an $n\times n$ identity matrix? How do we find such $a^{-1}$? – Confounded Oct 11 '16 at 15:33
  • If you intend "Multiplying these two vectors together will give the identity matrix", this is impossible for $n>1$; no matter what vectors you pick, your matrix will have rank $1$ (or I guess $0$ if you pass in the zero vector) because each column is a scalar multiple of the column vector, and the identity matrix has rank $n$. $1$-vectors act kinda like numbers in this sense so you'd get for example $[[5]]\times[[1/5]]=[[1]]$ – Dan Uznanski Oct 11 '16 at 20:21
  • Another operation is the geometric product (which is related to complex numbers and quaternions). See https://math.stackexchange.com/q/1270493/472818 – mr_e_man May 04 '22 at 18:07
  • There are rather a few versions of the geometric product apparently! They're all bad news for us though. I added them. – Dan Uznanski May 04 '22 at 19:53
  • The Clifford/geometric product is invertible, though. The inverse of a vector $v$ is $v/(v\cdot v)$, which is a vector. – mr_e_man May 04 '22 at 21:33
5

In some applications (e.g. extrapolation methods), one often considers what is called the Samelson inverse of a vector:

$$\mathbf v^{(-1)}=\frac{\bar{\mathbf v}}{\bar{\mathbf v}\cdot\mathbf v}=\frac{\bar{\mathbf v}}{\|\mathbf v\|^2}$$

(where the bar denotes complex conjugation), which can be easily shown to satisfy $\mathbf v^{(-1)}\cdot\mathbf v=\mathbf v\cdot\mathbf v^{(-1)}=1$ (the Moore-Penrose conditions in vector garb). (As usual, $\mathbf v\neq\mathbf 0$.)
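The property $\mathbf v^{(-1)}\cdot\mathbf v=1$ is easy to verify numerically; a numpy sketch with a made-up complex vector (note `np.dot` does not conjugate, matching the bilinear dot product used here):

```python
import numpy as np

v = np.array([1.0 + 2.0j, 3.0 - 1.0j])
# Samelson inverse: conjugate of v divided by the squared norm.
v_inv = np.conj(v) / np.linalg.norm(v) ** 2
assert np.isclose(np.dot(v_inv, v), 1.0)
assert np.isclose(np.dot(v, v_inv), 1.0)
```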

Some references that use this inverse include this, this, and this.

3

It depends on what you mean by "inverse". The inverse with respect to the addition operation is just the negative of the vector, $-v$, since trivially $v + (-v) = 0$.

Loreno Heer
  • No I'm not talking about the inverse of the addition. Vectors can do linear transformation, right? Then why can't we inverse that transformation like we inverse any function? – Dorgham Jan 24 '15 at 02:06
  • What is an example of "vectors doing linear transformation"? – GEdgar Jan 24 '15 at 02:42
0

Perhaps this definition of the inverse vector will help you:

An inverse rectilinear vector $\bar{a}'$ is a vector which is co-directed with (in the same direction as) a vector $\bar{a}$ and differs from it in magnitude according to: $$ |\bar{a}'|=\dfrac{1}{|\bar{a}|} $$ The projections of inverse rectilinear vectors onto the coordinate axes are given by: $$\quad a'_{x}=\dfrac{a_{x}}{a_{x}^{2}+a_{y}^{2}+a_{z}^{2}};\quad a'_{y}=\dfrac{a_{y}}{a_{x}^{2}+a_{y}^{2}+a_{z}^{2}};\quad a'_{z}=\dfrac{a_{z}}{a_{x}^{2}+a_{y}^{2}+a_{z}^{2}} $$ Examples of solving problems with this vector can be found at https://en.wikipedia.org/wiki/Talk:Cross_product#Cross_product_does_not_exist and https://doi.org/10.5539/jmr.v9n5p71

With this definition, inverse vectors support the same operations as conventional vectors (addition, multiplication by a number, dot product, cross product). The dot product and the cross product then allow a form of vector division (an analogue of arithmetic division), so a vector can be moved across the equals sign as in arithmetic. These operations allow solving problems that previously could not be solved, for example finding the force from the torque in coordinate-vector form.
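The component formulas above are equivalent to dividing $\bar a$ by its squared length; a numpy check with an example vector of my choosing:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
a_prime = a / (a @ a)                # a' = a / |a|^2, per the component formulas
# Same direction, reciprocal magnitude:
assert np.isclose(np.linalg.norm(a_prime), 1.0 / np.linalg.norm(a))
# And the dot product with the original vector is 1:
assert np.isclose(a_prime @ a, 1.0)
```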