
Context: I am working with polytopes and looking for a general way of computing the normal, tangent, bi-normal, tri-normal, etc., to any $n$-polytope. I am well aware that a cross product of linearly independent vectors taken from the surfaces of the $n$-polytope could yield this, but I was wondering if it's possible using Gram-Schmidt (preferably, since the cross product does not generalize cleanly to $\mathbb R^n$).

$GS(\ldots)$ = Gram-Schmidt operator

Here, for example, from this 2-polytope I would love to compute a normal ($\vec{n}$), a tangent ($\vec{s}$) and a bi-normal ($\vec{t}$). I know that $GS(\vec{cb},\vec{ab})$ can give $\vec{s}$ and $\vec{t}$, but I can't think of any way to get $\vec{n}$ other than $\vec{s} \times \vec{t}$.

[figure: the 2-polytope $abcd$]

Here, for example, from this 3-polytope I would love to compute a normal ($\vec{n}$), a tangent ($\vec{s}$), a bi-normal ($\vec{t}$) and a tri-normal ($\vec{u}$). I think $GS(\vec{cb},\vec{ab},\vec{bf})$ could give $\vec{s}$, $\vec{t}$ and $\vec{u}$, but I can't think of any way to get $\vec{n}$ other than a generalized cross product of $\vec{s}$, $\vec{t}$ and $\vec{u}$.

[figure: the 3-polytope $abcdefgh$]

Is it possible to obtain an $(n+1)$-vector orthonormal basis for an $n$-polytope using Gram-Schmidt in both examples, or how might one correctly apply a skew-symmetric product to the Gram-Schmidt outputs to get $\vec{n}$?

N.B.

For the 2-polytope ($abcd$), the only requirement on $\vec{n}$, $\vec{s}$, $\vec{t}$ is that they are orthogonal to each other and share $abcd$; more than one combination of vectors is possible. The same goes for the 3-polytope ($abcdefgh$): the only requirement on $\vec{n}$, $\vec{s}$, $\vec{t}$, $\vec{u}$ is that they are orthogonal to each other and share $abcdefgh$. I don't care about the directions of these vectors.

Also, $abcd$ exists in a 3D environment while $abcdefgh$ exists in a 4D environment, thus $\vec{n}$ is perpendicular to $\vec{s}$, $\vec{t}$, $\vec{u}$ as expected.
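
For concreteness, here is a minimal sketch of the $GS(\ldots)$ operator applied to the 2-polytope example, assuming Python/NumPy; the vertex coordinates of $abcd$ are made up for illustration:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors (the GS operator above)."""
    basis = []
    for v in vectors:
        # subtract the components of v along the vectors already in the basis
        w = v - sum(np.dot(e, v) * e for e in basis)
        norm = np.linalg.norm(w)
        if norm > 1e-12:  # skip (near-)linearly-dependent inputs
            basis.append(w / norm)
    return basis

# hypothetical coordinates for the 2-polytope abcd embedded in 3D
a, b, c, d = map(np.array, [(0., 0., 0.), (1., 0., 0.), (1., 1., 0.), (0., 1., 0.)])

# GS(cb, ab) yields the tangent s and the bi-normal t
s, t = gram_schmidt([b - c, b - a])
```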

  • Not sure I understand the terminology well enough to answer, but my sense is that dot products, or systems of linear equations more generally, are a better (more manageable as well as computationally more efficient) tool than generalized cross products. Just trying to put things into math language: is the goal to start with a linearly independent ordered set of vectors $v_1,\dots,v_k$ in $\mathbf{R}^{n}$ and to construct an orthonormal set $e_1,\dots,e_k$ so that $e_1,\dots,e_m$ spans the same space as $v_1,\dots,v_m$ for each $m$? – Andrew D. Hwang Mar 06 '22 at 12:56
  • To find $n$ in the first setting, we can write $n$ as an ordered tuple of Cartesian components. The conditions $s \cdot n = 0$ and $t \cdot n = 0$ are each linear equations in the components of $n$, and the resulting system can be solved (up to an overall scalar multiple) by Gaussian elimination. – Andrew D. Hwang Mar 06 '22 at 13:02
  • To your first comment: yes. The goal would be to obtain $n$ linearly independent vectors (spanning the same space) from the surfaces of the $n$-polytope and then construct an $(n+1)$-vector orthonormal set from those $n$ linearly independent vectors. – linker Mar 06 '22 at 14:02
  • The Gram-Schmidt algorithm does most of this; is the question how to use polytope vertices to get the vectors $v_j$, or how to get the $(n+1)$st vector, or ...? – Andrew D. Hwang Mar 06 '22 at 15:02
  • Yes, the $(n+1)$st vector, i.e. $\vec{n}$. – linker Mar 06 '22 at 15:20
  • An efficient approach is to pick a vector $N$, subtract off the projection $\sum_{j=1}^{n}(e_j \cdot N)e_j$, and normalize if the resulting vector is non-zero. Picking $N$ randomly is very likely to work, but if definiteness is required, start with the standard Cartesian basis in $\mathbf{R}^{n+1}$ and step through its vectors one by one; at least one is guaranteed to work. – Andrew D. Hwang Mar 06 '22 at 16:54
  • If all you're trying to do is this: you have $n$ vectors in $\mathbf{R}^{n+1}$ and you'd like to find a normal vector orthogonal to all of them, then simply put the vectors as the rows of a matrix and find its kernel. If they're linearly independent, you should get one vector orthogonal to all of the rows. The Gram-Schmidt algorithm just makes a (linearly independent) basis into an orthogonal basis. – atreju Mar 06 '22 at 18:22
  • Hi @atreju, many thanks, your suggestion worked out. – linker Mar 14 '22 at 10:26
  • If you care about efficiency, or your matrices are large, I'd recommend trying Andrew's method out -- while 'finding the kernel' is mathematically the description of the problem you have, I don't know a lot about solving the problem numerically, and his idea looks like it would work well and be a lot faster than trying to use elimination or something. – atreju Mar 17 '22 at 05:15
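
Putting the two suggestions from the comments into code, here is a minimal sketch, again assuming Python/NumPy (the function names are illustrative, not from the thread):

```python
import numpy as np

def normal_via_projection(basis):
    """Andrew D. Hwang's approach: step through the standard Cartesian basis
    vectors N, subtract the projection of N onto span(basis), and normalize
    the first non-zero remainder (at least one is guaranteed to work)."""
    dim = len(basis[0])
    for k in range(dim):
        N = np.zeros(dim)
        N[k] = 1.0
        w = N - sum(np.dot(e, N) * e for e in basis)
        norm = np.linalg.norm(w)
        if norm > 1e-12:
            return w / norm
    raise ValueError("the given vectors already span the whole space")

def normal_via_kernel(vectors):
    """atreju's approach: put the vectors as the rows of a matrix and find its
    kernel; for n independent rows in R^(n+1) the kernel is one-dimensional,
    and the last right-singular vector from the SVD spans it."""
    _, _, Vt = np.linalg.svd(np.array(vectors))
    return Vt[-1]
```

For the 2-polytope example above, `normal_via_projection([s, t])` and `normal_via_kernel([s, t])` both return $\pm\vec{n}$, which suffices here since the directions of the vectors are immaterial.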

0 Answers