This question arises as a possible step in answering this unsolved question on MSE. Given a unit vector $v \in S^{d-1} \subset \mathbb{R}^d$, I'm looking for an explicit formula for the set $$V = \{ u \in S^{d-1} \mid u \cdot v > 0 \} $$ in spherical coordinates. For $d=2$ we can simply take $v$'s polar coordinate $\theta \in (-\pi, \pi]$ and obtain $$V = (\theta-\pi/2, \theta+\pi/2) $$ modulo $2\pi$.

Unfortunately this approach runs into a problem for $d \geq 3$. Take $d = 3$ and spherical coordinates $v = (\theta, \phi)$ with $\theta \in (-\pi, \pi]$ and $\phi \in (-\pi/2, \pi/2]$, and consider the following examples:
\begin{align} v_1 &= (0, 0) &&\implies& V_1 &= (-\pi/2, \pi/2) \times (-\pi/2, \pi/2) \\ v_2 &= (\pi/2, 0) &&\implies& V_2 &= (0, \pi) \times (-\pi/2, \pi/2) \\ v_3 &= (0, \pi/2) &&\implies& V_3 &= (-\pi, \pi) \times (0, \pi/2) \end{align}
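As a quick numerical sanity check (only a sketch: I'm assuming the embedding $u(\theta, \phi) = (\cos\phi\cos\theta,\ \cos\phi\sin\theta,\ \sin\phi)$, which is the convention the examples above appear to use, and I'm ignoring boundary points where $u \cdot v = 0$ or where a coordinate hits the end of its range):

```python
import numpy as np

def embed(theta, phi):
    # u(theta, phi) = (cos(phi) cos(theta), cos(phi) sin(theta), sin(phi))
    return np.array([np.cos(phi) * np.cos(theta),
                     np.cos(phi) * np.sin(theta),
                     np.sin(phi)])

def in_box(theta, phi, box):
    # open coordinate box ((theta_min, theta_max), (phi_min, phi_max))
    (t0, t1), (p0, p1) = box
    return (t0 < theta < t1) and (p0 < phi < p1)

examples = [
    ((0.0, 0.0),      ((-np.pi/2, np.pi/2), (-np.pi/2, np.pi/2))),  # v_1, V_1
    ((np.pi/2, 0.0),  ((0.0, np.pi),        (-np.pi/2, np.pi/2))),  # v_2, V_2
    ((0.0, np.pi/2),  ((-np.pi, np.pi),     (0.0, np.pi/2))),       # v_3, V_3
]

rng = np.random.default_rng(0)
for (theta_v, phi_v), box in examples:
    v = embed(theta_v, phi_v)
    for _ in range(10_000):
        theta = rng.uniform(-np.pi, np.pi)
        phi = rng.uniform(-np.pi/2, np.pi/2)
        u = embed(theta, phi)
        # u lies in the open hemisphere iff (theta, phi) lies in the box
        assert (u @ v > 0) == in_box(theta, phi, box)
print("all three boxes check out")
```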
The first two examples suggest we could proceed as in the two-dimensional case, subtracting and adding $\pi/2$ to each coordinate, but the third example breaks this pattern. Subtracting and adding $\pi/2$ to each coordinate of $v_3$ does describe the correct set, but only with respect to a different (equivalent) spherical coordinate system, namely the one that takes $\theta \in (-\pi/2, \pi/2]$ and $\phi \in (-\pi, \pi]$ (see the check below). This inconsistency is the sticking point, and I haven't found a way to remedy it. Any help is most welcome!
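For completeness, here is a similar check of the claim about $v_3$. The swapped system is realized here as $u(\theta, \phi) = (\cos\theta\cos\phi,\ \sin\theta,\ \cos\theta\sin\phi)$ with $\theta \in (-\pi/2, \pi/2]$ and $\phi \in (-\pi, \pi]$; this particular embedding is just my guess at one concrete "equivalent" system. In it, the box $(-\pi/2, \pi/2) \times (0, \pi)$ obtained from $v_3 = (0, \pi/2)$ by subtracting and adding $\pi/2$ does describe the hemisphere, up to boundary points:

```python
import numpy as np

def embed_swapped(theta, phi):
    # swapped system: theta in (-pi/2, pi/2], phi in (-pi, pi]
    # u(theta, phi) = (cos(theta) cos(phi), sin(theta), cos(theta) sin(phi))
    return np.array([np.cos(theta) * np.cos(phi),
                     np.sin(theta),
                     np.cos(theta) * np.sin(phi)])

v3 = np.array([0.0, 0.0, 1.0])  # the north pole; (theta, phi) = (0, pi/2)

rng = np.random.default_rng(1)
for _ in range(10_000):
    theta = rng.uniform(-np.pi/2, np.pi/2)
    phi = rng.uniform(-np.pi, np.pi)
    u = embed_swapped(theta, phi)
    # box (0 - pi/2, 0 + pi/2) x (pi/2 - pi/2, pi/2 + pi/2)
    # = (-pi/2, pi/2) x (0, pi), interpreted in the swapped system
    in_box = (-np.pi/2 < theta < np.pi/2) and (0 < phi < np.pi)
    assert (u @ v3 > 0) == in_box
print("the naive box works for v_3 in the swapped system")
```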