
Consider the simple stick-breaking function $$f : [0, 1]^2 \to S^3 : (v_1, v_2) \mapsto (x_1, x_2, x_3) = \big(v_1, (1 - v_1) v_2, (1 - v_1) (1 - v_2)\big),$$ where $S^N = \left\{(x_1, x_2, \ldots, x_N) \mid \sum_i x_i = 1, \forall i : x_i \geq 0\right\}$ is the probability simplex in $\mathbb{R}^N$ (an $(N-1)$-dimensional surface).
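For concreteness, here is a minimal numerical sketch of $f$ in Python/NumPy (the helper name `stick_break` is just illustrative), checking that the output indeed lands in $S^3$:

```python
import numpy as np

def stick_break(v1, v2):
    """Stick-breaking map f : [0, 1]^2 -> S^3 from above."""
    x1 = v1
    x2 = (1 - v1) * v2
    x3 = (1 - v1) * (1 - v2)
    return np.array([x1, x2, x3])

x = stick_break(0.3, 0.6)
print(x)        # [0.3  0.42 0.28] -- all entries non-negative
print(x.sum())  # 1.0 -- so x lies in S^3
```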

The inverse of this function should be $$f^{-1}(x_1, x_2, x_3) = \Big(x_1, \frac{x_2}{1 - x_1}\Big).$$ However, there is an equivalent formulation that makes use of the fact that $x_1 + x_2 + x_3 = 1$: $$f^{-1}_\mathrm{eq}(x_1, x_2, x_3) = \Big(x_1, \frac{x_2}{x_2 + x_3}\Big).$$
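As a sanity check, both formulas do return the same pair on a point of the simplex (again a sketch; `inv` and `inv_eq` are hypothetical helper names):

```python
import numpy as np

def inv(x):     # f^{-1}(x) = (x1, x2 / (1 - x1))
    x1, x2, x3 = x
    return np.array([x1, x2 / (1 - x1)])

def inv_eq(x):  # f^{-1}_eq(x) = (x1, x2 / (x2 + x3))
    x1, x2, x3 = x
    return np.array([x1, x2 / (x2 + x3)])

x = np.array([0.3, 0.42, 0.28])  # = f(0.3, 0.6), a point on the simplex
print(inv(x))                    # [0.3 0.6]
print(inv_eq(x))                 # [0.3 0.6] -- identical on S^3
```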

Both of these formulations should represent the same function, but if we compute the Jacobians for both functions, we get $$\begin{align*} \mathcal{J}_{f^{-1}}(x_1, x_2, x_3) &= \begin{pmatrix}1 & 0 & 0 \\ \frac{x_2}{(1 - x_1)^2} & \frac{1}{1 - x_1} & 0 \end{pmatrix} \\ \mathcal{J}_{f^{-1}_\mathrm{eq}}(x_1, x_2, x_3) &= \begin{pmatrix}1 & 0 & 0 \\ 0 & \frac{x_3}{(x_2 + x_3)^2} & \frac{-x_2}{(x_2 + x_3)^2} \end{pmatrix} = \begin{pmatrix}1 & 0 & 0 \\ 0 & \frac{1}{1 - x_1} - \frac{x_2}{(1 - x_1)^2} & \frac{-x_2}{(1 - x_1)^2} \end{pmatrix}, \end{align*}$$ which seems to suggest that there is no unique Jacobian for $f^{-1} = f^{-1}_\mathrm{eq}$.
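Evaluating both analytic Jacobians at a concrete point of the simplex makes the mismatch tangible (a sketch with illustrative helper names; the matrices are exactly the ones above):

```python
import numpy as np

def jac_inv(x):     # Jacobian of f^{-1}(x) = (x1, x2 / (1 - x1))
    x1, x2, _ = x
    return np.array([[1.0, 0.0, 0.0],
                     [x2 / (1 - x1)**2, 1 / (1 - x1), 0.0]])

def jac_inv_eq(x):  # Jacobian of f^{-1}_eq(x) = (x1, x2 / (x2 + x3))
    _, x2, x3 = x
    s = x2 + x3
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, x3 / s**2, -x2 / s**2]])

x = np.array([0.3, 0.42, 0.28])
print(jac_inv(x))     # second row: [ 0.857  1.429  0.   ]
print(jac_inv_eq(x))  # second row: [ 0.     0.571 -0.857]
```

As full $2 \times 3$ matrices the two Jacobians clearly disagree, and the disagreement sits entirely in the second row.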

Note that the entries of the difference between the second rows of these Jacobians are all equal: $$\mathcal{J}_{f^{-1}} - \mathcal{J}_{f^{-1}_\mathrm{eq}} = \begin{pmatrix}0 & 0 & 0 \\ \frac{x_2}{(1 - x_1)^2} & \frac{x_2}{(1 - x_1)^2} & \frac{x_2}{(1 - x_1)^2}\end{pmatrix}.$$ For higher-dimensional variants of these functions, I noticed that this constant offset is different for every row.

I am aware that antiderivatives are unique only up to a constant term, but I thought derivatives themselves would be unique (cf. this post with an answer for single-output functions). Therefore, I started wondering: are Jacobians actually unique? If yes, how should the example above be interpreted? If not, is there some specific notion of uniqueness, like "unique up to ..."?


2 Answers


For simplicity, let $g = f^{-1}$ and $h = f^{-1}_{\mathrm{eq}}$.

Now let $u \in \mathbb{R}^3$ be such that $\sum_{i=1}^3 u_i = 0$. Then

\begin{align} J(g)u &= \begin{bmatrix}u_1\\ u_1 \frac{x_2}{\left(1-x_1\right)^2} + u_2 \frac1{1-x_1}\end{bmatrix}\\ &= \begin{bmatrix}u_1\\ -(u_2 + u_3) \frac{x_2}{\left(1-x_1\right)^2} + u_2 \frac1{1-x_1}\end{bmatrix}\\ &= J(h)u \end{align}

Now, why did I choose $\sum_{i=1}^3 u_i = 0$? Simply because I want $x + u \in S^3$, i.e. $u$ should be a direction along the simplex.
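A quick numerical check of this (a sketch; the matrices are just the analytic Jacobians from the question): for a direction $u$ whose components sum to zero the two Jacobians act identically, while for a direction that leaves the simplex they do not.

```python
import numpy as np

x1, x2, x3 = 0.3, 0.42, 0.28                  # a point on the simplex
Jg = np.array([[1.0, 0.0, 0.0],               # Jacobian of g = f^{-1}
               [x2 / (1 - x1)**2, 1 / (1 - x1), 0.0]])
Jh = np.array([[1.0, 0.0, 0.0],               # Jacobian of h = f^{-1}_eq
               [0.0, x3 / (x2 + x3)**2, -x2 / (x2 + x3)**2]])

u = np.array([0.1, -0.04, -0.06])             # sum(u) == 0: direction along S^3
print(Jg @ u, Jh @ u)                         # same result for both matrices
v = np.array([0.1, 0.0, 0.0])                 # sum(v) != 0: leaves the simplex
print(Jg @ v, Jh @ v)                         # results now differ
```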

Kroki
  • 13,135
  • Am I correct to read this as "the Jacobian need not be unique, but the (total) derivative is unique"? – Mr Tsjolder from codidact Jun 01 '23 at 20:29
  • I am not sure I understand what you mean by the total derivative. But the linear map is unique on the set of possible directions. – Kroki Jun 01 '23 at 20:36
  • With total derivative, I mean $\boldsymbol{u} \mapsto \mathcal{J}(\boldsymbol{x}) \cdot \boldsymbol{u}$, which is probably what you refer to as "linear map". Another way to put it would be: "The matrix representation for the linear map is not unique, but the linear map is." – Mr Tsjolder from codidact Jun 02 '23 at 07:35
  • The linear maps are not the same. They induce the same map on the possible directions. – Kroki Jun 02 '23 at 13:31
  • The maps are not the same, but the induced maps are. What are the linear maps and what are the induced maps? I'm confused... – Mr Tsjolder from codidact Jun 02 '23 at 15:11
  • Don't be confused! A linear map is just a function from a space $\mathcal X$ to another space $\mathcal Y$. When you say that two maps are the same, it means that they agree on all elements of $\mathcal X$. However, you can have two linear maps that agree only on a subspace of $\mathcal X$; we then say that they induce the same map on that subspace. – Kroki Jun 02 '23 at 15:47
  • This might get me on the way, but I have this feeling I need some more time to process everything to really grasp what is going on. – Mr Tsjolder from codidact Jun 03 '23 at 10:57

You say that the two expressions represent the same function, but that is true only on the plane $x_1+x_2+x_3=1$. There is no reason at all to suppose that their Jacobians coincide, even on that plane.

I find it helpful to consider a far simpler example.

The two functions $g_1(x)=x$ and $g_2(x)=1$ coincide at $x=1$: yet $g_1'(1)=1\ne 0=g_2'(1)$.

ancient mathematician
  • 14,102
  • Is it really that simple? Aren't different formulations that coincide equivalent if domain and co-domain are properly specified? In your simple example, this would mean that $g_i : \{1\} \to \{1\}$, which seems quite different from a mapping between high-dimensional vector spaces. Also, Youem's answer seems to suggest that the derivative is actually unique, but the Jacobian isn't. – Mr Tsjolder from codidact Jun 02 '23 at 08:00
  • If you propose to change the domain of your functions to the 2-dimensional surface of the plane, then you've got to parametrize that plane with two parameters and then the Jacobian(s) will be $2\times 2$ and there will be no problems, the usual product rule for Jacobians will apply and everything will be reconcilable. But that's not what you've done, you've worked out the Jacobians on the whole $\mathbb{R}^3$ and so got these different $2\times 3$ Jacobians. – ancient mathematician Jun 02 '23 at 08:33
  • I'm afraid I'm too stupid to understand what you are trying to say. Are the Jacobians that I have computed meaningless? – Mr Tsjolder from codidact Jun 02 '23 at 15:09
  • As you have not really given two functions, merely two expressions, I don't know for sure what they mean. If you mean them to give a function defined only on the plane, then I think they have to be multiplied by $3\times 2$ jacobians of the parametrisation. Once you do that all will be well I reckon. That is I think what @youem has done. – ancient mathematician Jun 02 '23 at 15:20
  • Do you mean $f^{-1} : S^3 \to [0,1]^2$ and $f^{-1}_\mathrm{eq} : S^3 \to [0,1]^2$ when talking about the two expressions? I thought these signatures are implied by the fact that they both represent a way to compute the inverse of $f$. Or do you need some other form of additional information? With parameterization, do you mean something like a basis for $S^3$ (assuming it would be a vector space)? – Mr Tsjolder from codidact Jun 03 '23 at 10:40
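To make the parametrisation argument from these comments concrete, here is a sketch of the computation, assuming the chart $\varphi(x_1, x_2) = (x_1, x_2, 1 - x_1 - x_2)$ for the plane $x_1 + x_2 + x_3 = 1$: $$\mathcal{J}_\varphi = \begin{pmatrix}1 & 0 \\ 0 & 1 \\ -1 & -1\end{pmatrix}, \qquad \mathcal{J}_{f^{-1}} \, \mathcal{J}_\varphi = \mathcal{J}_{f^{-1}_\mathrm{eq}} \, \mathcal{J}_\varphi = \begin{pmatrix}1 & 0 \\ \frac{x_2}{(1 - x_1)^2} & \frac{1}{1 - x_1}\end{pmatrix},$$ so once the $2 \times 3$ Jacobians are multiplied by the $3 \times 2$ Jacobian of a parametrisation of the plane, the resulting $2 \times 2$ Jacobians do coincide.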