
Let $f = (f_1, \dots, f_n) : \mathbb{R}^n \rightarrow \mathbb{R}^n$ be twice differentiable. If the derivative matrix $(Df)_x$ is skew-symmetric for all $x$, that is, if $(Df)_x^T = -(Df)_x$, show that $f(x) = Ax + b$ for some (skew-symmetric) matrix $A$ and some $b \in \mathbb{R}^n$.
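(For a concrete instance of the conclusion: the rotation field $f(x_1, x_2) = (-x_2, x_1)$ on $\mathbb{R}^2$ has $(Df)_x = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$ for every $x$, which is skew-symmetric, and indeed $f(x) = Ax$ with $A$ skew-symmetric and $b = 0$.)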

So I need to show first that $D^2 f = 0$. Then I can say $Df$ is a constant matrix $A$, and then $f(x) - Ax$ is constant and I'm done.
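(For the last step: once $Df \equiv A$ is known to be constant, integrating along the segment from $0$ to $x$ gives $$f(x) - f(0) = \int_0^1 (Df)_{tx}\, x \, \mathrm{d}t = Ax,$$ so the constant is $b = f(0)$.)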

So to show $D^2f=0$:

I know $\begin{pmatrix} \frac{\partial f_1}{\partial x_1} & \cdots & \frac{\partial f_1}{\partial x_n}\\ \vdots & & \vdots \\ \frac{\partial f_n}{\partial x_1} & \cdots & \frac{\partial f_n}{\partial x_n} \end{pmatrix}^T = -\begin{pmatrix} \frac{\partial f_1}{\partial x_1} & \cdots & \frac{\partial f_1}{\partial x_n}\\ \vdots & & \vdots \\ \frac{\partial f_n}{\partial x_1} & \cdots & \frac{\partial f_n}{\partial x_n} \end{pmatrix}$, i.e. $\frac{\partial f_i}{\partial x_j} = -\frac{\partial f_j}{\partial x_i}$ for all $i, j$,

but I'm having trouble seeing how that leads to $D^2 f = 0$.

Burgundy

2 Answers


First, skew-symmetry forces every diagonal entry to vanish: $\frac{\partial f_i}{\partial x_i} = 0$, so each $f_i$ is independent of $x_i$.

Now, for $j \ne i$, $$\frac{\partial}{\partial x_j} \left( \frac{\partial f_i}{\partial x_j} \right) = -\frac{\partial}{\partial x_j} \left( \frac{\partial f_j}{\partial x_i} \right) = -\frac{\partial}{\partial x_i} \left( \frac{\partial f_j}{\partial x_j} \right) = 0,$$ using skew-symmetry, then swapping the order of differentiation, then the vanishing diagonal,

so $\frac{\partial f_i}{\partial x_j}$ is independent of $x_j$ for all $j \ne i$.

From here, we can draw the conclusion: all second-order derivatives of $f$ are zero.

Note: for the remaining mixed partials, the skew-symmetry of $Df$ gives, for $j \ne i$ and $k \ne i$, $$\frac{\partial}{\partial x_k} \left( \frac{\partial f_i}{\partial x_j} \right) = 0,$$ as shown in the other answer; the argument is omitted here.
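For concreteness, the case $n = 2$ can be written out in full (assuming, as above, that the mixed partials commute). Skew-symmetry forces $$Df = \begin{pmatrix} 0 & g \\ -g & 0 \end{pmatrix}, \qquad g := \frac{\partial f_1}{\partial x_2} = -\frac{\partial f_2}{\partial x_1},$$ and then $$\frac{\partial g}{\partial x_1} = \frac{\partial}{\partial x_1} \frac{\partial f_1}{\partial x_2} = \frac{\partial}{\partial x_2} \frac{\partial f_1}{\partial x_1} = 0, \qquad \frac{\partial g}{\partial x_2} = -\frac{\partial}{\partial x_2} \frac{\partial f_2}{\partial x_1} = -\frac{\partial}{\partial x_1} \frac{\partial f_2}{\partial x_2} = 0,$$ so $g$ is constant and $Df$ is a constant skew-symmetric matrix.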

runaround

If we assume slightly more, namely that $f$ is twice differentiable and that all second-order derivatives are continuous, then the following proof is possible.

The goal is to show that \begin{equation} \frac{\partial}{\partial x_k} \left( \frac{\partial f_i}{\partial x_j} \right) = 0. \end{equation} By assumption we have \begin{equation} \frac{\partial f_i}{\partial x_j} = -\frac{\partial f_j}{\partial x_i}, \end{equation} which in plain terms means that we can swap the variable and function indices if we remember to change the sign. We will also exploit our stronger assumption, which implies that the order of differentiation is irrelevant. We have \begin{multline} \frac{\partial}{\partial x_k} \left( \frac{\partial f_i}{\partial x_j} \right) = \frac{\partial}{\partial x_k} \left( -\frac{\partial f_j}{\partial x_i} \right) = \frac{\partial}{\partial x_i} \left( -\frac{\partial f_j}{\partial x_k} \right) = \frac{\partial}{\partial x_i} \left( \frac{\partial f_k}{\partial x_j} \right) \\ = \frac{\partial}{\partial x_j} \left( \frac{\partial f_k}{\partial x_i} \right) = \frac{\partial}{\partial x_j} \left( -\frac{\partial f_i}{\partial x_k} \right) = -\frac{\partial}{\partial x_k} \left( \frac{\partial f_i}{\partial x_j} \right). \end{multline} The quantity equals its own negative, so it must be zero. This gives you that the Jacobian is a constant matrix.
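As a sanity check on the statement itself, one can impose the skew-symmetry condition on the most general quadratic vector field in the plane and let a computer algebra system confirm that every quadratic coefficient is forced to vanish. A minimal SymPy sketch (the coefficient names are ad hoc, and this of course only tests the polynomial case):

```python
import sympy as sp

x, y = sp.symbols('x y')

# Most general vector field f = (f1, f2) on R^2 with polynomial
# components of degree <= 2 (all coefficient names are ad hoc).
a1, b1, c1, d1, e1, g1 = sp.symbols('a1 b1 c1 d1 e1 g1')
a2, b2, c2, d2, e2, g2 = sp.symbols('a2 b2 c2 d2 e2 g2')
f1 = a1*x**2 + b1*x*y + c1*y**2 + d1*x + e1*y + g1
f2 = a2*x**2 + b2*x*y + c2*y**2 + d2*x + e2*y + g2

# Jacobian Df and the skew-symmetry condition Df + Df^T = 0.
J = sp.Matrix([[sp.diff(f1, x), sp.diff(f1, y)],
               [sp.diff(f2, x), sp.diff(f2, y)]])
S = (J + J.T).expand()  # must vanish identically in (x, y)

# Collect the coefficient of every monomial in every entry of S
# and solve the resulting linear system for the coefficients.
eqs = []
for entry in S:
    eqs.extend(sp.Poly(entry, x, y).coeffs())
print(sp.solve(eqs))
# Expected: all quadratic coefficients are 0, the diagonal linear
# coefficients d1, e2 are 0, and e1 = -d2 -- i.e. f(x) = Ax + b
# with A skew-symmetric, as the exercise claims.
```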

EDIT: As pointed out by @user251257, it is not necessary to assume continuity of the second-order derivatives. It is enough that the Hessian of each $f_i$ is symmetric, which follows from $f$ being twice (Fréchet) differentiable. There is a good proof of this by @triple_sec here: Convex function with non-symmetric Hessian

Carl Christian