
Find the lengths of the principal axes of the ellipsoid $$\sum_{i \leq j} x_ix_j = 1.$$

-- Arnold, Trivium 85

My solution is below. I request verification, feedback, or alternate approaches (especially ways to simplify).


Solution: In $\mathbb R^n$, the ellipsoid $\sum_{i \leq j} x_ix_j = 1$ has a single axis of length $\sqrt {\frac 8 {n+1}}$ and all other axes of length $2\sqrt 2$.

Proof: Recall that if $D$ is a diagonal matrix, then $$\mathbf x^\top D \mathbf x = 1$$ is an ellipsoid in standard position, with axis $i$ of length $\frac 2 {\sqrt {D_{ii}}}$.

Simple multiplication shows that the ellipsoid $\sum_{i \leq j} x_ix_j = 1$ has equation $$\mathbf x^\top S \mathbf x = 1$$ where $$S_{ij} = \begin{cases}1 &\text{ if } i = j\\ \frac 1 2 &\text{ otherwise}.\end{cases}$$

Since $S$ is symmetric, the spectral theorem shows that $S$ has $n$ orthogonal eigenvectors with real eigenvalues, and that $S$ decomposes into $S = DQ$, with $Q$ orthogonal and $D$ diagonal. The diagonal entries of $D$ are the eigenvalues of $S$.

Since $Q$ is orthogonal, it preserves lengths. Consequently, if the eigenvalues of $S$ are $\lambda_i$, then the ellipsoid's axes will have length $\frac 2 {\sqrt {\lambda_i}}$.

Inspection shows that the vector $\boldsymbol \ell$ with all components $1$ is an eigenvector of $S$ with eigenvalue $\frac {n+1} 2$. Inspection likewise shows that for $1 \leq i < n$, the vectors $\mathbf m_i$ with components $$(\mathbf m_i)_j = \begin{cases} 1 &\text{ if } j = i \\ 1 - \sqrt n - n &\text{ if } j = n\\ 1 + \sqrt n&\text{ otherwise}\end{cases}$$ are further eigenvectors, each with eigenvalue $\frac 1 2$, which completes the proof.
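As a numerical sanity check on the claimed lengths (separate from the proof), here is a small numpy sketch that builds $S$, computes its spectrum, and compares the resulting axis lengths with the values above; the choice $n = 7$ is arbitrary:

```python
import numpy as np

n = 7  # arbitrary dimension for the check
S = (np.eye(n) + np.ones((n, n))) / 2  # S_ii = 1, S_ij = 1/2 for i != j

eigenvalues = np.linalg.eigvalsh(S)      # real spectrum of the symmetric S
axis_lengths = 2 / np.sqrt(eigenvalues)  # axis i has length 2 / sqrt(lambda_i)

print(np.sort(eigenvalues))                                    # 1/2 (n-1 times) and (n+1)/2
print(np.isclose(axis_lengths.min(), np.sqrt(8 / (n + 1))))    # True
print(np.allclose(np.sort(axis_lengths)[1:], 2 * np.sqrt(2)))  # True
```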

Remark: The components of $\mathbf m_i$ were determined by solving $$\begin{aligned} a + (n-2)b + c &= 0 &&\text{since } \mathbf m_i \cdot \boldsymbol\ell = 0,\\ 2ab + (n-3)b^2 + c^2 &= 0 &&\text{since } \mathbf m_i \cdot \mathbf m_j = 0 \text{ when } i \neq j.\end{aligned}$$ Is there a simpler way to determine them? The fact that the rows of $S$ are cyclic shifts of one another suggests some type of symmetry among its eigenvalues, and we know their sum from $\operatorname{trace} S = n$, but I could not develop this further.
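For what it's worth, that system can also be handed to a computer algebra system. Below is a sympy sketch solving the two conditions with the normalization $a = 1$ (one choice among many, since eigenvectors are only determined up to scale) and $n$ kept symbolic:

```python
import sympy as sp

n = sp.symbols('n', positive=True)
b, c = sp.symbols('b c')

# m_i . l = 0  and  m_i . m_j = 0 for i != j, with a normalized to 1
eq1 = sp.Eq(1 + (n - 2) * b + c, 0)
eq2 = sp.Eq(2 * b + (n - 3) * b**2 + c**2, 0)

for sol in sp.solve([eq1, eq2], [b, c], dict=True):
    print(sp.simplify(sol[b]), '|', sp.simplify(sol[c]))
```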

SRobertJames
    The "intended" solution might be the observation that $\sum_{i \leq j} x_ix_j = \frac12(\sum_i x_i)^2 + \frac12\sum_i x_i^2 = \frac12(v\cdot x)^2 + \frac12\Vert x\Vert^2$ with $\Vert v\Vert^2=n$ - which gives the same lengths as you found – user8268 Jan 21 '24 at 19:23
  • @user8268 Would you consider expanding that into a full answer? As written, the comment is a bit too terse for me to follow. – SRobertJames Jan 21 '24 at 19:36

3 Answers


If you are given a matrix $S$ all of whose rows are permutations of the same elements, then it automatically follows that the sum along each row of the matrix is constant.

This means that the vector $(1,...,1)'$ is an eigenvector with eigenvalue equal to the sum of a row. This eigenvalue is $1 + (n-1)/2 = \frac{n+1}{2}.$

It can also be seen that

$$ S = \frac{1}{2}\mathbf 1_{n\times n} + \frac{1}{2}I$$ where $\mathbf{1}_{n \times n}$ is the $n\times n $ matrix containing only $1$'s. So it is clear that

$$ \det(S - \frac{1}{2}I) = 0$$ and it is not hard to see that the eigenspace for $\lambda =1/2$ has dimension $n-1$.

Indeed, $S - (1/2)I$ is the matrix in which every entry equals $1/2$. Therefore the system $(S-(1/2)I)x = 0$ reduces to the single equation $$ (1/2)x_1 + ... + (1/2)x_n = 0,$$ whose solution set is an $(n-1)$-dimensional subspace.
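This dimension count is easy to illustrate numerically; a numpy sketch with an arbitrary choice of $n$:

```python
import numpy as np

n = 6
S = (np.eye(n) + np.ones((n, n))) / 2

# S - (1/2)I has every entry equal to 1/2, hence rank one,
# so the eigenspace for lambda = 1/2 has dimension n - 1
M = S - 0.5 * np.eye(n)
print(np.linalg.matrix_rank(M))      # 1
print(n - np.linalg.matrix_rank(M))  # n - 1
```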


Alternatively, since $S$ is symmetric, you know that eigenvectors corresponding to different eigenvalues are orthogonal. Once you have found that $(1,...,1)$ is an eigenvector, this suggests that the other eigenspaces will be subspaces of $x_1 + ... + x_n = 0$, i.e. subspaces of the hyperplane with normal vector $(1,...,1)$. This can be quite useful to know if you are trying to "guess" an eigenvector. In this specific example, though, the whole hyperplane is an eigenspace.

Indeed, take $x$ such that $x_1 + ... + x_n = 0$; then $$ (Sx)_j = x_j + \frac{1}{2} \sum_{i \neq j} x_i = x_j + \frac{1}{2}(-x_j) = \frac{1}{2}x_j,$$ which shows that $x$ is an eigenvector with eigenvalue $1/2$.
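This can be spot-checked numerically with a random vector whose components sum to zero (a numpy sketch):

```python
import numpy as np

n = 6
S = (np.eye(n) + np.ones((n, n))) / 2

rng = np.random.default_rng(0)
x = rng.standard_normal(n)
x -= x.mean()  # enforce x_1 + ... + x_n = 0

print(np.allclose(S @ x, 0.5 * x))  # True: eigenvector for lambda = 1/2
```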

Digitallis

Let me expand the comment by @user8268 a bit. First, note that $$\left( \sum_{i=1}^n x_i \right)^2 = 2 \sum_{1 \le i < j \le n} x_i x_j + \sum_{i=1}^n x_i^2.$$ Rearranging gives $$\sum_{1 \le i \le j \le n} x_i x_j = \frac{1}{2} \left( \sum_{i=1}^n x_i \right)^2 + \frac{1}{2} \sum_{i=1}^n x_i^2.$$ If we set $v := (1, 1, \ldots, 1)$, then we can further rewrite this as $\frac{1}{2} (v \cdot x)^2 + \frac{1}{2} \lVert x \rVert^2$.
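For a concrete check of this identity, here is a sympy sketch with the small explicit choice $n = 4$:

```python
import sympy as sp

n = 4
x = sp.symbols(f'x1:{n + 1}')  # the symbols x1, ..., x4

lhs = sum(x[i] * x[j] for i in range(n) for j in range(i, n))
rhs = sp.Rational(1, 2) * sum(x)**2 + sp.Rational(1, 2) * sum(xi**2 for xi in x)

print(sp.simplify(lhs - rhs) == 0)  # True
```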

Now, if instead we had $v = \alpha e_1$ for some scalar $\alpha$, this expression would reduce to $\frac{\alpha^2 + 1}{2} x_1^2 + \frac{1}{2} x_2^2 + \cdots + \frac{1}{2} x_n^2 = 1$, which has principal axes $2\sqrt{\frac{2}{\alpha^2+1}}, 2\sqrt{2}, \ldots, 2\sqrt{2}$. In fact, it is easy to find a rotation which takes $v$ to $\lVert v \rVert e_1$ with $\lVert v \rVert = \sqrt{n}$; in the rotated situation, we therefore get that the principal axes are $2 \sqrt{\frac{2}{n+1}}, 2\sqrt{2}, \ldots, 2\sqrt{2}$.
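One concrete way to realize such a map is a Householder reflection; it is a reflection rather than a rotation, but it is orthogonal and sends $v$ to $\lVert v \rVert e_1$, which is all the argument needs. A numpy sketch:

```python
import numpy as np

n = 5
v = np.ones(n)
S = (np.eye(n) + np.ones((n, n))) / 2

# Householder reflection sending v to ||v|| e_1 = sqrt(n) e_1
u = v - np.linalg.norm(v) * np.eye(n)[0]
H = np.eye(n) - 2 * np.outer(u, u) / (u @ u)

print(np.round(H @ v, 10))        # [sqrt(n), 0, ..., 0]
print(np.round(H @ S @ H.T, 10))  # diag((n+1)/2, 1/2, ..., 1/2)
```

The rows of $H$ form an orthonormal basis whose first vector is $\frac{1}{\sqrt n} v$, which is exactly the kind of basis used in the next paragraph.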

As another way of putting that idea, suppose we choose an orthonormal basis $w_1, \ldots, w_n$ of $\mathbb{R}^n$ such that $w_1 = \frac{1}{\sqrt{n}} v = \frac{1}{\sqrt{n}}(1, 1, \ldots, 1)$. Then if $x = y_1 w_1 + \cdots + y_n w_n$, the condition we get is that $\frac{n+1}{2} y_1^2 + \frac{1}{2} y_2^2 + \cdots + \frac{1}{2} y_n^2 = 1$. Therefore, the original ellipsoid has principal axis $2 \sqrt{\frac{2}{n+1}}$ in the direction $w_1$, and principal axis $2\sqrt{2}$ in the directions $w_2, \ldots, w_n$.


The first remark concerns the decomposition of $S$: $S$ should be diagonalized, i.e. written in factored form as $S = V D V^T $, where $V$ is the orthogonal matrix of eigenvectors of $S$ and $D$ is a diagonal matrix of its eigenvalues.

Note that $S = \dfrac{1}{2} I_n + \dfrac{1}{2} U_n $

where $I_n$ is the $ n \times n $ identity matrix and $U_n$ is the $n \times n$ matrix with all entries equal to $1$. It is well known that the eigenvalues of the $n \times n$ matrix $ B = a I_n + C $ are the eigenvalues of $C$ shifted by the scalar $a$. Now the matrix $U_n$ has one eigenvalue equal to $n$, and the remaining $(n-1)$ eigenvalues are $0$. Therefore, $S$ will have one eigenvalue equal to $\dfrac{1}{2} (n + 1) $ and the remaining eigenvalues equal to $ \dfrac{1}{2} $.

Now, the semi-axis lengths of this hyper-ellipsoid are the square roots of the reciprocals of the eigenvalues, and the axis lengths are just twice the semi-axis lengths. Therefore, the smallest axis length will be $2 \sqrt{ \dfrac{2}{n+1} } = \sqrt{ \dfrac{8}{n+1} } $, while all the other axis lengths will be identical and equal to $ 2 \sqrt{2} $. This confirms the statement in your OP about the axis lengths.
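This shift argument is easy to confirm numerically; a numpy sketch:

```python
import numpy as np

n = 8
U = np.ones((n, n))
S = 0.5 * np.eye(n) + 0.5 * U

# eigenvalues of (1/2)U are n/2 (once) and 0 (n-1 times);
# adding (1/2)I shifts every eigenvalue by 1/2
print(np.sort(np.linalg.eigvalsh(0.5 * U)))  # [0, ..., 0, n/2]
print(np.sort(np.linalg.eigvalsh(S)))        # [1/2, ..., 1/2, (n+1)/2]
```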

The second remark concerns the eigenvectors of $S$. A simpler way to obtain the unit eigenvectors is as follows:

Define the vectors $\{w_i\}, i = 1, 2, \dots, n$, as follows. For $i = 1$,

$ w_1(k) = 1 $ for $ k = 1, 2, \dots, n; $

for $i = 2, \dots, n$,

$$ w_i(k) = \begin{cases} 1 & 1 \le k \le i-1, \\ -(i-1) & k = i, \\ 0 & i < k \le n. \end{cases} $$

Now, the unit eigenvectors $v_i$ are computed by normalizing the $w_i$.

Example: Suppose $n = 5$, then

$ w_1 = [1, 1, 1, 1, 1]^T $

$ w_2 = [1, -1, 0, 0, 0]^T $

$ w_3 = [1, 1, -2, 0, 0]^T $

$ w_4 = [1, 1, 1, -3, 0]^T $

$ w_5 = [1, 1, 1, 1, -4]^T $

The $v_i$'s are then the normalized versions of the $w_i$'s, i.e.

$ v_i = \dfrac{ w_i } { \| w_i \| } $
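One can also confirm programmatically that these $w_i$ are pairwise orthogonal eigenvectors of $S$; a numpy sketch for the $n = 5$ example above:

```python
import numpy as np

n = 5
S = (np.eye(n) + np.ones((n, n))) / 2

# row i of W (0-indexed) is w_{i+1}: all ones for the first row;
# otherwise i ones, then -i, then zeros -- matching the definition above
W = np.zeros((n, n))
W[0] = 1
for i in range(1, n):
    W[i, :i] = 1
    W[i, i] = -i

print(np.round(W @ W.T, 10))  # diagonal matrix: the w_i are orthogonal
lams = [w @ S @ w / (w @ w) for w in W]
print(np.round(lams, 10))     # (n+1)/2 for w_1, then 1/2 for the others
```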

Hosam Hajeer