9

This question is related to the previous one. Consider $n$ variables $x_1,x_2,\ldots,x_n$ and the following $n\times n$ matrix:

$$ A=\begin{bmatrix} 1 & \cdots & 1 \\ x_2 + x_3 + \dots + x_n & \dots & x_1 + x_2 + \dots + x_{n-1} \\ x_2 x_3 + x_2 x_4 + \dots + x_{n-1}x_n & \dots & x_1 x_2 + x_1 x_3 + \dots + x_{n-2}x_{n-1} \\ \vdots & \dots & \vdots\\ x_2 x_3 \dots x_n & \dots & x_1 x_2 \dots x_{n-1} \\ \end{bmatrix}. $$ When $i>1$, the entry $a_{ij}$ is the sum of all products of $i-1$ variables $x_k$ with distinct indices, except that $x_j$ does not appear in any term of column $j$. Formally, $$ a_{ij}=\sum_{k_1<\cdots<k_{i-1} \text{ and they are } \ne j} x_{k_1}x_{k_2}\cdots x_{k_{i-1}}. $$

Of course, when $x_i=x_j$ for some $i\ne j$, $A$ has two equal columns and becomes singular, but is this the only way to get $\det A=0$?
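
For readers who want to experiment, here is a minimal SymPy sketch (my own addition, not part of the question) that builds $A$ for a few small $n$ and compares $\det A$ with $\prod_{i<j}(x_i-x_j)$; the helper name `build_A` is purely illustrative.

```python
# Sketch: construct A for small n with SymPy and compare det(A) with the
# product of pairwise differences. The name build_A is illustrative only.
import itertools
import sympy as sp

def build_A(xs):
    """Entry (i, j): sum of all products of i distinct variables, skipping x_j
    (0-based i, so row 0 is the row of ones)."""
    n = len(xs)
    A = sp.zeros(n, n)
    for i in range(n):
        for j in range(n):
            others = [x for k, x in enumerate(xs) if k != j]
            A[i, j] = sum(sp.prod(c) for c in itertools.combinations(others, i))
    return A

for n in range(2, 6):
    xs = sp.symbols(f'x1:{n + 1}')
    det_A = sp.expand(build_A(xs).det())
    vandermonde = sp.expand(sp.prod(xs[i] - xs[j]
                                    for i in range(n) for j in range(i + 1, n)))
    print(n, sp.expand(det_A - vandermonde) == 0)  # expect True for each n
```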

Widawensen
  • 8,172
  • For $n=3$ I have obtained a highly regular expression for the determinant, namely ${x_1}^2(x_2-x_3) +{x_2}^2(x_3-x_1)+{x_3}^2(x_1-x_2)$, but even in this case it is hard to analyze. For $n=2$ we have $x_1 - x_2$. Maybe these regularities can be somehow exploited... – Widawensen Nov 28 '16 at 12:19
  • @user1551 OK, thank you for the edit. – Widawensen Nov 29 '16 at 12:30
  • It would have brought you a bit closer to the answer if you had tried to expand and re-factorize the polynomial you obtained in the case $n = 3$: $x_1^2 (x_2 - x_3) + x_2^2 (x_3 - x_1) + x_3^2 (x_1 - x_2) = -(x_1 - x_2)(x_2 - x_3)(x_3 - x_1)$. – The Vee Nov 29 '16 at 13:16
  • @TheVee You have made the decisive leap.. thank you.. – Widawensen Nov 29 '16 at 16:23
  • You're welcome. It's a nice observation, I haven't tried building a matrix like this before. – The Vee Nov 29 '16 at 16:25

3 Answers

4

The Vee is right. This is a Vandermonde determinant, but I think there is a simpler derivation. To stress the dimension of $A$ and its dependence on $x_1,\ldots,x_n$, we denote the matrix by $A_n(x_1,\ldots,x_n)$ instead. Note that when $i,j>1$, we have $$ a_{ij}-a_{i1}=(x_1-x_j)\sum_{k_1<\cdots<k_{i-2} \text{ and they are } \ne 1,j} x_{k_1}x_{k_2}\cdots x_{k_{i-2}}. $$ Therefore, if we subtract the first column from every other column, we get $$ \begin{bmatrix} 1 & 0\\ \ast & A_{n-1}(x_2,\ldots,x_n)\operatorname{diag}(x_1-x_2,\ldots,x_1-x_n) \end{bmatrix} $$ and hence $\det A_n(x_1,\ldots,x_n)=(x_1-x_2)\cdots(x_1-x_n)\det A_{n-1}(x_2,\ldots,x_n)$. Proceeding recursively, we obtain $$ \det A_n(x_1,\ldots,x_n)=\prod_{i<j}(x_i-x_j), $$ so the determinant vanishes if and only if $x_i=x_j$ for some $i\ne j$.
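
As a quick independent check (not part of the original answer), the recursion $\det A_n(x_1,\ldots,x_n)=(x_1-x_2)\cdots(x_1-x_n)\det A_{n-1}(x_2,\ldots,x_n)$ can be verified symbolically; the helper names `e_sym` and `A_n` below are my own.

```python
# Sketch: verify det A_n = (x1-x2)...(x1-xn) * det A_{n-1}(x2,...,xn) for n = 4.
import itertools
import sympy as sp

def e_sym(pool, r):
    """Elementary symmetric polynomial of degree r in the given variables."""
    return sum(sp.prod(c) for c in itertools.combinations(pool, r))

def A_n(xs):
    n = len(xs)
    return sp.Matrix(n, n,
                     lambda i, j: e_sym([x for k, x in enumerate(xs) if k != j], i))

n = 4
xs = sp.symbols(f'x1:{n + 1}')
lhs = A_n(xs).det()
rhs = sp.prod(xs[0] - xs[j] for j in range(1, n)) * A_n(xs[1:]).det()
print(sp.expand(lhs - rhs) == 0)  # expect True
```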

user1551
  • 139,064
  • @Widawensen Thanks for catching the typo. Yes, the determinant of $A$ is exactly $\prod_{i<j}(x_i-x_j)$. You'll see this if you stop the recursion when the $2\times 2$ matrix $A_2(x_{n-1},x_n)=\pmatrix{1&1\\x_{n-1}&x_n}$ remains. – user1551 Nov 29 '16 at 14:48
  • OK, I see. Once again, thank you for the fine solution and for helping make this question more accessible. Interestingly, if the differences between the variables were all a little less than 1, the determinant would be very small, because there are so many factors $x_i - x_j$, far more than $n$. This may explain the sensitivity to the coefficients of the polynomial equation (the previous question; is an answer now possible?). Our discussion about the zero polynomial was also important for me, since I'm still learning LA in general. – Widawensen Nov 29 '16 at 15:27
  • 1
    One of the most interesting things about this determinant, as I see it, is that it depends solely on the differences of the values, not on the values themselves: if $\mathbf{x}$ is the vector of variables $x_1,x_2,\ldots,x_n$, then $\det(\mathbf{A}_n(\mathbf{x}))=\det(\mathbf{A}_n(\mathbf{x}+p\mathbf{1}_n))$ for any $p$. – Widawensen Nov 29 '16 at 20:33
2

The determinant of such a matrix is actually the Vandermonde determinant (up to a sign factor, I think, but I may have made a mistake there). You can find it using the same algorithm:

  1. subtract copies of the first column from all others (no change in determinant)

  2. subtract suitable multiples of the first row from each of the other rows to cancel the $A_{k,1}$ entries in the first column (no change in determinant)

  3. take the submatrix formed by the second through $n$-th rows and columns

  4. factoring the entries, notice that the first column is divisible by $(x_2-x_1)$, the second column by $(x_3-x_1)$, etc.

  5. divide each column by its leading element (determinant obtains a prefactor of $\prod_{k=2}^n (x_k - x_1)$)

  6. find that what's left has exactly the same form as the matrix you started with, just in the variables $x_2, x_3, \ldots, x_n$

  7. repeat.


The crucial step in doing this formally is writing the difference of two matrix elements in the same row as

$$\begin{aligned} A_{k,l} - A_{k,m} &= S_l^{k-1} - S_m^{k-1} = (S^{k-1} - x_l S_l^{k-2}) - (S^{k-1} - x_m S_m^{k-2}) = x_m S_m^{k-2} - x_l S_l^{k-2} = \\ &= x_m (x_l S_{m,l}^{k-3} + S_{m,l}^{k-2}) - x_l (x_m S_{m,l}^{k-3} + S_{m,l}^{k-2}) = x_m S_{m,l}^{k-2} - x_l S_{m,l}^{k-2} = (x_m - x_l) S_{m,l}^{k-2} \end{aligned}$$

where $S^k_{a,b,\ldots}$ is the elementary symmetric polynomial of order $k$ excluding $x_a, x_b, \ldots$.
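
The identity above can also be checked symbolically; the following sketch (my own translation of the $S$ notation into SymPy, for one choice of $n$, $k$, $l$, $m$) confirms $A_{k,l}-A_{k,m}=(x_m-x_l)\,S^{k-2}_{m,l}$.

```python
# Sketch: check A_{k,l} - A_{k,m} = (x_m - x_l) * S^{k-2}_{m,l} for n = 5.
import itertools
import sympy as sp

n = 5
xs = sp.symbols(f'x1:{n + 1}')

def S(order, excluded=()):
    """Elementary symmetric polynomial of the given order, excluding listed indices."""
    pool = [x for i, x in enumerate(xs, start=1) if i not in excluded]
    return sum(sp.prod(c) for c in itertools.combinations(pool, order))

k, l, m = 4, 2, 3                      # arbitrary row k and columns l, m
lhs = S(k - 1, {l}) - S(k - 1, {m})    # A_{k,l} - A_{k,m}
rhs = (xs[m - 1] - xs[l - 1]) * S(k - 2, {m, l})
print(sp.expand(lhs - rhs) == 0)       # expect True
```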


So to answer your question: the determinant is zero exactly when the Vandermonde determinant is, which in turn happens if and only if two of the $x_i$ coincide.

The Vee
  • 3,071
1

Consider the polynomial $$P(x) = (x-x_1) \cdots (x-x_n)$$ and set $$Q_i(x) := \frac{P(x)}{x-x_i}$$ for $i=1, \ldots, n$. For instance $$Q_1(x) = (x-x_2) \cdots (x-x_n) = x^{n-1} - (x_2 + \cdots + x_n) x^{n-2} + \cdots + (-1)^{n-1} x_2 x_3 \cdots x_n.$$

We have $Q_i(x_k)= 0$ for $k\ne i$, and $Q_i(x_i) = \prod_{k\ne i}(x_i - x_k)$. We conclude that we have the matrix equality

$$ \begin{bmatrix} x_1^{n-1} & x_1^{n-2} & \cdots & 1\\ x_2^{n-1} & x_2^{n-2} & \cdots & 1 \\ \vdots & \vdots & & \vdots\\ x_n^{n-1} & x_n^{n-2} & \cdots & 1 \\ \end{bmatrix} \times \\ \times \begin{bmatrix} 1 & \cdots & 1 \\ -(x_2 + x_3 + \dots + x_n) & \cdots & -(x_1 + x_2 + \dots + x_{n-1}) \\ x_2 x_3 + x_2 x_4 + \dots + x_{n-1}x_n & \cdots & x_1 x_2 + x_1 x_3 + \dots + x_{n-2}x_{n-1} \\ \vdots & & \vdots\\ (-1)^{n-1}x_2 x_3 \cdots x_n & \cdots & (-1)^{n-1}x_1 x_2 \cdots x_{n-1} \\ \end{bmatrix}= \\ = \begin{bmatrix} (x_1 - x_2) \cdots (x_1 - x_n) & 0 & \cdots & 0 \\ 0 & (x_2-x_1) \cdots (x_2-x_n) & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & (x_n-x_1) \cdots (x_n - x_{n-1}) \end{bmatrix} $$

Now calculate determinants. Perhaps the signs are a bit off. The first determinant on LHS differs from the standard $V(x_1, \ldots, x_n) = \prod_{j> i}( x_j - x_i)$ by the sign $(-1)^{\binom{n}{2}}$. The second determinant differs from the original one again by the sign $(-1)^{\binom{n}{2}}$. The determinant on RHS is $(-1)^{\binom{n}{2}}\cdot V^2(x_1, \ldots, x_n)$. We conclude that our determinant equals $$(-1)^{\binom{n}{2}}\cdot V(x_1, \ldots, x_n)= \prod_{i < j}(x_i - x_j)$$

Note: the above method also provides the inverse of the Vandermonde matrix and is standard.
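
For completeness, here is a small SymPy sketch (my own, with illustrative names `V`, `B`, `esym`) that checks the displayed matrix identity for $n = 4$.

```python
# Sketch: verify (reversed Vandermonde) x (signed symmetric-polynomial matrix)
# equals the diagonal matrix with entries prod_{k != j} (x_j - x_k), for n = 4.
import itertools
import sympy as sp

n = 4
xs = sp.symbols(f'x1:{n + 1}')

def esym(pool, r):
    """Elementary symmetric polynomial of degree r in the given variables."""
    return sum(sp.prod(c) for c in itertools.combinations(pool, r))

# Row k of V is (x_k^{n-1}, x_k^{n-2}, ..., 1).
V = sp.Matrix(n, n, lambda k, i: xs[k] ** (n - 1 - i))
# Row i of B carries (-1)^i times the elementary symmetric polynomials; column j omits x_j.
B = sp.Matrix(n, n, lambda i, j: (-1) ** i
              * esym([x for k, x in enumerate(xs) if k != j], i))

D = V * B
diag_ok = all(sp.expand(D[j, j] - sp.prod(xs[j] - xs[k] for k in range(n) if k != j)) == 0
              for j in range(n))
off_diag_ok = all(sp.expand(D[i, j]) == 0 for i in range(n) for j in range(n) if i != j)
print(diag_ok and off_diag_ok)  # expect True
```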

orangeskid
  • 53,909