
Let $x\in\mathbb{R}^{d}$ and $W\in \mathbb{R}^{d\times d}$, and define $M(x)=I-2\frac{x^TW^Tx}{\|Wx\|^2}W\in\mathbb{R}^{d\times d}$ (well-defined whenever $Wx\neq0$).

I'd like to show $M(x)$ is invertible for all $x$ under as few constraints on $W$ as possible, e.g., it would be interesting if $M(x)$ is invertible for triangular, symmetric or positive/negative definite $W$.

Observations.

  • Scale invariance: $M(c\cdot x)=M(x)$ for all $c\neq 0$.
  • The scalar $x^TW^Tx/\|Wx\|^2$ resembles a generalized Rayleigh quotient.
  • Since $M(x)=I-sW$ with $s=2\frac{x^TW^Tx}{\|Wx\|^2}$, its eigenvalues are $\lambda_i(M(x))=1-2\frac{x^TW^Tx}{\|Wx\|^2}\lambda_i(W)$. If $\lambda_i(M(x))\neq 0$ for all $i$ then $M(x)$ is invertible. This happens when the eigenvalues of $W$ satisfy $\lambda_i(W)\neq \frac12 \frac{\|Wx\|^2}{x^TWx}$ (note $x^TW^Tx=x^TWx$, since it is a scalar).
  • Skew-symmetric $W^T=-W$ implies $x^TWx=0$, hence $M(x)=I$, which is not interesting.
  • $W=cI$ with $c\neq0$ implies $M(x)=-I$, which is not interesting.
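These observations are easy to sanity-check numerically. A minimal sketch in NumPy (the helper `M` and the random test data are my additions, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
W = rng.standard_normal((d, d))
x = rng.standard_normal(d)

def M(x, W):
    """M(x) = I - 2 (x^T W^T x / ||Wx||^2) W; note x^T W^T x = x^T W x (a scalar)."""
    s = (x @ W @ x) / np.linalg.norm(W @ x) ** 2
    return np.eye(len(x)) - 2 * s * W

# Scale invariance: M(c x) = M(x) for all c != 0
assert np.allclose(M(3.7 * x, W), M(x, W))

# Eigenvalues of M(x) are 1 - 2 s lambda_i(W), as multisets
s = (x @ W @ x) / np.linalg.norm(W @ x) ** 2
assert np.allclose(np.sort_complex(np.linalg.eigvals(M(x, W))),
                   np.sort_complex(1 - 2 * s * np.linalg.eigvals(W)))
```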
an4s
  • Symmetric positive definite doesn't work. For $W=\text{diag}\left(1,\frac12\right)$, $M$ is not invertible for $x=\left(1,0\right)$. That also holds if $W$ is block-diagonal with this form in a two-by-two block. More generally, if $\lambda$ is an eigenvalue of $W$ and $\lambda/2$ is also an eigenvalue of $W$, then $M$ is not invertible. – Toffomat Sep 04 '20 at 16:17
  • So if $\lambda_j \neq \lambda_i/2$ for all $i\neq j$ then $M(v)$ is invertible when $v$ is an eigenvector. Can we say something about linear combinations of eigenvectors? – Alexander Mathiasen Sep 05 '20 at 13:55
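Toffomat's $2\times2$ counterexample from the comments can be verified directly; a small NumPy check (variable names are mine):

```python
import numpy as np

W = np.diag([1.0, 0.5])                       # symmetric positive definite
x = np.array([1.0, 0.0])

s = (x @ W @ x) / np.linalg.norm(W @ x) ** 2  # = 1 for this x
M = np.eye(2) - 2 * s * W                     # = diag(-1, 0)
assert abs(np.linalg.det(M)) < 1e-12          # singular, so M(x) is not invertible
```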

2 Answers


To expand on my comment (this is still only a partial answer):

Take $d=2$ and $W$ diagonalisable with eigenvalues $\lambda_{1,2}$. Clearly, we can take $W$ in diagonal form (i.e. express $M$ in a basis where $W$ is diagonal).

If both eigenvalues are zero (i.e. $W\equiv0$), the expression is ill-defined anyway (it already is whenever $x$ is in the kernel of $W$).

So assume that $\lambda_1\neq0$. Choose $$x=\begin{pmatrix}\sqrt{\frac{\lambda_2^2-2\lambda_1\lambda_2}{\lambda_1^2}}\\1\end{pmatrix}\,.$$ Then $M(x)$ is not invertible, as you can check by direct calculation.
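The direct calculation can be checked numerically; a sketch with the concrete (illustrative) choice $\lambda_1=1$, $\lambda_2=3$, for which the square root is real:

```python
import numpy as np

lam1, lam2 = 1.0, 3.0                         # chosen so the radicand is nonnegative
W = np.diag([lam1, lam2])
x = np.array([np.sqrt((lam2**2 - 2*lam1*lam2) / lam1**2), 1.0])

s = (x @ W @ x) / np.linalg.norm(W @ x) ** 2
M = np.eye(2) - 2 * s * W
assert abs(np.linalg.det(M)) < 1e-12          # M(x) is singular for this x
```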

  • Note that this includes my comment: If $\lambda_2=2\lambda_1$, $x_1=0$.

  • Of course, you can rescale $x$ by any constant factor.

  • Thus, for $d=2$ there always exists an $x$ for which $M(x)$ is not invertible. This carries over to any matrix whose Jordan form contains such a $W$ as a block.

  • It's too late for me to check the case for only nontrivial Jordan blocks.

  • The bigger remaining question: The eigenvalues may not be real, and even if they are, the vector $x$ may not be (the radicand can be negative). So your (only?) chance of finding a $W$ for which $M(x)$ is invertible for all $x$ is to look for some constraints in this direction.

Toffomat

I don't know if this is what you are looking for, but I thought I should at least mention it.

Let $u = Wx \in \mathbb R^{d \times 1}$

Let $v = Wx \in \mathbb R^{d \times 1}$

and consider the rank-one update $I - 2uv^T = I-2Wxx^TW^T \in \mathbb R^{d \times d}$ (a Householder-like cousin of $M(x)$, in which the rank-one matrix $Wxx^TW^T$ replaces the scalar multiple of $W$).

Hence (see this MS problem),

\begin{align} \det\left(I-2Wxx^TW^T\right) &= \det\left(I - 2uv^T\right) \\ &= 1-2v^Tu \\ &= 1-2\|Wx\|^2 \end{align}

So $I-2Wxx^TW^T$ is invertible when $x^TW^TWx = \|Wx\|^2 \ne \frac 12$.
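The rank-one determinant identity used here, $\det(I - 2uv^T) = 1 - 2v^Tu$, can be checked numerically; a sketch with $u = v = Wx$ and random data (mine, for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
d = 5
W = rng.standard_normal((d, d))
x = rng.standard_normal(d)

u = W @ x                                     # u = v = Wx
A = np.eye(d) - 2 * np.outer(u, u)            # I - 2 u v^T
# Matrix determinant lemma: det(I - 2 u v^T) = 1 - 2 v^T u = 1 - 2 ||Wx||^2
assert np.isclose(np.linalg.det(A), 1 - 2 * (u @ u))
```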