
Let $A = (a_{ij})_{1\le i,j\le n}$ be a square matrix with real entries and determinant $\det(A) = 0$. Show that for every $\epsilon > 0$ there exists another matrix $A' = (a'_{ij})$ with $|a'_{ij} - a_{ij}| < \epsilon$ for all $i, j$, and $\det A' > 0$.

Notes:

  1. Consider the general case of a function $f\colon X \to \mathbb{R}$. Clearly we have $\overline{\{f>0\}} \subseteq \{f\ge 0\}$, and the inclusion may be strict. For instance, for the polynomial function $f(x,y) = x^2 y$ we have $\overline{\{f>0\}} = \{y \ge 0\}$, while $\{f \ge 0\}$ also contains the ray $\{x = 0,\ y < 0\}$. So the only issue is whether $\{f= 0\} \subseteq \overline{\{ f>0\}}$. This is the case when $X$ is a manifold, $f$ is smooth, and $df \ne 0$ on $\{f=0\}$ (the hypersurface $\{f=0\}$ is then non-singular).

  2. One may ask the similar question whether $\overline{ \{ f < 0 \}} = \{ f\le 0\}$. If $f$ is a homogeneous polynomial, it is equivalent to the previous one. Also, observe that $\overline{ \{ f < 0 \}} = \{ f\le 0\}$ is equivalent to: $\{f>0\}$ is the interior of $\{f\ge 0\}$.

  3. One could consider the original question for $\det \colon X\to \mathbb{R}$, where now $X$ is not the full space $M_{n\times n}$ but a certain affine subspace (for instance, bordered symmetric determinants).

Thank you for your interest! Any feedback would be appreciated.

orangeskid
  • I think you can choose $A' = A + \epsilon I$, compare https://math.stackexchange.com/a/1618027/42969. – Martin R Jun 19 '22 at 19:53
  • For (2), just use the previous lemmas on $-f$. – Sassatelli Giulio Jun 19 '22 at 19:58
  • @Martin R: If you expand $\det (A+ \epsilon I)$, it will be $\epsilon^m( c+ c'\epsilon + \cdots)$. It may be that $m$ is even, but $c< 0$. Some care is needed. – orangeskid Jun 19 '22 at 20:02
  • $\det$ has no local maxima or minima. I'll call $A\oplus B$ the block diagonal matrix with $A$ first and $B$ second. Consider an invertible matrix $P$ such that $P^{-1}AP=T\oplus B$, with $\det B\ne0$ and $T$ upper triangular and nilpotent (you can do this in many ways, e.g. let $\mu_A(t)=t^k f(t)$ be the minimal polynomial of $A$, with $f(0)\ne0$, and consider the restrictions to $\ker(A^k)$ and $\ker(f(A))$). Call $L_\pm=\operatorname{diag}(\pm1,1,\cdots,1)\oplus 0$ and consider $G_{\epsilon,\pm}=A+\epsilon PL_\pm P^{-1}\to A$. We have that $\det G_{\epsilon,\pm}=\pm\epsilon^u \det B$. – Sassatelli Giulio Jun 19 '22 at 20:37
  • This proves absence of maxima or minima at $\det A=0$. If $\det A\ne0$, or more generally if $\dim\ker A\le1$, then notice that $d_A\det$ is the map $H\mapsto \operatorname{tr}(\operatorname{adj}(A)H)$, which is a non-zero map. – Sassatelli Giulio Jun 19 '22 at 20:45
  • @Sassatelli Giulio: Thank you for your interest! If you would write up an answer, it would be useful for everybody. – orangeskid Jun 19 '22 at 20:46
  • @orangeskid No thanks, I'd rather not. Anybody who wants to write it and take the points can. – Sassatelli Giulio Jun 19 '22 at 20:48
  • @Sassatelli Giulio: Well, you had some mighty good observations; I guess we'll just read them in the comments. Cheers! – orangeskid Jun 19 '22 at 21:13
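orangeskid's caution about the naive perturbation $A + \epsilon I$ is easy to check numerically. The following is a small sketch (the test matrix and the use of numpy are our own choices, not from the thread): for $A = \operatorname{diag}(-1, 0, 0)$ we have $\det(A + \epsilon I) = (-1+\epsilon)\,\epsilon^2$, which is negative for every $0 < \epsilon < 1$.

```python
import numpy as np

# A = diag(-1, 0, 0) is singular, but det(A + eps*I) = (-1 + eps) * eps^2,
# which is negative for every 0 < eps < 1 -- so A + eps*I alone does not
# produce a positive determinant here.
A = np.diag([-1.0, 0.0, 0.0])
for eps in [0.5, 0.1, 0.01]:
    d = float(np.linalg.det(A + eps * np.eye(3)))
    assert d < 0
    print(f"eps = {eps}: det(A + eps*I) = {d:.6f}")
```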

1 Answer


Let $A = QR$ be the QR decomposition of the given matrix, where $Q$ is orthogonal (so that $\det Q = \pm 1$) and $R$ is upper triangular.

Since $\det A = 0$, at least one diagonal element of $R$ is zero. By replacing each zero diagonal element with $\pm \epsilon$, choosing the signs appropriately, we can make the determinant of the resulting matrix $R_\epsilon$ positive (if $\det Q = 1$) or negative (if $\det Q = -1$) for every $\epsilon > 0$.

Then $A_\epsilon = Q R_\epsilon$ satisfies $\det(A_\epsilon) = \det Q \cdot \det R_\epsilon > 0$, and the entries of $A_\epsilon$ depend continuously on $\epsilon$, with $A_\epsilon \to A$ as $\epsilon \to 0$.
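The construction above can be sketched in a few lines of numpy. This is a sketch under our own assumptions, not the answerer's code: the function name, the tolerance used to detect zero pivots, and the test matrix are all our choices.

```python
import numpy as np

def perturb_positive_det(A, eps, tol=1e-12):
    """Sketch of the QR construction: replace the zero diagonal entries of R
    by +/- eps so that det(Q @ R_eps) > 0. Assumes det(A) = 0 (numerically)."""
    Q, R = np.linalg.qr(A)
    R_eps = R.copy()
    diag = np.diag(R_eps).copy()
    zero_idx = np.flatnonzero(np.abs(diag) < tol)  # zero pivots of R
    diag[zero_idx] = eps                           # first try +eps everywhere
    np.fill_diagonal(R_eps, diag)
    # det(Q @ R_eps) = det(Q) * prod(diag); if the sign came out wrong,
    # flipping a single replaced pivot to -eps flips the sign back.
    if np.linalg.det(Q) * np.prod(diag) < 0:
        R_eps[zero_idx[0], zero_idx[0]] = -eps
    return Q @ R_eps

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])      # rank 1, so det(A) = 0
A_eps = perturb_positive_det(A, 0.01)
assert np.linalg.det(A_eps) > 0
assert np.abs(A_eps - A).max() <= 0.01 + 1e-9  # entrywise perturbation <= eps
```

Since the entrywise perturbation is bounded by the chosen `eps`, rescaling `eps` downward gives the exact $|a'_{ij} - a_{ij}| < \epsilon$ bound asked for in the question.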

If $A$ is also symmetric, one can instead work with its eigenvalue decomposition $A = Q D Q^T$, where $Q$ is orthogonal and $D$ is the diagonal matrix of the eigenvalues of $A$. Similarly to the above, this gives a symmetric matrix $A_\epsilon = Q D_\epsilon Q^T$ with $\det(A_\epsilon) > 0$ for $\epsilon > 0$ (here $\det A_\epsilon = \det D_\epsilon$, since $\det Q \cdot \det Q^T = 1$).
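The symmetric variant admits the same kind of sketch via `numpy.linalg.eigh` (again, the function name, tolerance, and test matrix are our own assumptions, not part of the answer):

```python
import numpy as np

def perturb_symmetric(A, eps, tol=1e-12):
    """Symmetric variant: A = Q D Q^T; replace zero eigenvalues by +/- eps.
    Assumes A is symmetric with det(A) = 0 (numerically)."""
    w, Q = np.linalg.eigh(A)       # real eigenvalues, orthogonal Q
    w = w.copy()
    zero_idx = np.flatnonzero(np.abs(w) < tol)
    w[zero_idx] = eps              # first try +eps for every zero eigenvalue
    if np.prod(w) < 0:             # det(Q D_eps Q^T) = prod(w)
        w[zero_idx[0]] = -eps      # flip one sign to make the product positive
    return Q @ np.diag(w) @ Q.T

A = np.array([[1.0, 1.0],
              [1.0, 1.0]])        # symmetric, eigenvalues 0 and 2
A_eps = perturb_symmetric(A, 0.01)
assert np.linalg.det(A_eps) > 0
assert np.allclose(A_eps, A_eps.T)  # the perturbed matrix stays symmetric
```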

Martin R