
Is there a simple way to show that a positive definite matrix must be Hermitian?

I feel there is a long, drawn-out proof to be had by taking unit vectors, applying the positive-definiteness property, and brute-forcing it.

But is there some simple clever proof why a positive definite matrix is necessarily Hermitian?

pad
    "Positive definite" is not a word that should be applied to matrices in the first place (it should be applied to sesquilinear forms). To the extent that it applies to matrices, it should only apply to Hermitian ones. – Qiaochu Yuan Dec 29 '12 at 21:56
  • One usually considers only Hermitian positive definite matrices. Indeed, in most cases this property is included in the definition. I think this is due to the fact that only for Hermitian matrices is positive definiteness equivalent to having only positive eigenvalues. – Giuseppe Negro Dec 29 '12 at 21:56
  • I know the question is phrased poorly, but I did look at it with forms and couldn't see any simple connection either. If someone could explain it with forms, that would be great too... – pad Dec 29 '12 at 22:18
  • @pad I am not sure if there is any cleverer way out. I guess any proof will need to show that if $z^* B z = 0$ holds for all $z \in \mathbb{C}^n$, then $B$ has to be zero. Using unit vectors (or other special vectors) is, I believe, the easy and right approach. –  Dec 29 '12 at 22:26
  • It's not immediately obvious to me why positive definite implies a real spectrum. – pad Dec 29 '12 at 22:49
  • @Jose: this is false. For example, any nilpotent operator has real spectrum, but nonzero nilpotent operators are never self-adjoint. The correct statement is that a normal operator is self-adjoint if and only if its spectrum is real. – Qiaochu Yuan Dec 29 '12 at 22:54
  • http://math.stackexchange.com/q/57350. There is another approach to showing that if $\langle x,Ax\rangle=0$ for all $x$, then $A=0$, with the possible advantage that it works in infinite dimensions. – Jonas Meyer Dec 30 '12 at 00:34
  • https://math.stackexchange.com/questions/561636/show-that-a-positive-operator-on-a-complex-hilbert-space-is-self-adjoint – YouJiacheng Nov 28 '23 at 08:30

2 Answers


You don't even need positive definiteness (or semi-definiteness, negative definiteness, etc., for that matter).

The only thing you need is that $x^*Ax \in \mathbb{R}$ for all $x \in \mathbb{C}^n$, which is of course true whenever we compare this value with $0$ in defining such matrices.

So suppose $x^*Ax \in \mathbb{R}$ for all $x \in \mathbb{C}^n$ (and not merely for all $x \in \mathbb{R}^n$).

I will show that $A$ is Hermitian.

$x^*Ax= \langle Ax,x\rangle=\langle x,A^*x\rangle = \overline{\langle A^*x,x\rangle} = \langle A^*x,x \rangle$ (the last equality holds because $x^*Ax$ is real by hypothesis), hence $\langle (A-A^*)x,x\rangle = 0$ for all $x\in \mathbb{C}^n$.

Claim: if $\langle Bx,x\rangle = 0$ for all $x\in \mathbb{C}^n$, then $B=0$.

Proof: For arbitrary $x, y\in \mathbb{C}^n$ and $k \in \mathbb{C}$, expanding and applying the hypothesis to $x$, $y$, and $x+ky$ gives $0 = \langle B(x+ky), x+ky\rangle = \bar{k}\langle Bx,y \rangle + k\langle By,x \rangle$.

Now set $k=1$ and $k=i$ to get two equations; solving them gives $\langle Bx,y\rangle =0$ for all $x, y\in \mathbb{C}^n$, hence $Bx= 0$ for all $x\in \mathbb{C}^n$, i.e. $B=0$.
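The two equations from $k=1$ and $k=i$ amount to a polarization identity: the sesquilinear form $\langle Bx, y\rangle$ can be recovered from the quadratic form $v \mapsto \langle Bv, v\rangle$ alone. A quick numerical sketch of this in NumPy (the matrix and vectors are arbitrary random choices, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# An arbitrary complex matrix B (not assumed Hermitian).
B = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))

def q(v):
    # Quadratic form <Bv, v> = v* B v (inner product linear in the first slot).
    return v.conj() @ B @ v

x = rng.normal(size=n) + 1j * rng.normal(size=n)
y = rng.normal(size=n) + 1j * rng.normal(size=n)

def f(k):
    # Expanding <B(x+ky), x+ky> and cancelling the diagonal terms leaves
    # conj(k) * <Bx, y> + k * <By, x>.
    return q(x + k * y) - q(x) - abs(k) ** 2 * q(y)

# Solve the k = 1 and k = i equations for <Bx, y>:
#   f(1) =   <Bx,y> +  <By,x>
#   f(i) = -i<Bx,y> + i<By,x>   =>   <Bx,y> = (f(1) + i*f(i)) / 2
bxy = (f(1) + 1j * f(1j)) / 2
assert np.isclose(bxy, y.conj() @ B @ x)  # matches <Bx, y> = y* B x

# In particular, if q(v) = 0 for every v, then f(k) = 0 for all k,
# so <Bx, y> = 0 for all x, y, forcing B = 0: exactly the claim above.
```

If the quadratic form vanishes identically, every value `f(k)` is zero, and the recovered form $\langle Bx, y\rangle$ vanishes for all pairs, which is the content of the claim.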

chesslad

As Marvis says in the comments, the problem reduces to showing that if $V$ is a finite-dimensional complex inner product space and $A : V \to V$ is an operator such that $\langle v, Av \rangle = 0$ for all $v$, then $A = 0$. Letting $v$ run across all eigenvectors of $A$, the hypothesis implies that $A$ has all eigenvalues $0$, hence is nilpotent. Suppose for contradiction that $A \neq 0$; then there exists a vector $v_0$ such that $v_1 = A v_0$ is nonzero. We may assume WLOG that $A v_1 = 0$ (replace $v_0$ by $A^{j-1} v_0$, where $j$ is the largest exponent with $A^j v_0 \neq 0$; such a $j$ exists because $A$ is nilpotent), hence

$$\langle v_0 + v_1, A(v_0 + v_1) \rangle = \langle v_1, v_1 \rangle \neq 0$$

which is a contradiction. Hence $A = 0$.

Note that the corresponding assertion for real inner product spaces is false; the condition $\langle v, Av \rangle = 0$ is satisfied by all skew-symmetric matrices.

Qiaochu Yuan