Let $S$ be a self-adjoint linear operator on an $n$-dimensional inner product space $V$. Then $$\sum_{i=1}^n\langle Su_i,u_i\rangle$$ is a constant for any orthonormal basis $\{u_1,\ldots,u_n\}$ of $V$. In fact, in terms of matrices, one easily sees that $$\sum_iu_i^\ast Su_i=\operatorname{tr}(\sum_iSu_iu_i^\ast)=\operatorname{tr}(SUU^\ast)=\operatorname{tr}(S).$$ However, is there any coordinate-free proof?
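As a numerical sanity check (not a proof), the claim is easy to verify with NumPy. The sketch below builds a random Hermitian $S$ and two random orthonormal bases (columns of unitary matrices from QR factorizations); the inner product is taken linear in the first slot, so $\langle Su, u\rangle = u^\ast S u$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A random self-adjoint (Hermitian) operator S on C^n.
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
S = A + A.conj().T

# Two random orthonormal bases: the columns of unitary matrices U and V.
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

def basis_sum(S, B):
    # sum_i <S b_i, b_i> = sum_i b_i^* S b_i, with b_i the columns of B
    return sum(B[:, i].conj() @ S @ B[:, i] for i in range(B.shape[1]))

# Up to floating-point error, basis_sum(S, U), basis_sum(S, V),
# and np.trace(S) all agree.
```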
You could use the coordinate-free definition – Ben Grossmann Jan 22 '20 at 15:47
1 Answer
One nice approach is as follows.
Claim: If $S$ has rank $1$, then the sum will come out to $\operatorname{tr}(S)$ for any choice of orthonormal basis.
Proof: Note that any rank-$1$ operator can be written in the form $S(x) = \langle x,v \rangle w$ for some vectors $v,w$ (in particular: $w$ spans the image of $S$, and $v$ spans the image of $S^*$). With that, we note that for any orthonormal basis $u_i$, we have $$ \sum_{i=1}^n \langle Su_i,u_i \rangle = \sum_{i=1}^n \langle \langle u_i,v \rangle w,u_i \rangle = \sum_{i=1}^n \langle u_i,v \rangle \cdot \langle w,u_i \rangle. $$ We recognize the above as an inner product. That is, we note that $$ \langle w,v \rangle = \langle \sum_{i=1}^n \langle w,u_i\rangle u_i , \sum_{j=1}^n \langle v,u_j\rangle u_j\rangle = \sum_{i,j = 1}^n \langle u_j,v \rangle \cdot \langle w,u_i \rangle \cdot \langle u_i,u_j\rangle = \sum_{i=1}^n \langle u_i,v \rangle \cdot \langle w,u_i \rangle. $$ Because the sum is equal to $\langle w,v \rangle$ (a quantity intrinsic to $S$), we see that it does not depend on which orthonormal basis is chosen. We see that this sum is the trace of $S$ by plugging in the standard basis $e_1,\dots,e_n$. $\qquad \square$
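The rank-$1$ claim can also be checked numerically. A small NumPy sketch (with random $v$, $w$ and a random orthonormal basis; `np.vdot` conjugates its first argument, so $\langle x, v\rangle$ is `np.vdot(v, x)` in the convention linear in the first slot):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# The rank-1 operator S(x) = <x, v> w.
def S(x):
    return np.vdot(v, x) * w

# A random orthonormal basis: the columns of a unitary Q.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))

# sum_i <S u_i, u_i>; per the claim this equals <w, v> = np.vdot(v, w),
# independent of the chosen basis.
total = sum(np.vdot(Q[:, i], S(Q[:, i])) for i in range(n))
```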
From there, it suffices to note that the trace is a linear function and that every $S$ can be written as a sum of rank-$1$ operators. It is clear that the trace (as defined by your sum) is a linear function. To see that every operator is a sum of rank-$1$ operators, we could use the singular-value decomposition or the polar decomposition.
If we don't want to use the existence of such decompositions, it suffices to note that for some orthonormal $u_1,\dots,u_n$ every operator $S$ can be written in the form $$ S(x) = \sum_{i,j = 1}^n \alpha_{ij}\langle x,u_i \rangle u_j $$ where $\alpha_{ij} = \langle Su_i,u_j\rangle$.
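To see that this elementary decomposition really does reproduce $S$, here is a short NumPy check (random $S$ and a random orthonormal basis $u_1,\dots,u_n$; variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Any operator S (not necessarily self-adjoint) and an orthonormal basis.
S = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
u = [U[:, i] for i in range(n)]

# alpha_ij = <S u_i, u_j>, with the inner product linear in the first slot.
alpha = np.array([[np.vdot(u[j], S @ u[i]) for j in range(n)] for i in range(n)])

def S_reconstructed(x):
    # S(x) = sum_ij alpha_ij <x, u_i> u_j -- a sum of n^2 rank-1 operators
    return sum(alpha[i, j] * np.vdot(u[i], x) * u[j]
               for i in range(n) for j in range(n))

# Applying both to a random vector x gives the same result.
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
```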
