15

Let $A$ be a positive definite matrix, and let $A = U \Sigma V^*$ be its singular value decomposition (SVD). Show that $U=V$.

What I have done: $A$ is Hermitian, so $A$ is unitarily diagonalizable, say $A=WDW^*$, where $D$ consists of the eigenvalues (in decreasing order). Also $D=\Sigma$, since $A$ is positive definite. From $A^2=AA^*=U\Sigma^2U^*$ and $A^2=A^*A=V\Sigma^2V^*$, I have $A^2=UD^2U^*=VD^2V^*=WD^2W^*$, so the column vectors of $U$, $V$, $W$ correspond to the same eigenvalues of $A^2$. And now I'm stuck. How can I proceed?
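As a quick numerical sanity check that the claim is true (a sketch using numpy, not a proof; it assumes the singular values come out distinct, so the SVD factors are essentially unique):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random symmetric positive definite matrix; the shift n*I keeps the
# eigenvalues positive and (almost surely) distinct.
n = 4
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)

U, s, Vh = np.linalg.svd(A)                 # A = U @ diag(s) @ Vh

print(np.allclose(U, Vh.conj().T))          # True: U and V coincide
print(np.allclose(A, U @ np.diag(s) @ Vh))  # True: valid SVD
```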

Gobi
  • 7,458

2 Answers

11

As you already figured out, $A = U D V^*$ gives $A^2 = UD^2 U^* = V D^2 V^*$. Note that $UDU^*$ is positive Hermitian and $(UDU^*)^2 = UD^2U^* = A^2$. Now, since a positive Hermitian matrix has only one positive Hermitian square root, $A = UDU^* = UDV^*$, and since $U$ and $D$ are invertible, $U^* = V^*$.

Now, we need to show that positive Hermitian matrices have only one positive Hermitian square root. Suppose that $A$ is positive Hermitian, and consider the eigenspaces of $A$. Suppose that $Av = \lambda v$. Then $A^2 v = \lambda^2 v$. Since the map $\lambda \mapsto \lambda^2$ is one-to-one on the positive reals, this shows that $A$ and $A^2$ have exactly the same eigenspaces, with an eigenvalue $\lambda$ of $A$ corresponding to the eigenvalue $\lambda^2$ of $A^2$. This is enough to uniquely characterize $A$, given $A^2$. (We are implicitly using the fact that the vector space is a direct sum of the eigenspaces of $A$; this is true because $A$ is Hermitian.)
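To see this uniqueness in action, here is a small numpy sketch (illustration only): build the positive square root of $A^2$ exactly as described, by inverting $\lambda \mapsto \lambda^2$ on each eigenspace, and check that it recovers $A$.

```python
import numpy as np

rng = np.random.default_rng(1)

# A random positive definite (real symmetric) matrix A.
n = 4
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)

# Diagonalize A^2 and take the square root of each eigenvalue,
# which is exactly the eigenspace argument above.
w, Q = np.linalg.eigh(A @ A)
root = Q @ np.diag(np.sqrt(w)) @ Q.T

print(np.allclose(root, A))   # True: the positive root of A^2 is A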

Peter Shor
  • 3,314
  • Your answer is much better than mine. +1 – user1551 Jun 05 '13 at 21:47
  • If $A$ has a basis consisting of eigenvectors $v_i$, then $A^2$ has the same eigenvectors as a basis. But conversely, if $A^2v_i=\lambda_i^2 v_i$, how can I show that $v_i$ is an eigenvector of $A$? I think you showed only one direction. – Gobi Jun 06 '13 at 00:28
  • 2
    Don't think about eigenvectors; think about eigenspaces. If $A$ has eigenspaces $S_1$, $S_2$, $\ldots$, $S_k$, with eigenvalues $\lambda_1$, $\lambda_2$, $\ldots$, $\lambda_k$, then $A^2$ has the same eigenspaces with eigenvalues $\lambda_1^2$, $\lambda_2^2$, $\ldots$, $\lambda_k^2$. Now, if you have two positive Hermitian matrices $A$ and $B$ with $A^2 =B^2$, they must both be diagonalizable, and have the same eigenspaces with the same eigenvalues. This means they are equal. They have the same eigenspaces because any eigenspace of $A$ (or $B$) is also an eigenspace of $A^2$. – Peter Shor Jun 06 '13 at 01:09
  • Thanks for the great answers. This also helped me: http://math.stackexchange.com/questions/349721/square-root-of-positive-definite-matrix – Gobi Jun 06 '13 at 01:41
  • 2
    If you don't like eigenspaces, here's another proof. We know that $A^2 v = \lambda^2 v$. Now, consider the vector $w = Av - \lambda v$. We have $A w = A^2 v - \lambda A v = \lambda^2 v - \lambda A v = -\lambda w$. Since $A$ has only positive eigenvalues, we know $w = 0$. – Peter Shor Jun 06 '13 at 02:43
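A tiny numpy check of the argument in this last comment (a sketch, with a random positive definite $A$):

```python
import numpy as np

rng = np.random.default_rng(2)

# Random positive definite A and an eigenpair (lam^2, v) of A^2.
n = 4
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)

lam2, V = np.linalg.eigh(A @ A)
v, lam = V[:, 0], np.sqrt(lam2[0])

# w = Av - lam*v satisfies Aw = -lam*w; since A has only positive
# eigenvalues, w must vanish, i.e. v is an eigenvector of A itself.
w = A @ v - lam * v
print(np.allclose(w, 0))   # True
```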
3

I'm not sure how you could proceed from where you stopped, but here is another route. Without loss of generality, suppose $\Sigma=(\sigma_1 I_{r_1})\oplus\cdots\oplus(\sigma_k I_{r_k})$, where $\sigma_1,\ldots,\sigma_k$ are distinct. If $A=U\Sigma V^\ast$ is positive definite, then so is $U^\ast AU=\Sigma W$, where $W=V^\ast U$. In particular, $\Sigma W$ is Hermitian and $$ W^\ast\Sigma^2W = (\Sigma W)^\ast(\Sigma W) = (\Sigma W)(\Sigma W)^\ast = \Sigma^2. $$ Hence $\Sigma^2 W = W\Sigma^2$, and since the $\sigma_j$ are distinct, $W$ must be a block diagonal matrix $W_1\oplus\cdots\oplus W_k$ such that $W_j$ and $I_{r_j}$ have the same size. Since $U=VW$, $$ A=U\Sigma V^\ast = V\bigl((\sigma_1 W_1)\oplus\cdots\oplus(\sigma_k W_k)\bigr)V^\ast.\tag{1} $$ Since $A$ is positive definite, its eigenvalues coincide with its singular values (and hence they are positive). Now you may argue from $(1)$ that $W=I$ and hence $U=V$.
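A numerical illustration of the conclusion $W=V^\ast U=I$ (a sketch; the eigenvalues of the random matrix below are almost surely distinct, so every block $W_j$ is $1\times 1$):

```python
import numpy as np

rng = np.random.default_rng(3)

# Random symmetric positive definite A.
n = 4
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)

U, s, Vh = np.linalg.svd(A)

W = Vh @ U                          # W = V* U
print(np.allclose(W, np.eye(n)))    # True: W = I, hence U = V
```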

user1551
  • 139,064
  • How did you know that $W$ must be a block diagonal matrix? I found it by some calculations with explicit matrices, but is there an easier way to see it? – Gobi Jun 05 '13 at 13:30
  • How can I conclude $W=I$ from $(1)$? It is neither an eigenvalue decomposition nor an SVD, so I don't know what to do. – Gobi Jun 05 '13 at 13:32
  • @Gobi 1. If $a\ne b$ and $\pmatrix{a&0\\0&b}\pmatrix{x&y\\z&w}=\pmatrix{x&y\\z&w}\pmatrix{a&0\\0&b}$, can you show that $y=z=0$? The block matrix case is similar. 2. With $W$ being unitary, what are the eigenvalues of $W_j$? – user1551 Jun 05 '13 at 13:50
  • Since $W$ is unitary, its eigenvalues have absolute value $1$. Let $\lambda$ be an eigenvalue of $W$ and $x$ a corresponding eigenvector; writing $x=\pmatrix{x_1\\ \vdots\\ x_k}$, some $x_j$ is nonzero, so it is an eigenvector of $W_j$ with eigenvalue $\lambda$. So the eigenvalues of $W$ are exactly the eigenvalues of $W_1,\ldots,W_k$. But what should I do now? – Gobi Jun 05 '13 at 14:43
  • @Gobi The two sides of equation $(1)$ give a similarity relation. So the eigenvalues of $A$ are the eigenvalues of $\sigma_1W_1,\ldots,\sigma_k W_k$. – user1551 Jun 05 '13 at 15:12