
Letting $$A=\left[\begin{array}{rrrr}-1&1&1&1\\1&-1&1&1\\1&1&-1&1\\1&1&1&-1\end{array}\right],$$ find the SVD, i.e., $A=U\Sigma V^T$ with $U^TU=UU^T=I$ and $V^TV=VV^T=I$, where $I$ is the identity matrix.

Then, since $$A^TA = AA^T=4I,$$ I found that the eigenvalues of $A^TA$ and $AA^T$ are all $4$, so the singular values are all $\sqrt{4}=2$ and $$\Sigma=\begin{bmatrix}2&0&0&0 \\ 0&2&0&0 \\ 0&0&2&0 \\ 0&0&0&2\end{bmatrix}.$$
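A quick numerical check confirms this (a NumPy sketch, using only the matrix defined above):

```python
import numpy as np

# The matrix A from the problem statement
A = np.array([[-1, 1, 1, 1],
              [1, -1, 1, 1],
              [1, 1, -1, 1],
              [1, 1, 1, -1]], dtype=float)

print(np.allclose(A.T @ A, 4 * np.eye(4)))  # True: A^T A = 4I
print(np.linalg.svd(A, compute_uv=False))   # [2. 2. 2. 2.], so Sigma = 2I
```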

The textbook says that $U$ and $V$ are the eigenvector matrices of $AA^T$ and $A^TA$, respectively.

I tried to find $U$ and $V$, but $A^TA-\lambda I = 0$ when $\lambda=4$; that is, every nonzero vector is an eigenvector.

How can I find $U$ and $V$ in this case?


Actually, without any systematic method, I found $U$ and $V$ by inspecting the matrix product: $$U=I \quad\text{and}\quad V=\frac12 A.$$
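This guess is easy to verify numerically (a small NumPy sketch; since $A$ is symmetric, $U\Sigma V^T = 2I\cdot\tfrac12 A^T = A$):

```python
import numpy as np

A = np.array([[-1, 1, 1, 1],
              [1, -1, 1, 1],
              [1, 1, -1, 1],
              [1, 1, 1, -1]], dtype=float)

U, Sigma, V = np.eye(4), 2 * np.eye(4), A / 2

print(np.allclose(V.T @ V, np.eye(4)))  # True: V is orthogonal, since A^T A = 4I
print(np.allclose(U @ Sigma @ V.T, A))  # True: U Sigma V^T = A
```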

However, I want to know a systematic way to find $U$ and $V$ in such a case.

Danny_Kim
  • When you've got repeated eigenvalues you end up with some freedom to select the eigenvectors that you use. However, you have to select your U and V columns to satisfy the relationship between U and V through A. – Brian Borchers Nov 27 '17 at 05:06

2 Answers


Since $A^TA = 4I$, you can pick any orthogonal matrix $V$; then, since $2UV^T = A$, we must have $U = {1 \over 2} AV$.

In particular, $V=I, U= {1 \over 2}A$ will do.
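A sketch of this recipe in NumPy (generating the orthogonal $V$ from a QR factorization of a random matrix is my choice here; any orthogonal $V$ works):

```python
import numpy as np

A = np.array([[-1, 1, 1, 1],
              [1, -1, 1, 1],
              [1, 1, -1, 1],
              [1, 1, 1, -1]], dtype=float)

rng = np.random.default_rng(0)
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # an arbitrary orthogonal V
U = A @ V / 2                                     # forced by 2 U V^T = A

print(np.allclose(U.T @ U, np.eye(4)))            # True: U is orthogonal
print(np.allclose(U @ (2 * np.eye(4)) @ V.T, A))  # True: A = U Sigma V^T
```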

copper.hat

Clearly, the choice of $U$ and $V$ is not unique: we can permute the $\sigma_{i}$'s on the diagonal of $\Sigma$ and apply the corresponding permutation to the columns of $U$ and $V$, or we can multiply some subset of the columns of $U$ by $-1$ and do the same to the corresponding columns of $V$. More interesting nonuniqueness arises when $AA^{T}$ and $A^{T}A$ have repeated eigenvalues.

However, when the matrix $A$ is Hermitian (and this extends to the case when $A$ is normal), I like to find the SVD in the following way: we know that $A$ has an eigendecomposition (which may also be nonunique) $A=UDU^{T}$ with $UU^{T}=U^{T}U=I$, since $A$ is real symmetric and therefore has real eigenvalues and an orthogonal eigenvector matrix. Now we may factor $D$ as $\Sigma S=S\Sigma$, where $\Sigma$ has the absolute values of the eigenvalues on the diagonal and $S$ has the signs $\pm 1$ of the eigenvalues on the diagonal. Then we have $A=U\Sigma (SU^{T})$, or $A=(US)\Sigma U^{T}$, either of which is a singular value decomposition. This makes the relationship between the eigenvectors/values and the singular vectors/values easy to see, which I find quite agreeable.
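For the matrix in the question, this procedure looks as follows (a NumPy sketch; `eigh` computes the symmetric eigendecomposition, and the variable names are mine):

```python
import numpy as np

A = np.array([[-1, 1, 1, 1],
              [1, -1, 1, 1],
              [1, 1, -1, 1],
              [1, 1, 1, -1]], dtype=float)

w, U = np.linalg.eigh(A)    # A = U diag(w) U^T with U orthogonal
Sigma = np.diag(np.abs(w))  # absolute values of the eigenvalues
S = np.diag(np.sign(w))     # signs of the eigenvalues, so diag(w) = Sigma S
V = U @ S                   # A = U Sigma S U^T = U Sigma (U S)^T = U Sigma V^T

print(w)                                # [-2. -2. -2.  2.]
print(np.allclose(U @ Sigma @ V.T, A))  # True: a valid SVD
```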

  • Thank you for showing me a way that I did not know before. I will practice your method using $S\Sigma$ or $\Sigma S$. – Danny_Kim Nov 27 '17 at 05:50