
How can I find the eigenvalues of the sum of rank-one matrices $vv^T + ww^T$?

I know that the nonzero eigenvalues of the individual terms are $v^Tv$ and $w^Tw$, respectively.

Mark

3 Answers


Let $A=vv^T+ww^T$, then $A$ has at most rank $2$. So we want to know the potentially non-zero eigenvalues.

We have $\operatorname{Tr}(A)=|v|^2+|w|^2$ and $\operatorname{Tr}(A^2)=|v|^4+|w|^4+2\langle v,w\rangle^2$, so these eigenvalues must satisfy $\lambda_1+\lambda_2=\operatorname{Tr}(A)=|v|^2+|w|^2$ and, since $2\lambda_1\lambda_2=(\lambda_1+\lambda_2)^2-(\lambda_1^2+\lambda_2^2)=\operatorname{Tr}(A)^2-\operatorname{Tr}(A^2)$, also $\lambda_1\lambda_2=|v|^2|w|^2-\langle v,w\rangle^2$.

So $\lambda_1$ and $\lambda_2$ are the roots of $X^2-(|v|^2+|w|^2)X+|v|^2|w|^2-\langle v,w\rangle^2$.

The discriminant is $$\Delta=(|v|^2+|w|^2)^2+4\langle v,w\rangle^2-4|v|^2|w|^2=(|v|^2-|w|^2)^2+4\langle v,w\rangle^2\geq 0,$$ with equality exactly when $|v|=|w|$ and $v$ and $w$ are orthogonal; in that case the quadratic has a single (repeated) root.

So $$\lambda_1=\frac{|v|^2+|w|^2-\sqrt{(|v|^2-|w|^2)^2+4\langle v,w\rangle^2}}2$$ and $$\lambda_2=\frac{|v|^2+|w|^2+\sqrt{(|v|^2-|w|^2)^2+4\langle v,w\rangle^2}}2,$$ the other eigenvalues being $0$.
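The closed form above is easy to check numerically. A minimal sketch with NumPy (the vectors are arbitrary random data, chosen only for illustration): build $A=vv^T+ww^T$, evaluate the two formulas, and compare against `numpy.linalg.eigvalsh`.

```python
import numpy as np

# Sanity check of the closed-form eigenvalues (illustrative example).
rng = np.random.default_rng(0)
n = 5
v = rng.standard_normal(n)
w = rng.standard_normal(n)

A = np.outer(v, v) + np.outer(w, w)

s = v @ v + w @ w                                        # |v|^2 + |w|^2
disc = np.sqrt((v @ v - w @ w) ** 2 + 4 * (v @ w) ** 2)  # sqrt(Delta)
lam1, lam2 = (s - disc) / 2, (s + disc) / 2

eig = np.sort(np.linalg.eigvalsh(A))  # ascending: n-2 zeros, then lam1, lam2
print(np.allclose(eig[-2:], [lam1, lam2]))  # True
print(np.allclose(eig[:-2], 0))             # True
```

Since $\lambda_1\lambda_2=|v|^2|w|^2-\langle v,w\rangle^2\geq 0$ by Cauchy–Schwarz, both roots are nonnegative, so the remaining $n-2$ zero eigenvalues are the smallest in the sorted spectrum.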

Davide Giraudo
  • can you explain better the implication $Tr(A^2)=|v|^4+|w|^4+2(v,w)^2 \Rightarrow \lambda_1\lambda_2=|v|^2|w|^2-(v,w)^2$?? I understand that $\lambda_1^2+\lambda_2^2=Tr(A^2)$... Is it a straightforward computation or am I forgetting some theorem? Thanks in advance S. –  Feb 13 '14 at 16:17
  • We can expand $A^2$ as $vv^Tvv^T+2vv^Tww^T+ww^Tww^T$ then compute the trace. Then $2\lambda_1\lambda_2=(\lambda_1+\lambda_2)^2-\lambda_1^2-\lambda_2^2$. – Davide Giraudo Feb 13 '14 at 19:23
  • This is excellent! – TenaliRaman Nov 14 '14 at 09:52
  • Nice answer. Any idea how to get leading eigenvector ? – dohmatob Oct 18 '19 at 11:11
  • @dohmatob I have not thought about that. Maybe you can ask it as a separate question. – Davide Giraudo Oct 19 '19 at 20:53
  • @DavideGiraudo It has been asked and answered here https://scicomp.stackexchange.com/a/33615/16807. – dohmatob Oct 19 '19 at 21:06

Let $u_1$ and $u_2$ be orthonormal vectors such that $v = a u_1$ and $w = b u_1 + c u_2$ for some scalars $a,b,c$. Thus in an orthonormal basis starting with $u_1, u_2$, the matrix of $v v^T + w w^T$ has first two rows and columns $\pmatrix{a^2 + b^2 & bc \cr bc & c^2\cr}$ and everything else $0$. The nonzero eigenvalues of your matrix are the eigenvalues of this.
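This reduction is also straightforward to check numerically. A sketch, assuming nothing beyond the answer itself: Gram–Schmidt gives $u_1,u_2$ and the scalars $a,b,c$, and the eigenvalues of the resulting $2\times 2$ block should match the two nonzero eigenvalues of $vv^T+ww^T$ (the example vectors are arbitrary).

```python
import numpy as np

# Build u1, u2 by Gram-Schmidt, form the 2x2 block, and compare
# its eigenvalues to the nonzero eigenvalues of vv^T + ww^T.
rng = np.random.default_rng(1)
v = rng.standard_normal(4)
w = rng.standard_normal(4)

a = np.linalg.norm(v)
u1 = v / a                 # v = a * u1
b = u1 @ w
r = w - b * u1             # component of w orthogonal to u1
c = np.linalg.norm(r)
u2 = r / c                 # w = b * u1 + c * u2

M = np.array([[a**2 + b**2, b * c],
              [b * c,       c**2]])

A = np.outer(v, v) + np.outer(w, w)
top_two = np.sort(np.linalg.eigvalsh(A))[-2:]
print(np.allclose(np.sort(np.linalg.eigvalsh(M)), top_two))  # True
```

If $v$ and $w$ are parallel, $c=0$ and the normalization of $u_2$ breaks down; any unit vector orthogonal to $u_1$ works in that degenerate case.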

Robert Israel
  • I should have mentioned, $v,w$ could have any $n>1$ elements. – Mark Feb 22 '12 at 22:17
  • It doesn't matter, Robert's answer still applies. As he says, the "first two rows and columns" are the matrix mentioned. What is not completely clear to me (at least without writing it!) is how easy it is to find $u_2$ and $b,c$ in general. – Martin Argerami Feb 22 '12 at 23:26
  • To find $u_2$, use Gram-Schmidt. As for $b$ and $c$, they are $u_1^T w$ and $u_2^T w$ respectively. – Robert Israel Feb 23 '12 at 04:19
  • Nice answer. How would one (analytically) get the corresponding eigenvectors in the standard basis ? Thanks in advance. – dohmatob Oct 19 '19 at 08:08

You cannot, in general, express the eigenvalues of $vv^T + ww^T$ in terms of the eigenvalues of the two summands alone.

For instance, given $t\in[0,1]$ let $$ v=\begin{bmatrix}1\\ 0\end{bmatrix}, \ \ w=\begin{bmatrix}t\\ \sqrt{1-t^2}\end{bmatrix}. $$ Then $vv^T$, $ww^T$ are rank-one projections, each with eigenvalues $\{0,1\} $ which "forget" about $t$. But $$ vv^T+ww^T=\begin{bmatrix}1+t^2&t\sqrt{1-t^2}\\ t\sqrt{1-t^2}&1-t^2\end{bmatrix}, $$ which has eigenvalues $1+t,1-t$.
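The example above can be verified in a few lines; this sketch just checks that the eigenvalues of $vv^T+ww^T$ are $1-t$ and $1+t$ for several values of $t$:

```python
import numpy as np

# Both summands are rank-one projections (eigenvalues {0, 1}) for every t,
# yet the eigenvalues of the sum depend on t.
for t in (0.0, 0.3, 0.7, 1.0):
    v = np.array([1.0, 0.0])
    w = np.array([t, np.sqrt(1 - t**2)])
    A = np.outer(v, v) + np.outer(w, w)
    eig = np.sort(np.linalg.eigvalsh(A))
    print(np.allclose(eig, [1 - t, 1 + t]))  # True for each t
```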

Martin Argerami