Let $x$ and $y$ denote two length-$n$ column vectors. Prove that $$\det(I + xy^T ) = 1 + x^Ty$$
Is Sylvester's determinant theorem an extension of the problem? Is the approach the same?
Hint: Decomposing $$ \begin{pmatrix} 1 & -y^T\\ x & I \end{pmatrix} $$ as lower $\cdot$ upper and upper $\cdot$ lower gives $$ \begin{pmatrix} 1 & 0\\ x & I\end{pmatrix} \cdot \begin{pmatrix} 1 & -y^T\\ 0 & I + xy^T \end{pmatrix} = \begin{pmatrix} 1 + x^Ty & -y^T\\ 0 & I\end{pmatrix} \cdot \begin{pmatrix} 1 & 0\\ x & I \end{pmatrix}. $$
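Taking determinants on both sides of the decomposition gives $\det(I+xy^T)$ on the left and $1+x^Ty$ on the right. Here is a quick numerical sanity check of the resulting identity (a sketch using NumPy; the random vectors are illustrative test data, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
x = rng.standard_normal((n, 1))  # column vector
y = rng.standard_normal((n, 1))  # column vector

# det(I + x y^T) should equal 1 + x^T y
lhs = np.linalg.det(np.eye(n) + x @ y.T)
rhs = 1 + float(x.T @ y)
assert np.isclose(lhs, rhs)
```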
You can also apply the property
$$\det\begin{pmatrix} A & B\\ C & D \end{pmatrix}=\det(A) \det(D-C A^{-1}B)=\det(D) \det(A-B D^{-1}C)$$
(valid when the inverses exist) to the matrix $$\begin{pmatrix} I & -y\\ x^T & 1 \end{pmatrix}$$
(BTW: here I'm assuming column vectors; the question speaks of "column vectors", but its notation would correspond to row vectors)
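This route can also be checked numerically (a sketch with NumPy; the vectors are arbitrary test data): the block determinant agrees with both Schur-complement expansions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
x = rng.standard_normal((n, 1))
y = rng.standard_normal((n, 1))

# The block matrix [[I, -y], [x^T, 1]]
M = np.block([[np.eye(n), -y], [x.T, np.ones((1, 1))]])
det_M = np.linalg.det(M)

# det(A) det(D - C A^{-1} B), with A = I and D = 1:
via_A = 1.0 * (1.0 + float(x.T @ y))
# det(D) det(A - B D^{-1} C), with D = 1:
via_D = np.linalg.det(np.eye(n) + y @ x.T)

assert np.isclose(det_M, via_A)
assert np.isclose(det_M, via_D)
```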
An argument without rigorous proof is:
(1) Note that $xy^T$ is a matrix of rank at most $1$; its eigenvalues are $0$, with multiplicity $n-1$, and $x^Ty$, since $\operatorname{tr}(xy^T)=x^Ty$.
(2) The eigenvalues of $I+xy^T$ are ones and $1+x^Ty$.
(3) Since the determinant is the product of the eigenvalues,
$$\det(I+xy^T)=1+x^Ty.$$
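The eigenvalue counting above can be verified numerically (a sketch; the random vectors are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
x = rng.standard_normal(n)
y = rng.standard_normal(n)

eig = np.linalg.eigvals(np.eye(n) + np.outer(x, y))
# n-1 eigenvalues equal to 1, one equal to 1 + x^T y
assert np.isclose(eig, 1.0, atol=1e-6).sum() >= n - 1
assert np.isclose(np.prod(eig).real, 1 + x @ y)
```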
Here is an approach by upper triangularization for the sake of variety.
Note that $xy^T\in M_n(K)$ has rank $\leq 1$ and $\operatorname{tr}(xy^T)=\sum_{j=1}^nx_jy_j=x^Ty$.
So $0$ is an eigenvalue of multiplicity $n-1$ if $xy^T$ has rank $1$, or $n$ if $xy^T=0$. In the latter case $x=0$ or $y=0$ whence $\det(I+xy^T)=\det I=1=1+x^Ty$. Now if $xy^T$ has rank $1$, take any vector not in $\ker xy^T$ and add it to a basis of $\ker xy^T$ to get a basis of $K^n$. Then, taking the trace to determine the lower-right coefficient, we see that $xy^T$ is similar to $$ S(xy^T)S^{-1}=\pmatrix{0&*\\0&x^Ty}\quad \Rightarrow\quad S(I+xy^T)S^{-1}=\pmatrix{I&*\\0&1+x^Ty} $$ The result follows immediately.
Note: every matrix of rank $1$ is of the form $xy^T$ with $x\neq 0$ and $y\neq 0$. With the approach above, we see that a square rank $1$ matrix is diagonalizable if and only if its trace is nonzero.
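The closing remark can be illustrated concretely (a sketch; the vectors are a hand-picked example): a rank-$1$ matrix with zero trace is nilpotent, hence not diagonalizable.

```python
import numpy as np

# Rank-1 matrix with zero trace: take x perpendicular to y, e.g. x = e_1, y = e_2.
x = np.array([1.0, 0.0, 0.0])
y = np.array([0.0, 1.0, 0.0])
A = np.outer(x, y)

assert np.linalg.matrix_rank(A) == 1
assert np.isclose(np.trace(A), 0.0)
assert np.allclose(A @ A, 0.0)   # nilpotent: the only eigenvalue is 0
assert not np.allclose(A, 0.0)   # yet A != 0, so A cannot be diagonalizable
```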
In this answer, it is proven that when $A$ is an $n\times m$ matrix and $B$ is an $m\times n$ matrix, where $n\ge m$, $$ \det(\lambda I_n-AB)=\lambda^{n-m}\det(\lambda I_m-BA) $$ For this problem, set $\lambda=1$, $A=-x$ and $B=y^T$.
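A numerical check of this identity and of the suggested specialization (a sketch; random test matrices, not a proof):

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 5, 3
A = rng.standard_normal((n, m))
B = rng.standard_normal((m, n))
lam = 2.0

# det(lam I_n - A B) = lam^{n-m} det(lam I_m - B A)
lhs = np.linalg.det(lam * np.eye(n) - A @ B)
rhs = lam ** (n - m) * np.linalg.det(lam * np.eye(m) - B @ A)
assert np.isclose(lhs, rhs)

# Specialization: lambda = 1, A = -x, B = y^T
x = rng.standard_normal((n, 1))
y = rng.standard_normal((n, 1))
assert np.isclose(np.linalg.det(np.eye(n) + x @ y.T), 1 + float(x.T @ y))
```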
Here is another answer. It is slightly longer, but I find it more concrete. Let $\mathbf{x}=(x_1,\ldots,x_n)^T,\ \mathbf{y}=(y_1,\ldots,y_n)^T,\ \mathbf{u}=(x_2,\ldots,x_n)^T$ and $\mathbf{v}=(y_2,\ldots,y_n)^T$. Splitting the first row of $I+\mathbf{x}\mathbf{y}^T$ as $(1,\mathbf{0}^T)+(x_1y_1,\,x_1\mathbf{v}^T)$ and using linearity of the determinant in the first row, we get \begin{align*} \det(I+\mathbf{x}\mathbf{y}^T) =\det\pmatrix{1&0\\ \mathbf{u}y_1&I_{n-1}+\mathbf{u}\mathbf{v}^T} +\det\pmatrix{x_1y_1&x_1\mathbf{v}^T\\ \mathbf{u}y_1&I_{n-1}+\mathbf{u}\mathbf{v}^T}. \end{align*} The second determinant on the RHS is zero if $x_1=0$. If $x_1$ is nonzero, subtract $x_i/x_1$ times the first row from the $i$-th row for every $i\ge2$; the second determinant is then equal to \begin{align*} \det\pmatrix{x_1y_1&x_1\mathbf{v}^T\\ 0&I_{n-1}} = x_1y_1, \end{align*} and by recursion, $\det(I+\mathbf{x}\mathbf{y}^T)=x_1y_1+x_2y_2+\ldots+x_{n-1}y_{n-1}+(1+x_ny_n)=1+\mathbf{x}^T\mathbf{y}$.
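One step of this recursion can be checked numerically (a sketch with random vectors):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5
x = rng.standard_normal(n)
y = rng.standard_normal(n)

# One recursion step: det(I_n + x y^T) = x_1 y_1 + det(I_{n-1} + u v^T),
# where u, v drop the first entries of x, y.
full = np.linalg.det(np.eye(n) + np.outer(x, y))
tail = np.linalg.det(np.eye(n - 1) + np.outer(x[1:], y[1:]))
assert np.isclose(full, x[0] * y[0] + tail)
```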
Let $A=xy^T$. This matrix has rank 1 thus $\lambda=0$ is an eigenvalue of multiplicity at least $n-1$. [one can prove this for example by observing that if $y \perp z$ then $Az=0$].
Assume $y \neq 0$ (otherwise $A=0$ and the claim is immediate), and let $w$ be the orthogonal projection of $x$ onto $y$, that is, $w = \frac{x^Ty}{y^Ty}\,y$.
If $w = 0$, then $x^Ty = 0$, so $A^2 = x\,(y^Tx)\,y^T = 0$. Hence $A$ is nilpotent, all of its eigenvalues are $0$, and $\det(I+A) = 1 = 1 + x^Ty$.
Assume now that $w \neq 0$, i.e. $x^Ty \neq 0$. Then
$$Ax = x\,(y^Tx) = (x^Ty)\,x\,,$$
so $x$ is an eigenvector of $A$ and $x^Ty$ is the remaining eigenvalue. Since $\det(I+A)$ is the product of $1+\lambda$ over the eigenvalues $\lambda$ of $A$ (counted with multiplicity), the result follows immediately.
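The eigenvalue structure used in this argument can be checked numerically (a sketch; random test vectors): $n-1$ eigenvalues of $A=xy^T$ vanish and the remaining one equals the trace $x^Ty$.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4
x = rng.standard_normal(n)
y = rng.standard_normal(n)
A = np.outer(x, y)

eig = np.linalg.eigvals(A)
# n-1 eigenvalues are 0; the remaining one equals the trace x^T y
assert np.isclose(eig, 0.0, atol=1e-8).sum() >= n - 1
assert np.isclose(eig.sum().real, x @ y)
```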