1

The problem is to show that for an $n \times n$ matrix $A$ with $\operatorname{rank} A = 1$, the determinant $|A + E| = 1 + \operatorname{tr} A$, where $E$ is the identity matrix. I feel like I've got the answer, but I've never been good at putting what I think into words.

Since $\operatorname{rank} A = 1$, the determinant of $A$ itself is zero:

$\begin{vmatrix} n_{11} & n_{12} \\ n_{21} & n_{22} \end{vmatrix} = n_{11}n_{22} - n_{12}n_{21} = 0$

$\begin{vmatrix} n_{11} + 1 & n_{12} \\ n_{21} & n_{22} + 1 \end{vmatrix} = n_{11}n_{22} + n_{11} + n_{22} + 1 - n_{12}n_{21} = \begin{vmatrix} n_{11} & n_{12} \\ n_{21} & n_{22} \end{vmatrix} + n_{11} + n_{22} + 1 = n_{11} + n_{22} + 1$

This shows that it's true for a $2 \times 2$ matrix.
Looking at other $2 \times 2$ determinants, with the $1$ added to a single entry instead:

$\begin{vmatrix} n_{11} + 1 & n_{12} \\ n_{21} & n_{22} \end{vmatrix} = n_{11}n_{22} + n_{22} - n_{12}n_{21} = \begin{vmatrix} n_{11} & n_{12} \\ n_{21} & n_{22} \end{vmatrix} + n_{22} = n_{22}$

Similarly:
$\begin{vmatrix} n_{11} & n_{12} \\ n_{21} & n_{22} + 1 \end{vmatrix} = n_{11}$

$\begin{vmatrix} n_{11} & n_{12} + 1 \\ n_{21} & n_{22} \end{vmatrix} = -n_{21}$

$\begin{vmatrix} n_{11} & n_{12} \\ n_{21} + 1 & n_{22} \end{vmatrix} = -n_{12}$

The result is the complementary diagonal entry: positive if the incremented entry is on the main diagonal, negative if it is on the anti-diagonal.
Using that, we can show that
$\begin{vmatrix} n_{11} + 1 & n_{12} & n_{13} \\ n_{21} & n_{22} + 1 & n_{23} \\ n_{31} & n_{32} & n_{33} + 1 \end{vmatrix} = (n_{11} + 1)(n_{22} + n_{33} + 1) - n_{12}n_{21} - n_{13}n_{31} = n_{11}n_{22} - n_{12}n_{21} + n_{11}n_{33} - n_{13}n_{31} + n_{11} + n_{22} + n_{33} + 1 = n_{11} + n_{22} + n_{33} + 1,$
where the last step uses that every $2 \times 2$ minor of a rank-$1$ matrix vanishes, so $n_{11}n_{22} - n_{12}n_{21} = 0$ and $n_{11}n_{33} - n_{13}n_{31} = 0$.

And in a similar fashion we can apply this to higher order matrices.
But I haven't the slightest clue how to word or show this "similar fashion".
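
Not a proof, but here is a quick numerical sanity check of the conjectured identity (a sketch assuming `numpy`; the sizes, seed, and variable names are arbitrary choices of mine): it builds a random rank-1 matrix $A = uv^T$ and compares $|A + E|$ with $1 + \operatorname{tr} A$.

```python
import numpy as np

rng = np.random.default_rng(0)

for n in (2, 3, 5, 8):
    u = rng.standard_normal((n, 1))
    v = rng.standard_normal((n, 1))
    A = u @ v.T                      # a random rank-1 matrix
    E = np.eye(n)                    # identity matrix ("E" as in the question)
    lhs = np.linalg.det(A + E)
    rhs = 1 + np.trace(A)
    print(n, np.isclose(lhs, rhs))   # expected: True for every n
```

Every printed line should read `True`, matching the $2 \times 2$ and $3 \times 3$ cases computed by hand above.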

Torn
  • 235

3 Answers

2

Since $\texttt{rank}(A) = 1$, $A = u v^T$ for some $n \times 1$ matrices $u$ and $v$, and consequently $\texttt{trace}(A) = v^Tu.$

We have $$\begin{pmatrix} I & 0 \\ v^T & 1 \end{pmatrix} \begin{pmatrix}I & u \\ -v^T & 1 \end{pmatrix} = \begin{pmatrix} I & u \\ 0 & 1 + v^Tu \end{pmatrix}. \tag{1}$$ And we also have $$\begin{pmatrix} I & -u \\ 0 & 1 \end{pmatrix} \begin{pmatrix} I & u \\ -v^T & 1 \end{pmatrix} = \begin{pmatrix} I + uv^T & 0 \\ -v^T & 1 \end{pmatrix}. \tag{2}$$

Taking determinants in $(1)$ and $(2)$, the result follows: the left-hand factors $\begin{pmatrix} I & 0 \\ v^T & 1 \end{pmatrix}$ and $\begin{pmatrix} I & -u \\ 0 & 1 \end{pmatrix}$ are block triangular with determinant $1$, so the right-hand sides of $(1)$ and $(2)$ have the same determinant, namely $\det\begin{pmatrix} I & u \\ -v^T & 1 \end{pmatrix}$. The right-hand side of $(1)$ is block upper triangular with determinant $1 + v^Tu$, while that of $(2)$ is block lower triangular with determinant $\det(I + uv^T)$, hence $\det(I + A) = 1 + v^Tu = 1 + \texttt{trace}(A)$.
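
For readers who want a concrete check, here is a small numerical verification of $(1)$, $(2)$, and the resulting determinant identity (a sketch assuming `numpy`; `np.block` assembles the partitioned matrices, and the dimension and seed are arbitrary choices of mine).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))
I = np.eye(n)
one = np.ones((1, 1))

M = np.block([[I, u], [-v.T, one]])                      # the shared right-hand factor

# identity (1)
lhs1 = np.block([[I, np.zeros((n, 1))], [v.T, one]]) @ M
rhs1 = np.block([[I, u], [np.zeros((1, n)), 1 + v.T @ u]])
print(np.allclose(lhs1, rhs1))                           # True

# identity (2)
lhs2 = np.block([[I, -u], [np.zeros((1, n)), one]]) @ M
rhs2 = np.block([[I + u @ v.T, np.zeros((n, 1))], [-v.T, one]])
print(np.allclose(lhs2, rhs2))                           # True

# both left factors are triangular with determinant 1, so det(I + u v^T) = 1 + v^T u
print(np.isclose(np.linalg.det(I + u @ v.T), 1 + (v.T @ u).item()))  # True
```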

1

The value of a determinant, as well as of a trace, is independent of the choice of basis. So suppose that the image of $A$ is generated by a vector $v_1$. Complete this vector with $v_2,\dots,v_n$ to form a basis. In this basis the matrix of $A$ takes the form: $$ \underline{A} = \begin{pmatrix} a_{11} & a_{12} & \dots & a_{1n} \\ 0 & 0 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & 0 \end{pmatrix} $$ and that of $E+A$: $$ \underline{E+A} = \begin{pmatrix} 1+a_{11} & a_{12} & \dots & a_{1n} \\ 0 & 1 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & 1 \end{pmatrix} $$ Then clearly ${\rm tr}\, A = a_{11}$ and $\det(E+A) = 1 + a_{11} = 1 + {\rm tr}\, A$.

In less abstract terms, as $A$ has rank 1, we may write $A = u v^T$ where $u$ and $v$ are column vectors. Suppose that $e_1^T u \neq 0$. Then $u, e_2, \dots, e_n$ forms a basis. Carrying out the products one verifies: $$ (1+ A) \left[ \begin{matrix} u & e_2 & \dots & e_n \end{matrix} \right] = \left[ \begin{matrix} u & e_2 & \dots & e_n \end{matrix} \right] \left[ \begin{matrix} 1+v^T u & v_2 & \dots & v_n\\ 0 & 1 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & 1 \end{matrix} \right]$$ so $1+A$ is conjugate to the upper-triangular matrix on the right, which has determinant $1 + v^T u = 1 + {\rm tr}\, A$; this verifies the claimed identity, since determinant and trace are invariant under conjugation.
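
A small numerical illustration of this conjugation (a sketch assuming `numpy`; it relies on the randomly drawn $u$ having $e_1^T u \neq 0$, which holds with probability one, and the seed and dimension are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))
I = np.eye(n)

# P = [u, e_2, ..., e_n]; invertible as long as e_1^T u != 0
P = np.eye(n)
P[:, 0] = u[:, 0]

# matrix on the right: first row (1 + v^T u, v_2, ..., v_n), identity below it
B = np.eye(n)
B[0, 0] = 1 + (v.T @ u).item()
B[0, 1:] = v[1:, 0]

print(np.allclose((I + u @ v.T) @ P, P @ B))                      # (1 + A) P = P B
print(np.isclose(np.linalg.det(I + u @ v.T), np.linalg.det(B)))   # equal determinants
print(np.isclose(np.trace(I + u @ v.T), np.trace(B)))             # equal traces
```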

H. H. Rugh
  • 35,236
  • I think my main problem is that I don't understand how the trace is equal to $a_{11}$ after the conversions. I understand that the original determinant can have all elements but 1 turned to 0 by elementary operations because the rank is 1, but why is the new $a_{11}$ equal to the original trace, and why can I also eliminate all the other elements from the A + E matrix? – Torn Sep 12 '16 at 19:53
  • The proof is abstract, viewing $A$ as a linear map of $n$-dimensional space, not performing matrix operations. Being of rank one means that the image of $A$ is one-dimensional, here in the direction of $v_1$. But supplementing to a basis $v_1,v_2,...,v_n$ (if you know about this kind of operation?) means that the image $Av_k$ of any basis vector must be a constant times $v_1$. Only at this stage do we write down the matrix of $A$ in the chosen basis. The constant $a_{1k}$ appearing in the above matrix comes from $A v_k = v_1 a_{1k}$. All other constants must vanish: $a_{jk}=0$, $j>1$. – H. H. Rugh Sep 12 '16 at 20:01
  • Yeah, sorry, I'm afraid that's above my level of comprehension. – Torn Sep 12 '16 at 20:13
0

Suppose $\;A=(a_{ij})\;$ with rank$\,A=1\;$, and assume that the single linearly independent (i.e., non-zero) row of $\;A\;$ is the first one, so that all the other rows are scalar multiples of it. Say the $\;i$-th row is $\;k_i\;$ times the first one.

$$|A+E|=\begin{vmatrix}1+a_{11}&a_{12}&\ldots&a_{1n}\\a_{21}&1+a_{22}&\ldots&a_{2n}\\\ldots&\ldots&\ldots&\ldots\\a_{n1}&a_{n2}&\ldots&1+a_{nn}\end{vmatrix}=$$

We now apply elementary operations on the above which do not change the determinant: from row $\;i\;$ subtract $\;k_i\;$ times the first row:

$$=\begin{vmatrix}1+a_{11}&a_{12}&a_{13}&\ldots&a_{1n}\\-k_2&1&0&\ldots&0\\-k_3&0&1&\ldots&0\\\ldots&\ldots&\ldots&\ldots&\ldots\\-k_n&0&0&\ldots&1\end{vmatrix}\stackrel{\text{develop 2nd row}}=k_2\begin{vmatrix}a_{12}&a_{13}&\ldots&a_{1(n-1)}&a_{1n}\\0&1&0&\ldots&0\\0&0&1&\ldots&0\\\ldots&\ldots&\ldots&\ldots&\ldots\\0&0&0&\ldots&1\end{vmatrix}+$$$${}$$

$$\begin{vmatrix}1+a_{11}&a_{13}&\ldots&a_{1(n-1)}&a_{1n}\\-k_3&1&0&\ldots&0\\\ldots&\ldots&\ldots&\ldots&\ldots\\-k_n&0&0&\ldots&1\end{vmatrix}\stackrel{\text{Inductively}}=$$$${}$$

$$=k_2a_{12}+k_3a_{13}+\ldots+k_{n-1}a_{1(n-1)}+k_na_{1n}+1+a_{11}=$$

$$=1+a_{11}+a_{22}+\ldots+a_{nn}=1+\text{Tr}\,A$$

where we used $\;a_{ii}=k_ia_{1i}\;$, since the $\;i$-th row of $\;A\;$ is $\;k_i\;$ times the first.

Try to fill in details. After some thought, if you still have doubts write back.
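
As a numerical sanity check of the row reduction above (a sketch assuming `numpy`; the multipliers $k_i$, the dimension, and the seed are arbitrary choices of mine, with $k_1 = 1$):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
row1 = rng.standard_normal(n)                              # the first row a_{11}, ..., a_{1n}
k = np.concatenate(([1.0], rng.standard_normal(n - 1)))    # k_1 = 1, then k_2, ..., k_n
A = np.outer(k, row1)                                      # rank 1: i-th row is k_i times the first
M = A + np.eye(n)                                          # this is A + E

# from row i subtract k_i times the first row; the determinant does not change
R = M.copy()
R[1:] -= np.outer(k[1:], M[0])

print(np.allclose(R[1:, 0], -k[1:]))                       # rows 2..n became (-k_i, 0, ..., 1, ..., 0)
print(np.allclose(R[1:, 1:], np.eye(n - 1)))
print(np.isclose(np.linalg.det(M), np.linalg.det(R)))      # row operations preserved the determinant
print(np.isclose(np.linalg.det(M), 1 + np.trace(A)))       # the claimed identity
```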

DonAntonio
  • 211,718
  • I understand how you arrived at everything you did, and this seems like proof to me, but I don't understand how we can just assume that the only non-zero row is the first one. And since that's the basis of everything else in this explanation, the rest isn't all too useful on its own. – Torn Sep 12 '16 at 18:56
  • @Torn You can either argue: if it is not the first row, then take the $k$-th row and do exactly the same, except that instead of reducing the determinant according to the first row you do it according to the $k$-th one. Or you can argue: if the first $k-1$ rows are all-zero rows, then we can interchange the first and the $k$-th row in $|A+E|$ and simply multiply the determinant by $-1$ ... You can also argue as in the beginning of H. H. Rugh's answer... – DonAntonio Sep 12 '16 at 19:03
  • Yeah, but how can we assume that any row is a multiple of another row, 0 or not? – Torn Sep 12 '16 at 19:30
  • @Torn That's exactly what rank $A=1$ means: only one column or row is linearly independent, and thus all the others must be scalar multiples of this one. – DonAntonio Sep 12 '16 at 20:21
  • Hmm, that is troublesome. I've learned that the rank of a matrix is the size of the largest $n \times n$ submatrix whose determinant does not evaluate to 0. – Torn Sep 12 '16 at 20:31
  • Well... in your case the largest such submatrix is $1\times1$, so you actually get the same. Anyway, for this kind of exercise I'd say that it'd be nice to enlarge your comprehension of the different terms. That's what a lot of learning mathematics is about, and many times it must be done by the student alone. – DonAntonio Sep 12 '16 at 20:33