
Given $a,b\in R$, evaluate $\det(aI_n+bJ_n)$, where $J_n\in M_n(R)$ has every entry equal to $1_R$.

I'm studying algebra; the book I use leaves this as an exercise, but I haven't been able to solve it, even by induction. (I tried defining $\mathbb{1}_{R}: R \to R$ by $\mathbb{1}_{R}(x) = b$, but that didn't get me anywhere.) Can anyone help me with this, please?

1 Answer


Start by finding the eigenvalues and multiplicities of $J_n$. If you compute eigenvalues for a few of these matrices (Wolfram Alpha might help you here), it should become clear that $0$ is an eigenvalue with multiplicity $n - 1$ and $n$ is an eigenvalue with multiplicity $1$.
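
If you'd rather experiment locally than in Wolfram Alpha, here is a quick sketch using Python's sympy (my tooling choice, not part of the exercise) that displays the eigenvalue pattern for small $n$:

```python
import sympy as sp

for n in range(2, 6):
    J = sp.ones(n, n)   # J_n: the n x n all-ones matrix
    # eigenvals() returns {eigenvalue: algebraic multiplicity};
    # the pattern {0: n - 1, n: 1} should emerge
    print(n, J.eigenvals())
```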

To show $J_n$ has $0$ as an eigenvalue with multiplicity $n - 1$, I suggest computing the rank of $J_n$. Since every column (or row) vector is the same, the span of the columns has dimension $1$. By the rank-nullity theorem, the nullity of $J_n$, which is the dimension of the eigenspace corresponding to $0$, is $n - 1$.
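
A small check of the rank computation, again with sympy (assumed tooling, purely illustrative):

```python
import sympy as sp

n = 5
J = sp.ones(n, n)
print(J.rank())      # 1: all columns are identical
# rank-nullity: nullity = n - rank = n - 1, the dimension
# of the eigenspace for the eigenvalue 0
print(n - J.rank())  # 4
```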

For the eigenvalue $n$, I suggest computing an eigenvector corresponding to this eigenvalue. This should reveal that the eigenvector ought to be a multiple of $(1, 1, \ldots, 1)^T$. And, indeed, multiplying $J_n$ by this vector clearly produces $n$ times this vector, proving $n$ is indeed an eigenvalue of $J_n$.
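
You can confirm this multiplication symbolically as well (sympy again, just a sketch):

```python
import sympy as sp

n = 4
J = sp.ones(n, n)
v = sp.ones(n, 1)    # the vector (1, 1, ..., 1)^T
print(J * v)         # every entry is n, i.e. J v = n v
print(J * v - n * v) # the zero vector, confirming the eigenvalue
```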

Since we have found multiplicities of eigenvalues totaling $n$, there can be no others.

Now, $bJ_n$ has the same eigenvectors, with each eigenvalue multiplied by $b$: $0$ with multiplicity $n - 1$ and $bn$ with multiplicity $1$. Adding $aI_n$ to a matrix adds $a$ to each eigenvalue. So, the eigenvalues of $aI_n + bJ_n$ are $a$ with multiplicity $n - 1$ and $a + bn$ with multiplicity $1$. Since the determinant is the product of the eigenvalues, counted with multiplicity, this produces the determinant $$\det(aI_n + bJ_n) = a^{n-1}(a + bn).$$
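
As a final sanity check of the formula, here is a sketch (assuming sympy) that factors the determinant symbolically for small $n$:

```python
import sympy as sp

a, b = sp.symbols('a b')
for n in range(2, 6):
    M = a * sp.eye(n) + b * sp.ones(n, n)
    # factoring the determinant recovers a**(n-1) * (a + b*n)
    print(n, sp.factor(M.det()))
```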

Theo Bendit