
It is well known (and the proof is simple) that $\det(AB)=\det(A)\det(B)$ when $A,B\in M_n(F)$, where $F$ is a field. But what about for $A,B\in M_n(R)$, where $R$ is an arbitrary commutative ring?

A field is distinguished from a commutative ring by the presence of multiplicative inverses. I feel like this isn't a necessary condition for proving that the determinant function is a homomorphism, but I'm having a hell of a time trying to prove it without that assumption.

How would I go about proving the above?

  • https://proofwiki.org/wiki/Determinant_of_Matrix_Product has it for matrices over a commutative ring. – While I Am Feb 28 '21 at 18:00
  • 1
    What is the proof you know? (The answer is yes, by the way, but I'm not sure what "simple" proof doesn't work in general.) – Maxime Ramzi Feb 28 '21 at 18:00
  • @MaximeRamzi the proof I know involves decomposing either matrix $A$ or matrix $B$ into a product of elementary matrices. Then we can rely on a lemma (which is true for matrices with entries in a commutative ring, not necessarily a field) whereby $\det(AE)=\det(A)\det(E)$ for an elementary matrix $E$. If $B=E_1E_2\cdots E_k$ then $\det(AB)=\det(A)\det(B)$ pops out immediately as a result. But writing an arbitrary matrix $B$ with entries in a commutative ring as a product of elementary matrices is proving harder than I thought. – TheMathBoi Feb 28 '21 at 18:04
  • @William Right, so, that's actually the proof I was considering. But how do we know that an invertible matrix with entries in a commutative ring can be written as a product of elementary matrices? – TheMathBoi Feb 28 '21 at 18:05
  • 4
    The formula $\det(AB) = \det(A) \det(B)$ (for $n \times n$ matrices with any particular $n$) is an identity of polynomials over $\mathbb Z$ in the entries of $A$ and $B$, and therefore is true in any commutative ring. – Robert Israel Feb 28 '21 at 18:17
  • See my answer at https://math.stackexchange.com/questions/2443302/uniqueness-of-characteristic-polynomial-of-linear-transformation-in-finite-field/2443345#2443345 – Ted Feb 28 '21 at 18:53
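
Robert Israel's comment, that the identity is polynomial over $\mathbb Z$ in the entries, can be spot-checked with a computer algebra system. A quick sympy sketch for the $2\times 2$ case (my own illustration, not part of the thread):

```python
from sympy import Matrix, symbols, expand

# Treat the entries of A and B as independent indeterminates over Z.
a = symbols('a0:4')
b = symbols('b0:4')
A = Matrix(2, 2, list(a))
B = Matrix(2, 2, list(b))

# det(AB) - det(A)det(B) as a polynomial in Z[a0..a3, b0..b3]:
# if it expands to 0, the identity holds in every commutative ring,
# since a ring homomorphism Z[a, b] -> R can send the indeterminates
# to any actual entries.
diff = expand((A * B).det() - A.det() * B.det())
print(diff)  # 0
```

Because the entries are generic, the vanishing of this polynomial over $\mathbb Z$ specializes to any commutative ring via evaluation.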

1 Answer


Here's a proof based on the abstract definition of the determinant:

Suppose $R$ is a commutative ring and $M$ is an $R$-module. Define $\bigwedge^n M$ to be the quotient of $M^{\otimes_R n}$ by the sub-$R$-module generated by the pure tensors $x_1\otimes ... \otimes x_n$ such that there exist $i\neq j$ with $x_i = x_j$. The image of the pure tensor $x_1\otimes ... \otimes x_n$ in this quotient is usually written as $x_1\wedge ... \wedge x_n$.

Note that by expanding out $(x+y)\otimes (x+y)$, it follows that in this quotient we have the relation $x\wedge y = - y\wedge x$, and more generally, $x_{\sigma(1)}\wedge ... \wedge x_{\sigma(n)} = \epsilon(\sigma) x_1\wedge ... \wedge x_n$, where $\epsilon(\sigma)$ is the signature of the permutation $\sigma$.
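
Spelling out that expansion: $(x+y)\wedge(x+y)$ vanishes because it is the image of a pure tensor with two equal factors, so by bilinearity

```latex
\begin{aligned}
0 &= (x+y)\wedge(x+y) \\
  &= x\wedge x + x\wedge y + y\wedge x + y\wedge y \\
  &= x\wedge y + y\wedge x,
\end{aligned}
\qquad\text{hence}\quad x\wedge y = -y\wedge x.
```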

Then

Lemma 1: If $M=R^n$ is free of rank $n$, then $\bigwedge^k R^n$ is free of rank $\binom{n}{k}$, a basis being given by the $e_{i_1}\wedge ... \wedge e_{i_k}$ with $i_1 <...< i_k$ (the $e_i$'s being the canonical basis of $R^n$).

It's fairly straightforward to prove that these elements generate $\bigwedge^k R^n$; it is slightly more complicated to show that they are linearly independent, but they are.
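
As a sanity check on the count only (not on linear independence), one can enumerate the index sets $i_1 < \dots < i_k$ for, say, $n=4$, $k=2$:

```python
from itertools import combinations
from math import comb

n, k = 4, 2
# The basis of Lemma 1: one element e_{i1} ∧ ... ∧ e_{ik} per strictly
# increasing index sequence i1 < ... < ik.
basis = list(combinations(range(1, n + 1), k))
print(basis)  # [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
print(len(basis) == comb(n, k))  # True
```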

In particular, with $k=n$, $\bigwedge^n R^n$ is free of rank $1$. In this case, one possible way to show that $e_1\wedge ... \wedge e_n$ is nonzero is to show that $\det : (R^n)^n \to R$ actually factors through $(R^n)^n \to\bigwedge^n R^n$ (because $\det$ is $n$-linear and vanishes whenever two rows of the matrix are equal) and takes the value $1$ on $(e_1,...,e_n)$. This is the only case we will need, so this is a possible proof if you already know what $\det$ is.
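
The two properties used here, that $\det$ is $n$-linear and vanishes when two rows are equal (which also forces the sign flip under a row swap), can be spot-checked symbolically; a small sympy sketch:

```python
from sympy import Matrix, symbols, expand

# Three generic rows in R^3, with symbolic entries.
r1 = symbols('a1:4')
r2 = symbols('b1:4')
r3 = symbols('c1:4')
M = Matrix([list(r1), list(r2), list(r3)])

# Swapping two rows flips the sign of det ...
M_swapped = Matrix([list(r2), list(r1), list(r3)])
print(expand(M.det() + M_swapped.det()))  # 0

# ... and a repeated row kills det, which is exactly what is needed
# for det to factor through the quotient defining the wedge.
M_repeat = Matrix([list(r1), list(r1), list(r3)])
print(expand(M_repeat.det()))  # 0
```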

Note that this proof actually shows more: it shows that $\det : \bigwedge^n R^n\to R$ is an isomorphism.

Lemma 2: Any $R$-linear map $f:M\to N$ induces an $R$-linear map $\bigwedge^n f: \bigwedge^n M\to \bigwedge^n N$ given by $x_1\wedge ... \wedge x_n \mapsto f(x_1)\wedge ... \wedge f(x_n)$. In particular, $\bigwedge^n (f\circ g) = \bigwedge^nf \circ \bigwedge^n g$

This is the key point, and is not hard to show.
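
To make Lemma 2 concrete in a case bigger than the top degree: if $f$ has matrix $A$, the matrix of $\bigwedge^2 f$ in the basis of Lemma 1 consists of the $2\times 2$ minors of $A$ (the "second compound matrix"), and the functoriality $\bigwedge^2(f\circ g)=\bigwedge^2 f\circ\bigwedge^2 g$ can be checked numerically. A sympy sketch (the helper name `wedge2` is my own):

```python
from itertools import combinations
from sympy import Matrix

def wedge2(A):
    # Matrix of ∧²f in the basis e_i ∧ e_j (i < j) of Lemma 1:
    # entry (r, c) is the 2×2 minor of A on rows idx[r], columns idx[c].
    idx = list(combinations(range(A.rows), 2))
    return Matrix(len(idx), len(idx),
                  lambda r, c: A.extract(list(idx[r]), list(idx[c])).det())

A = Matrix(3, 3, [1, 2, 0, 0, 1, 3, 2, 1, 1])
B = Matrix(3, 3, [2, 0, 1, 1, 1, 0, 0, 3, 1])
# Functoriality from Lemma 2: ∧²(f∘g) = ∧²f ∘ ∧²g.
print(wedge2(A * B) == wedge2(A) * wedge2(B))  # True
```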

Proposition 1: Let $f: R^n\to R^n$ be an $R$-linear map. Then $\bigwedge^n f$ is an $R$-linear endomorphism of a free $R$-module of rank $1$, so it is a homothety, i.e. multiplication by a scalar. That scalar is $\det(f)$.

Proof: We showed that $\det : \bigwedge^n R^n \to R$ is an isomorphism of $R$-modules, with inverse given by $1\mapsto e_1\wedge ... \wedge e_n$. Therefore it suffices to show that $\det\circ \bigwedge^n f(e_1\wedge...\wedge e_n) = \det(f)$. But that's clear: $\bigwedge^nf(e_1\wedge...\wedge e_n) = f(e_1)\wedge ...\wedge f(e_n)$ by definition, and $\det(f)$ is by definition $\det(f(e_1),...,f(e_n))$.
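
For $n=2$ this can be computed by hand: expanding $f(e_1)\wedge f(e_2)$ by bilinearity, using $e_i\wedge e_i=0$ and $e_2\wedge e_1=-e_1\wedge e_2$, leaves exactly $\det(f)\, e_1\wedge e_2$. A minimal sketch (function names are my own illustration):

```python
def top_wedge_scalar(A):
    # Coefficient of e1 ∧ e2 in f(e1) ∧ f(e2), where the columns of A
    # are f(e1) = (a11, a21) and f(e2) = (a12, a22). Expanding the wedge
    # by bilinearity: a11*a22 · e1∧e2 + a21*a12 · e2∧e1 (the e_i∧e_i
    # terms vanish), and e2∧e1 = -e1∧e2.
    (a11, a12), (a21, a22) = A
    return a11 * a22 - a21 * a12

def det2(A):
    (a, b), (c, d) = A
    return a * d - b * c

A = [[2, 5], [3, 4]]
print(top_wedge_scalar(A), det2(A))  # both -7
```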

Note: this might seem a bit circular to people who define the determinant by this proposition. Here I'm assuming that the OP knows what the determinant is, and proving, based on that, that it is equivalent to this characterization.

Corollary (Conclusion): $\det(f\circ g) =\det(f)\det(g)$.

Proof: By Proposition 1, $\bigwedge^n (f\circ g)$ is multiplication by $\det(f\circ g)$. But by Lemma 2, $\bigwedge^n (f\circ g) =\bigwedge^n f\circ \bigwedge^n g$ is the composition of multiplication by $\det(g)$ and multiplication by $\det(f)$, i.e. multiplication by $\det(f)\det(g)$. Since $\bigwedge^n R^n$ is free of rank $1$, the scalar of a homothety is uniquely determined, so $\det(f\circ g) = \det(f)\det(g)$.
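
Since the whole point is that no field axioms are used, here is a spot check over $\mathbb Z/6\mathbb Z$, a commutative ring that is not a field (nor even a domain):

```python
def det2(M):
    (a, b), (c, d) = M
    return a * d - b * c

def matmul2(A, B, m):
    # 2x2 matrix product with entries reduced mod m.
    return [[sum(A[i][k] * B[k][j] for k in range(2)) % m for j in range(2)]
            for i in range(2)]

m = 6  # work in Z/6Z
A = [[2, 3], [4, 5]]
B = [[1, 4], [3, 2]]
AB = matmul2(A, B, m)
# The corollary, read mod 6: det(AB) = det(A)det(B) in Z/6Z.
print(det2(AB) % m == (det2(A) * det2(B)) % m)  # True
```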

Let me make another note: it seems like we did no hard work at all. But that is only what it looks like; the hard work is in the definition of the determinant, essentially in showing that $\bigwedge^n R^n$ is in fact free of rank $1$. The proof I gave here used an explicit isomorphism, given by the determinant; this presupposes we know that the determinant exists, is $n$-linear, and vanishes on matrices with two equal rows.

Maxime Ramzi