
Since this question is strongly connected to the decomposition of modules over algebras, I expect there is some solution. What I am looking for, however, is a solution that does not involve all the machinery of module decompositions, if one exists.

I have a field $k$ – assume finite if it helps – and a finite dimensional $k$-algebra $A \subseteq k^{n\times n}$ of which I know a basis $a_1,\dotsc,a_m$. What is the easiest way to decide if $A$ is local?

By brute force, for a finite field $k$, the algebra $A$ is itself finite, so I can enumerate all elements $a$ of $A$ and check whether $a$ or $1-a$ is invertible. I'd be grateful, however, for a solution that is less brute force and, ideally, does not impose finiteness on the field.
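Just to fix ideas, here is roughly what I mean by the brute-force check, as a sketch in Python with sympy (assuming $k=\mathbb{F}_p$ for a small prime $p$, that the basis is given as integer matrices, and that the identity of $A$ is the $n\times n$ identity matrix, so that invertibility in $A$ is just matrix invertibility; the function name is made up):

```python
from itertools import product
from sympy import Matrix, eye

def is_local_bruteforce(basis, p):
    """Brute-force check over k = F_p: A is local iff for every a in A,
    a or 1 - a is invertible.  `basis` is a list of sympy matrices with
    integer entries spanning A; all arithmetic is read mod p.  We assume
    the identity of A is the n x n identity matrix, so invertibility in A
    is ordinary matrix invertibility (nonzero determinant mod p)."""
    n = basis[0].shape[0]
    one = eye(n)

    def invertible(M):
        return M.det() % p != 0

    for coeffs in product(range(p), repeat=len(basis)):
        a = sum((c * b for c, b in zip(coeffs, basis)), Matrix.zeros(n, n))
        if not invertible(a) and not invertible(one - a):
            return False
    return True
```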

Edit: by solving a bunch of linear equations, I can also find the structure constants $c_{ijk}$ such that $a_ia_j = \sum_k c_{ijk}a_k$, so these can be used as well.
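Concretely, expressing each product $a_ia_j$ in the basis is one small linear system per pair; a sketch of this, again in Python with sympy, using exact arithmetic over $\mathbb{Q}$ (over a finite field one would use mod-$p$ linear algebra instead; the function name is made up):

```python
from sympy import Matrix

def structure_constants(basis):
    """Find c[i][j][k] with a_i * a_j = sum_k c[i][j][k] * a_k by solving
    one linear system per pair (i, j).  `basis` is a list of sympy
    matrices; arithmetic is exact over the rationals."""
    m = len(basis)
    # columns of B are the flattened basis matrices
    B = Matrix.hstack(*[b.vec() for b in basis])
    c = [[None] * m for _ in range(m)]
    for i in range(m):
        for j in range(m):
            rhs = (basis[i] * basis[j]).vec()
            sol, _ = B.gauss_jordan_solve(rhs)   # unique since the a_i are independent
            c[i][j] = list(sol)
    return c
```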

Bubaya
  • I think deciding whether or not a finite dimensional algebra is local using a basis and not using information about its radical might be very challenging. There are a lot of things that help if you get lucky with the basis, but they don’t work for all bases. – rschwieb Sep 18 '20 at 10:55
  • @rschwieb True, but if you can express the products $a_i\cdot a_j$ in the basis of the $a_k$ (which I believe is implicit in the question), then a basis of the radical can be obtained by solving a system of linear equations defined by using Dickson's criterion. From there, maybe one can say something about locality? – Pierre-Guy Plamondon Sep 18 '20 at 11:13
  • I think it can be done; writing an answer soon. – Pierre-Guy Plamondon Sep 18 '20 at 11:57
  • @Pierre-GuyPlamondon Hmm, I guess I'd have to see it. The thing that discouraged me is that your basis could be all units, even if the ring is local. – rschwieb Sep 18 '20 at 12:23
  • @Pierre-GuyPlamondon Totally forgot about that in my question – I should revisit the criterion. – Bubaya Sep 18 '20 at 13:41
  • @rschwieb Well, that was trickier than expected... and I can only do it under some hypotheses (which include finite fields). I feel it should be improvable. – Pierre-Guy Plamondon Sep 18 '20 at 17:44
  • @Pierre-GuyPlamondon This is why I said "it's tricky without information about the radical." If one can spot the Jacobson radical easily, the rest should be easy. – rschwieb Sep 18 '20 at 18:03
  • @rschwieb I don't understand you. Getting the Jacobson radical is easy, it amounts to solving a system of linear equations. Proving that $A/J$ is a division ring seems to me to be the difficult part? – Pierre-Guy Plamondon Sep 18 '20 at 18:04
  • @Pierre-GuyPlamondon I'm not disagreeing with you, I'm just saying that it feels like it's hard to do anything without going through the radical (which I thought might not be in the spirit of the question.) – rschwieb Sep 18 '20 at 18:12
  • @rschwieb Ah, I understand. In that case, I completely agree: I don't see how to do it without trying to find the radical. – Pierre-Guy Plamondon Sep 18 '20 at 18:14

1 Answer


The following method should work. The third step is the only one that assumes that $k$ is a finite field; the first two steps only assume that elementary operations in $k$ can be done algorithmically. I have a feeling that one can do better than that, however.

First step: compute a basis of the radical.

This is done by solving a finite system of linear equations given by Dickson's criterion:

An element $r\in A$ is in the Jacobson radical of $A$ if and only if the trace of $\mu_{rs}$ is zero for all $s\in A$, where $\mu_{rs}:A \to A$ is left multiplication by $rs$.

Thus, an element $r$ is in the radical if and only if the trace of $\mu_{ra_i}$ is zero for all $i\in \{1, \ldots, m\}$. If we write $r = \sum_{i=1}^m \lambda_i a_i$, then we want $0=\operatorname{tr}(\mu_{ra_j}) = \sum_{i=1}^m \lambda_i\operatorname{tr}(\mu_{a_ia_j})$ for all $j$. Since the matrix of $\mu_{a_ia_j}$ is easy to compute in terms of the structure constants $c_{i,j,k}$, this gives a system of $m$ linear equations in the $m$ unknowns $\lambda_1, \ldots, \lambda_m$. You can then algorithmically find a basis $(r_1, \ldots, r_\ell)$ of the solution space of this system using the method of your choice; this gives a basis of the radical.
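For what it's worth, a sketch of this step in Python with sympy, working exactly over $\mathbb{Q}$ (over a finite field one would swap in mod-$p$ linear algebra; the function name is just for illustration):

```python
from sympy import Matrix

def radical_basis(c):
    """First step: a basis of the radical of A from the structure constants
    c[i][j][k] (so that a_i a_j = sum_k c[i][j][k] a_k), via Dickson's
    criterion.  Returns coordinate vectors in the basis a_1, ..., a_m."""
    m = len(c)
    # matrix of left multiplication mu_{a_i}: it sends a_j to
    # sum_k c[i][j][k] a_k, so its (k, j) entry is c[i][j][k]
    mult = [Matrix(m, m, lambda k, j: c[i][j][k]) for i in range(m)]
    # T[i, j] = tr(mu_{a_i a_j}) = tr(mu_{a_i} mu_{a_j})
    T = Matrix(m, m, lambda i, j: (mult[i] * mult[j]).trace())
    # r = sum_i lambda_i a_i lies in the radical iff T^T lambda = 0;
    # T is symmetric, so this is the nullspace of T itself
    return [list(v) for v in T.nullspace()]
```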

Second step: find the structure constants of the quotient of $A$ by its radical.

Complete $(r_1, \ldots, r_\ell)$ into a basis of $A$ (you can do this algorithmically by looking at $(r_1, \ldots, r_\ell, a_1, \ldots, a_m)$ and finding the leftmost $m$ elements that are linearly independent). Let's say the basis you find is $(r_1, \ldots, r_\ell, r_{\ell+1}, \ldots, r_m)$. Let $\pi:A \to A/J$ be the projection of $A$ to its quotient by its radical. Then $(r_{\ell+1}, \ldots, r_m)$ projects via $\pi$ to a basis of $A/J$.

The matrix of the multiplication by $\pi(r_i)$ in $A/J$ is obtained by taking the matrix of $\mu_{r_i}$ in $A$, and keeping only rows and columns $\ell+1, \ldots, m$. This defines $A/J$ in terms of the basis elements $\bar r_1 = \pi(r_{\ell+1}), \ldots, \bar r_{m'} = \pi(r_m)$, where $m' = m - \ell$.
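A sketch of this step, under the same caveats as above; the completion of the basis is done with coordinate vectors, which in coordinates is exactly picking the leftmost linearly independent $a_i$'s:

```python
from sympy import Matrix

def quotient_mult_matrices(c, rad):
    """Second step: multiplication matrices for A/J.  Here c[i][j][k] are
    the structure constants of A and rad is a list of coordinate vectors
    spanning the radical (e.g. the output of radical_basis above)."""
    m, l = len(c), len(rad)
    # matrix of left multiplication by a_i: (k, j) entry is c[i][j][k]
    mult = [Matrix(m, m, lambda k, j: c[i][j][k]) for i in range(m)]
    # Complete the radical basis to a basis of A with coordinate vectors.
    cols = [Matrix(v) for v in rad]
    for i in range(m):
        e = Matrix.eye(m)[:, i]
        if Matrix.hstack(*(cols + [e])).rank() > len(cols):
            cols.append(e)
    P = Matrix.hstack(*cols)                     # change-of-basis matrix
    quotient = []
    for i in range(l, m):
        # left multiplication by the i-th new basis vector, old coordinates
        mu = sum((P[j, i] * mult[j] for j in range(m)), Matrix.zeros(m, m))
        mu_new = P.inv() * mu * P                # same map, new coordinates
        quotient.append(mu_new[l:, l:])          # keep rows/columns l+1..m
    return quotient
```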

Third step: check that $A/J$ is a division algebra.

This is trickier. From now on, let's assume that $k$ is a finite field. Since finite division rings are commutative (Wedderburn's little theorem), the first thing we can do is check that $A/J$ is commutative, simply by checking that $\bar r_i \bar r_j = \bar r_j \bar r_i$ for all $i, j$. If $A/J$ is not commutative, then $A$ is not local.

Assume, then, that $A/J$ is commutative. Let us find its identity element. Consider $\bar r_1$. If $\mu_{\bar r_1}$ is not invertible, then $A/J$ is not a field, so $A$ is not local. If $\mu_{\bar r_1}$ is invertible, then its inverse $\mu_{\bar r_1}^{-1} = \mu_{(\bar r_1)^{-1}}$ is easy to compute in matrix form; applying it to $\bar r_1$, we get an expression of the identity $1$ in terms of $\bar r_1, \ldots, \bar r_{m'}$.

Thus we can assume, without loss of generality, that $\bar r_1 = 1$. The minimal polynomial of $\bar r_2$ over $k$ is then computable by finding the smallest $d$ such that $1,\bar r_2,\bar r_2^2, \ldots, \bar r_2^d$ are linearly dependent, and by finding a linear dependence relation among them. Let $P$ be the minimal polynomial. Note that the $k$-span of $1,\bar r_2,\bar r_2^2, \ldots, \bar r_2^{d-1}$ is a sub-$k$-algebra of $A/J$ isomorphic to $k[x]/(P)$.
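Here is a sketch of that computation, again over $\mathbb{Q}$: the input is the matrix of multiplication by the chosen element on $A/J$, which has the same minimal polynomial as the element itself because the regular representation of a unital algebra is faithful.

```python
from sympy import Matrix, Poly, symbols

def minimal_polynomial(mu):
    """Minimal polynomial of an element of A/J, given the matrix mu of
    multiplication by that element on A/J, found as the first linear
    dependence among the flattened powers I, mu, mu^2, ..."""
    x = symbols('x')
    m = mu.shape[0]
    powers = [Matrix.eye(m).vec()]               # flattened I
    current = Matrix.eye(m)
    for d in range(1, m + 1):
        current = current * mu
        powers.append(current.vec())             # flattened mu^d
        ns = Matrix.hstack(*powers).nullspace()
        if ns:                                   # first dependence found
            coeffs = list(ns[0])                 # c_0 + c_1 x + ... + c_d x^d = 0
            return Poly(sum(c * x**k for k, c in enumerate(coeffs)), x).monic()
    raise RuntimeError("no dependence found (cannot happen for d <= m)")
```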

Over finite fields, algorithms exist to check irreducibility of a polynomial. If $P$ is not irreducible, then $k[x]/(P)$ contains zero-divisors, and so $A/J$ is not a field.
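For instance, sympy can test irreducibility over $\mathbb{Q}$ and over prime fields:

```python
from sympy import Poly, symbols

x = symbols('x')
print(Poly(x**3 + x + 1, x, modulus=2).is_irreducible)  # True: irreducible over F_2
print(Poly(x**2 + 1, x, modulus=5).is_irreducible)      # False: (x+2)(x+3) over F_5
print(Poly(x**2 - 2, x).is_irreducible)                 # True over Q
```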

Assume that $P$ is irreducible. Then $L=k[x]/(P)$ is a field contained in $A/J$ as a subalgebra. In particular, $A/J$ is an $L$-algebra. If $A/J = L$, then it is a field, and we conclude that $A$ is local. Otherwise, find an $L$-basis of $A/J$ (it can be constructed from $(\bar r_1, \ldots, \bar r_{m'})$), and repeat the above argument. Since the dimension of $A/J$ over $L$ is strictly smaller than that over $k$, the process will stop at some point.


This may seem like little improvement over a brute-force check. The reason I think it is better is that the third step works as long as

  1. $A/J$ is commutative, and
  2. there exist algorithms to check the irreducibility of polynomials over $k$. This is true for finite fields, for $\mathbb{Q}$, for number fields, ...
  • It's very simple to check if a semisimple Artinian ring is local: you just check for nontrivial idempotents. I think that's even simpler than checking for commutativity, and probably doesn't require the extra restrictions (in step 3) – rschwieb Sep 18 '20 at 18:01
  • @rschwieb How do you check for idempotents? Don't you have quadratic polynomials in several variables to solve? – Pierre-Guy Plamondon Sep 18 '20 at 18:02
  • We're already solving LOTS of linear equations so I wasn't sure it would add much work :) – rschwieb Sep 18 '20 at 18:11
  • @rschwieb Yes, but that's easy. Solving quadratic polynomials in several variables over, say, $\mathbb{Q}$, doesn't seem much easier. Unless there's an easy algorithm I overlooked? – Pierre-Guy Plamondon Sep 18 '20 at 18:13
  • I don't know, it seems like you'd figure out very rapidly your candidate was not $0$ or was not $1$, possibly without computing all the coefficients. If you're solving it and get one nonzero coefficient, you just have to show it's not $1$. – rschwieb Sep 18 '20 at 18:14
  • My problem is precisely that I don't see how to solve the equation. – Pierre-Guy Plamondon Sep 18 '20 at 18:17
  • @rschwieb It seems easier to me than to compute idempotents: to be honest, I only know the works of Eberly and Rónyai on the topic of idempotents, but there seem to be some not-so-easy problems hidden in that. – Bubaya Sep 19 '20 at 18:23
  • Thank you. I think I needed that help to get to the third step. It's really unsatisfying that all that stuff only works for finite fields, but that's not your fault :). – Bubaya Sep 19 '20 at 18:25