
Say I create an extension $K$ over a field $F$ by adjoining an element $\alpha$, i.e. $K = F[\alpha]$ ($\alpha$ does not necessarily have to be the root of a polynomial with coefficients in $F$). In this process I might need to adjoin some powers of $\alpha$ as well. Suppose I have a basis $B$ for $K$ and want to show that every element of $K$ has an inverse, i.e. that $K$ is indeed a field. If every element of $B$ has an inverse, is that enough to guarantee that every element of $K$ has an inverse, and is there a nice way to compute it? If not, is it true when the extension is finite-dimensional, i.e. when $|B|$ is finite?

An example of what I'm talking about: take $K = \mathbb{Q}(\alpha)$ with $\alpha = \sqrt[3]{2}$. I have my basis $B = \left\{1, \alpha, \alpha^2\right\}$, and clearly $\alpha^{-1} = \frac{\alpha^2}{2}$ and $(\alpha^2)^{-1} = \frac{\alpha}{2}$. Any element of $K$ is of the form $a + b\alpha + c\alpha^2$ and has an inverse of the same form. Suppose I were asked to prove that adjoining the two elements $\alpha, \alpha^2$ is enough to make $K$ into a field (i.e. that we do not need to add more powers) by showing that everything obtained from just those two adjoined elements is multiplicatively closed and has inverses. Would the existence of inverses of $\alpha$ and $\alpha^2$ guarantee the existence of inverses of general elements?
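For what it's worth, here is a rough sympy sketch of the sort of computation I have in mind (my own illustration: $t$ stands in for $\alpha$, and everything is reduced modulo the minimal polynomial $t^3 - 2$); whether something like this always works is essentially what I am asking:

```python
from sympy import symbols, invert, rem, expand

t = symbols('t')
p = t**3 - 2          # minimal polynomial of alpha = 2**(1/3) over Q

# alpha^(-1) = alpha^2 / 2, as claimed above
print(invert(t, p))   # t**2/2

# an arbitrary element a + b*alpha + c*alpha^2, with sample coefficients
f = 3 + 2*t + 5*t**2
g = invert(f, p)      # g(alpha) = f(alpha)^(-1), computed modulo p
print(rem(expand(f * g), p))   # 1, so g really is the inverse
```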

This came up while trying to prove properties of the extension. It is indirectly homework, but this exact question is not, so I would appreciate a full answer or a link to one.

  • The definition of $F(\alpha)$ is the smallest field containing $F$ and $\alpha$. – JSchlather Jan 29 '13 at 00:40
  • @JacobSchlather But if you want to characterize a field by its basis, which is what the homework question I am asking deals with, is there some criterion by which you can know that your basis at least induces the right field, other than (in the example I gave) just chugging through and manually finding an inverse for an arbitrary element? In an extension of degree four that would be very tedious. – Julien Clancy Jan 29 '13 at 00:42
  • @JulienClancy Jacob's comment is correct, in that $F(\alpha)$ is the usual notation for a field. What you are interested in is some algebra over $F$ obtained by adjoining an element $\alpha$, and that is usually denoted $F[\alpha]$. Can you edit your question to use that notation instead? – PatrickR Jan 29 '13 at 06:53

2 Answers


Consider the extension $K$ of the field $F$ obtained by adjoining a root $\alpha$ of $x^2 - 1$, that is, $K = F[x]/(x^2 - 1)$. Clearly $1, \alpha$ is a basis, and both basis elements are invertible, but $\alpha - 1$ is a zero-divisor, and thus not invertible.

P.S. I am assuming you only require $F$ to be a field (not $K$).
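For instance, a quick sympy check of this (taking $F = \mathbb{Q}$ purely for concreteness; the phenomenon is the same over any field) shows that $\alpha$ is invertible modulo $x^2 - 1$ while $\alpha - 1$ is not:

```python
from sympy import symbols, invert
from sympy.polys.polyerrors import NotInvertible

x = symbols('x')
p = x**2 - 1   # we adjoin a root alpha of this (reducible) polynomial

# alpha itself is invertible: alpha * alpha = alpha^2 = 1 in F[x]/(x^2 - 1)
print(invert(x, p))   # x

# alpha - 1 is a zero divisor, so inversion fails
try:
    invert(x - 1, p)
except NotInvertible:
    print("x - 1 is not invertible modulo x**2 - 1")
```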

  • What are $K$ and $F$ here? Are you not adjoining $\alpha = 1$, which is already in the field? – Julien Clancy Jan 29 '13 at 00:25
  • Now that you have rephrased your question, I see that I gave an answer to a different one. Let me explain, though. Given a field $F$ and any polynomial $f(x) \in F[x]$, you can construct the ring $K = F[x]/(f(x))$ and regard it as an extension of $F$. In $K$ the class of $x$ will be a root of $f(x)$. [follows in next comment] – Andreas Caranti Jan 29 '13 at 07:18
  • [see previous comment] As noted above by @JacobSchlather, $K$ will be a field if and only if $f(x)$ is irreducible in $F[x]$. But to see it more concretely in the example I gave, identify $F$ with the ring of matrices $\left\{ \begin{bmatrix} a & 0 \\ 0 & a \end{bmatrix} : a \in F \right\}$, and consider $\alpha = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$. Then $\alpha^2 = 1 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$ is invertible, but $0 = \alpha^2 - 1 = (\alpha - 1)(\alpha + 1)$, so $\alpha - 1$ and $\alpha + 1$ are not (see the short check after these comments). – Andreas Caranti Jan 29 '13 at 07:18
  • Ah, I understand now. Thanks for explaining; this is an excellent example. Do you know if the same can be done over subfields of $\mathbb{C}$? – Julien Clancy Jan 29 '13 at 16:17
  • @JulienClancy, this works for any field $F$. – Andreas Caranti Jan 29 '13 at 16:27
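A minimal sympy check of the matrix picture from the comments above (entries over $\mathbb{Q}$, just to have something concrete to compute with):

```python
from sympy import Matrix, eye

# alpha^2 = I, yet (alpha - I)(alpha + I) = 0, so alpha - I and alpha + I
# cannot be invertible even though alpha itself is
alpha = Matrix([[0, 1], [1, 0]])

print(alpha**2 == eye(2))                   # True
print((alpha - eye(2)) * (alpha + eye(2)))  # the zero matrix
```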

I think you're looking at the problem the wrong way. Let $F$ be a field and let $\alpha$ be algebraic of degree $n$ over $F$. I claim that $F[\alpha]=F(\alpha)$ and that $1,\alpha,\dots,\alpha^{n-1}$ is a basis of $F(\alpha)$ over $F$. If $1,\alpha,\dots,\alpha^{n-1}$ were linearly dependent over $F$, then $\alpha$ would satisfy a polynomial of degree less than $n$, so they are linearly independent. Let $p(t)$ be the minimal polynomial of $\alpha$ over $F$. Any element of $F[\alpha]$ is of the form $f(\alpha)$ for some polynomial $f \in F[t]$. We may divide $f(t)$ by $p(t)$ with remainder, so that

$$f(t)=p(t)q(t)+r(t)$$

where $\deg r(t) < n$ (or $r = 0$). Evaluating at $\alpha$, we deduce that

$$f(\alpha)=p(\alpha)q(\alpha)+r(\alpha)=r(\alpha).$$

Since $r(t)$ has degree less than $n$, it follows that $r(\alpha)$ can be expressed as a linear combination of $1,\alpha,\dots,\alpha^{n-1}$. Thus $1,\alpha,\dots,\alpha^{n-1}$ form a basis for $F[\alpha]$. Now, to see that $F[\alpha]$ is indeed a field, consider the evaluation homomorphism $\varphi: F[t] \rightarrow F[\alpha]$ determined by $\varphi(t)=\alpha$. By the definition of the minimal polynomial, $\ker \varphi = \langle p(t) \rangle$. Since $\varphi$ is surjective, the first isomorphism theorem gives $F[\alpha] \cong F[t]/\langle p(t) \rangle$. Finally, $p(t)$ is irreducible, so it generates a maximal ideal in $F[t]$, and we conclude that $F[t]/\langle p(t) \rangle$ is a field.
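To tie this back to the computational part of the question: because $p(t)$ is irreducible, any nonzero $f(\alpha)$ satisfies $\gcd(f, p) = 1$, so the extended Euclidean algorithm yields $u, v \in F[t]$ with $uf + vp = 1$, and then $u(\alpha) = f(\alpha)^{-1}$. Here is a rough sympy sketch of that recipe, using the $p(t) = t^3 - 2$ example from the question (my choice of illustration, not part of the answer above):

```python
from sympy import symbols, gcdex, rem, expand

t = symbols('t')
p = t**3 - 2        # minimal polynomial of alpha = 2**(1/3), irreducible over Q
f = 1 + t + t**2    # an arbitrary nonzero element f(alpha) of F[alpha]

# extended Euclidean algorithm: u*f + v*p = g, where g = gcd(f, p)
u, v, g = gcdex(f, p)
print(g)                       # 1, since p is irreducible and does not divide f
print(rem(expand(u * f), p))   # 1, so u(alpha) is the inverse of f(alpha)
```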

JSchlather