
While studying the notion of gcds in commutative rings, I came across a theorem which states that any two elements in a Euclidean ring always have a gcd. While this is true, isn't it also true in any arbitrary commutative ring with unit element, say $R$?

My definition of greatest common divisor is stated only for commutative rings. It says:

Let $R$ be a commutative ring and $a,b\in R.$ An element $d\neq 0\in R$ is said to be the greatest common divisor of $a$ and $b$ if

$(i)$ $d|a$ and $d|b$ and

$(ii)$ if $\exists c\neq 0\in R$ such that $c|a$ and $c|b$ then we have, $c|d.$

For example, say $a,b\in R$ where $R$ is now a commutative ring with unit element. Then we can surely say that $1\mid a$ and $1\mid b$. If there exists a $d$ satisfying $d\mid a$ and $d\mid b$ such that for any $c$ with $c\mid a$ and $c\mid b$ we have $c\mid d$, then the gcd of $a$ and $b$ is $d.$

If $1$ is the only element that divides both $a$ and $b$, then the gcd of $a$ and $b$ is $1$, isn't it? Can someone please correct me if I am going wrong?


1 Answer


You can certainly define the concept of a gcd; the question is whether such an element exists.

So, yes, one defines the gcd of any subset $X$ of a commutative ring with unity $R$ as an element $d\in R$ such that:

  1. $d\mid x$ for all $x\in X$; and
  2. If $c\mid x$ for all $x\in X$, then $c\mid d$.

If such an element exists, then it is a greatest common divisor for (the elements of) $X$, and it is unique up to the relation of being associates (two elements in an arbitrary commutative ring $R$ are associates if they divide each other).
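As a small aside, here is a brute-force sketch in Python (my own illustration, not part of the original answer; the helper names `divides` and `is_gcd` are hypothetical) checking this definition in $\mathbb{Z}$: for $X=\{12,18\}$ both $6$ and $-6$ satisfy it, and they are associates.

```python
# Brute-force check of the gcd definition in Z (illustration only).

def divides(c, x):
    # c | x in Z, for nonzero c
    return x % c == 0

def is_gcd(d, X):
    # Condition 1: d divides every element of X.
    if not all(divides(d, x) for x in X):
        return False
    # Condition 2: every nonzero common divisor c of X divides d.
    bound = max(abs(x) for x in X)
    common = [c for c in range(-bound, bound + 1)
              if c != 0 and all(divides(c, x) for x in X)]
    return all(divides(c, d) for c in common)

X = {12, 18}
print([d for d in range(-20, 21) if d != 0 and is_gcd(d, X)])  # [-6, 6]: unique up to associates
```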

However, what the statement is telling you is that in an arbitrary commutative ring $R$, it is possible to have two elements that do not have a greatest common divisor at all.

For instance, in $\mathbb{Z}[\sqrt{-3}]$, the elements $4$ and $2+2\sqrt{-3}$ have no greatest common divisor. For clearly $2$ divides both, and so does $1+\sqrt{-3}$ (since $(1+\sqrt{-3})(1-\sqrt{-3})=4$ and $2+2\sqrt{-3}=2(1+\sqrt{-3})$). So if there is a greatest common divisor, then it must be a multiple of both $2$ and $1+\sqrt{-3}$. But any multiple of $2$ that divides $4$ must be equal to $\pm 2$ or to $\pm 4$; the former are not multiples of $1+\sqrt{-3}$, and the latter do not divide $2+2\sqrt{-3}$. So $4$ and $2+2\sqrt{-3}$ do not have a greatest common divisor in $\mathbb{Z}[\sqrt{-3}]$.
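To make the last step concrete, here is a short brute-force verification in Python (a sketch of my own, not part of the argument above): elements $x+y\sqrt{-3}$ are represented as pairs $(x,y)$, divisibility is tested by a bounded search using the norm $x^2+3y^2$, and the script lists all nonzero common divisors of $4$ and $2+2\sqrt{-3}$ and confirms that none of them is divisible by every other.

```python
# Brute-force check that 4 and 2 + 2*sqrt(-3) have no gcd in Z[sqrt(-3)] (illustration only).
# A pair (x, y) represents x + y*sqrt(-3); the norm x^2 + 3y^2 is multiplicative.

def norm(a):
    x, y = a
    return x * x + 3 * y * y

def mul(a, b):
    # (x1 + y1*s)(x2 + y2*s) with s^2 = -3
    x1, y1 = a
    x2, y2 = b
    return (x1 * x2 - 3 * y1 * y2, x1 * y2 + y1 * x2)

def divides(d, a):
    # d | a (for d != 0) iff a = d*q for some q; any such q has norm(q) = norm(a)/norm(d)
    if norm(a) % norm(d) != 0:
        return False
    limit = int((norm(a) // norm(d)) ** 0.5) + 1
    return any(mul(d, (qx, qy)) == a
               for qx in range(-limit, limit + 1)
               for qy in range(-limit, limit + 1))

A, B = (4, 0), (2, 2)  # 4 and 2 + 2*sqrt(-3)

# Nonzero common divisors: their norm divides norm(A) = 16, so coordinates are small.
common = [(x, y) for x in range(-4, 5) for y in range(-3, 4)
          if (x, y) != (0, 0) and divides((x, y), A) and divides((x, y), B)]

# A gcd would be a common divisor divisible by every other common divisor.
gcds = [d for d in common if all(divides(c, d) for c in common)]
print(common)  # contains (2, 0) and (1, 1), i.e. 2 and 1 + sqrt(-3)
print(gcds)    # [] -- no greatest common divisor exists
```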

Arturo Magidin
  • Thank you for your answer! The portion which says, "the former are not multiples of $1+\sqrt{-3}$" looks confusing, for if by "former" you mean both $\pm 2$ and $\pm 4$, then $1+\sqrt{-3}$ divides $\pm 4$. I think I might be missing something. – Thomas Finley Feb 16 '24 at 19:21
  • @ThomasFinley "the former" are $\pm 2$. Neither $2$ nor $-2$ is a multiple of $1+\sqrt{-3}$; and neither $4$ nor $-4$ divides $2+2\sqrt{-3}$. – Arturo Magidin Feb 16 '24 at 19:22
  • Ok, that solves the confusion. Thanks! +1 from me. – Thomas Finley Feb 17 '24 at 04:43