
Let $A$ be a commutative ring. I am trying to show that if $f(x_1,x_2,\ldots, x_r) \in A[x_1,x_2,\ldots, x_r]$ is a zero divisor then there exists $a$ in $A-\{0\}$ such that $af=0$ in $A[x_1,x_2,\ldots, x_r]$.
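For concreteness, here is a small brute-force sanity check of the statement over the ring $\mathbb{Z}/6\mathbb{Z}$ (the choice of ring, the degree bound, and the helper names are mine and only for illustration): every non-zero $f$ of degree at most $2$ that is killed by some non-zero polynomial of degree at most $2$ should also be killed by a non-zero constant.

```python
# Brute-force check over the (illustrative) ring A = Z/6Z: every f of degree <= 2
# killed by some non-zero polynomial of degree <= 2 is also killed by a constant a != 0.
from itertools import product

M = 6  # A = Z/6Z

def pmul(f, g):
    """Product of two coefficient lists (lowest degree first) over Z/MZ."""
    h = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            h[i + j] = (h[i + j] + a * b) % M
    return h

def is_zero(f):
    return all(c == 0 for c in f)

polys = [list(p) for p in product(range(M), repeat=3) if any(p)]  # non-zero, deg <= 2

for f in polys:
    if any(is_zero(pmul(f, g)) for g in polys):                 # f is a zero divisor
        assert any(is_zero(pmul([a], f)) for a in range(1, M))  # ... killed by a constant
print("checked all zero divisors of degree <= 2 in (Z/6Z)[x]")
```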

What I have tried so far is the following.
I am using induction on $r$ (not fixing the ring). I proved the base case $r=1$ by taking $g\neq 0$ of minimal degree such that $fg=0$. My induction hypothesis is: for $r \geq 2$, whenever $f \in A[x_1,x_2,\ldots, x_n]$ with $n<r$ is a zero divisor, there is $a$ in $A-\{0\}$ such that $af=0$. For the final step let $f(x_1,x_2,\ldots, x_r) \in A[x_1,x_2,\ldots, x_r]$ be a zero divisor. Since $A[x_1,x_2,\ldots, x_r]=A[x_1][x_2,x_3,\ldots, x_r]$, the induction hypothesis (applied over the base ring $A[x_1]$, where $f$ involves only the $r-1$ variables $x_2,\dots,x_r$) gives $g$ in $A[x_1]-\{0\}$ such that $fg=0$.

I cannot proceed further. Am I correct so far? Please help me. Thank you.

Via
  • 15 days in the site: you should really make an effort to learn the easy instructions to properly write mathematics in this site. – Timbuc Sep 27 '14 at 10:38
  • I don't know why it is not coming... Sorry – Via Sep 27 '14 at 10:39
  • It's not coming because you don't do what you're supposed to do. Wanting is not enough to have things written nicely here. For example, one thing is f(x1,..,xn) and another $\;f(x_1,\dots,x_n)\;$. The main difference is the use of dollar signs, and instead of x1 use x_1. There are several simple rules for this. – Timbuc Sep 27 '14 at 10:40
  • Ok, from next time onwards I will try my best. – Via Sep 27 '14 at 10:42
  • Well, you could begin by editing this post... Observe you can enter "edit" on any question/answer on the site and learn how things are typed there. – Timbuc Sep 27 '14 at 10:44
  • @Via You can start by taking the tour. It's a first guide. Also it would be best if you learn to use MathJax. I think you will find it very useful. – gebruiker Sep 27 '14 at 10:47
  • Thanks to all of you for the nice guidance; forgive me for my ignorance. – Via Sep 27 '14 at 10:50
  • While thinking about this problem, I found that $f$ is killed at least by some homogeneous polynomial $\neq 0$. See http://math.stackexchange.com/questions/83121 – Martin Brandenburg Sep 27 '14 at 14:03
  • What is $A$? Just any ring? – nigel Sep 27 '14 at 15:24
  • $A$ is a commutative ring, @nigelvr – Via Sep 27 '14 at 16:27
  • Here are some observations: You know from the $n=1$ case that if $f \in A[x_1,\cdots,x_n]$ is a zero divisor, it is killed by an element of $A[x_1,\cdots,x_{n-1}]$. Now $f$ is contained in a finitely-generated (free) $A[x_1,\cdots,x_{n-1}]$-module $M$ (take the generators to be the powers of $x_n$ appearing in $f$). Then multiplication by $g$ defines an endomorphism $M \to M$, and $f$ lies in the kernel. In particular, multiplication by $g$ is a diagonal action. Maybe Cayley-Hamilton can be applied? – Fredrik Meyer Sep 29 '14 at 18:59
  • @Via Do you really need an inductive solution? Otherwise simpler arguments can solve your problem. – user26857 Sep 30 '14 at 09:34
  • I want it first in my approach... What's your simpler argument? Please explain it. – Via Sep 30 '14 at 09:53
  • Use the lexicographical order on monomials. (Btw, you could have asked for more details under my answer before starting a bounty.) – user26857 Sep 30 '14 at 14:11

1 Answer


From your reasoning it follows that $g(x_1)$ annihilates the (non-zero) coefficients of $f$, regarded as a polynomial in $x_2,\dots,x_r$ with coefficients in $A[x_1]$: since $g\in A[x_1]$, multiplying $f$ by $g$ just multiplies each of these coefficients by $g$, and all of them must vanish. All you have to do now is to prove the following extension of the case $r=1$:

Let $f_1,\dots,f_t\in A[X]$ be non-zero polynomials. If there is $g\in A[X]$, $g\ne 0$, such that $gf_1=\cdots=gf_t=0$, then there is $a\in A$, $a\ne 0$, such that $af_1=\cdots=af_t=0$.

We show by induction on $d=\max_{1\le i\le t}\deg f_i$ that there is an element $a$ as before which moreover belongs to the ideal generated by the coefficients of $g$.

If $d=0$ then each $f_i$ is a non-zero constant and any non-zero coefficient of $g$ works. For $d\ge 1$ write $f_i(X)=\sum_{j=0}^{n_i}a_{ij}X^j$ with $n_i=\deg f_i$, $\max\{n_1,\dots,n_t\}=d$, and $g(X)=\sum_{j=0}^nb_jX^j$, where $n=\deg g$. We may assume $n\ge 1$, for if $n=0$ we can simply take $a=b_0$. Comparing the coefficients of $X^{n+n_i}$ in $gf_i=0$ gives $b_na_{in_i}=0$ for $i=1,\dots,t$. Now consider two cases:
(1) If $a_{1n_1}g=\cdots=a_{tn_t}g=0$, then set $f_i'=f_i-a_{in_i}X^{n_i}$ and discard those $f_i'$ that are zero. (If all of them are zero, any non-zero coefficient of $g$ works, because $a_{in_i}g=0$.) Since $gf_i'=0$ for $i=1,\dots,t$ and $\max_{1\le i\le t}\deg f_i'<d$, the induction hypothesis gives an element $a\ne 0$ in the ideal generated by the coefficients of $g$ such that $af_1'=\cdots=af_t'=0$. Moreover $a\,a_{in_i}=0$ for every $i$, because $a$ is a combination of the coefficients of $g$ and $a_{in_i}g=0$; hence $af_1=\cdots=af_t=0$.
(2) If there is $1\le i\le t$ such that $a_{in_i}g\ne 0$, then $(a_{in_i}g)f_1=\cdots=(a_{in_i}g)f_t=0$ and $\deg (a_{in_i}g)<\deg g$, since its coefficient in degree $n$ is $b_na_{in_i}=0$. An induction argument on $\deg g$ then gives $a\ne 0$ in the ideal generated by the coefficients of $a_{in_i}g$, hence in the ideal generated by the coefficients of $g$, such that $af_1=\cdots=af_t=0$.
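The proof above is essentially an algorithm. Here is a sketch of it in Python (the ring $\mathbb{Z}/12\mathbb{Z}$, the function name `common_annihilator`, and the sample polynomials are chosen only for illustration): given non-zero $f_1,\dots,f_t$ and a non-zero $g$ with $gf_1=\cdots=gf_t=0$, it follows cases (1) and (2) to produce a constant $a\ne 0$ with $af_1=\cdots=af_t=0$.

```python
# Sketch of the algorithm contained in the proof, over the illustrative ring A = Z/12Z.
# Polynomials are lists of coefficients, lowest degree first.

def trim(f):
    """Drop trailing zero coefficients; the zero polynomial becomes []."""
    while f and f[-1] == 0:
        f = f[:-1]
    return f

def pmul(f, g, m):
    """Product of two coefficient lists over Z/mZ."""
    if not f or not g:
        return []
    h = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            h[i + j] = (h[i + j] + a * b) % m
    return trim(h)

def common_annihilator(g, fs, m):
    """Given non-zero g and non-zero f_1,...,f_t with g*f_i = 0 over Z/mZ, return
    a constant a != 0 with a*f_i = 0, mirroring cases (1) and (2) of the proof."""
    g = trim(g)
    fs = [trim(f) for f in fs if trim(f)]
    if not fs or all(len(f) == 1 for f in fs):
        # d = 0 (or nothing left): any non-zero coefficient of g annihilates the f_i
        return next(c for c in g if c)
    if len(g) == 1:
        # deg g = 0: g is itself a non-zero constant annihilator
        return g[0]
    for f in fs:
        fg = pmul([f[-1]], g, m)
        if fg:
            # case (2): the leading coefficient of f does not kill g;
            # deg(f[-1]*g) < deg g, so this recursion lowers deg g
            return common_annihilator(fg, fs, m)
    # case (1): every leading coefficient kills g; strip the leading terms,
    # which lowers d = max deg f_i
    return common_annihilator(g, [f[:-1] for f in fs], m)

# Illustrative use: over Z/12Z, g = 4x + 8 kills f_1 = 3x^2 + 6 and f_2 = 9x.
m, g, fs = 12, [8, 4], [[6, 0, 3], [0, 9]]
assert all(pmul(g, f, m) == [] for f in fs)
a = common_annihilator(g, fs, m)
assert a % m != 0 and all(pmul([a], f, m) == [] for f in fs)
print("constant annihilator:", a)  # prints 8 here
```

The two recursive calls correspond exactly to the two inductions in the proof: case (1) lowers $d=\max_i\deg f_i$, case (2) lowers $\deg g$.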

user26857