Suppose that $R$ is a nonzero commutative ring with unit, and $R^n$ is the free module of rank $n$ over $R$. Is it true that any $n+1$ elements of $R^n$ must be linearly dependent?
-
Yes, this is true. In fact, something more general is true: if $M$ is an $R$-module that can be generated by $n$ elements, then any $n+1$ elements of $M$ are linearly dependent. This is question 2) in this MSE post. While this answers your question completely in the affirmative (because $R^{n}$ can be generated by $n$ elements), I think there might be a simpler argument in the special case when $M=R^{n}$ is a free module, so I will not vote to close (as a duplicate) for now. – Prism Jul 12 '15 at 06:01
-
If $R$ is a domain, you can embed it into its quotient field and use the fact that this is true for fields. – Prahlad Vaidyanathan Jul 12 '15 at 06:19
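(To spell out the comment above, as a sketch: if $R$ is a commutative domain with quotient field $K$, then any $n+1$ vectors $v_1,\ldots,v_{n+1}\in R^{n}\subseteq K^{n}$ are linearly dependent over $K$ because $\dim_K K^{n}=n$, and clearing denominators turns the $K$-relation into an $R$-relation:
\begin{equation}
\sum_{i=1}^{n+1} c_i v_i = 0,\quad c_i\in K\ \text{not all } 0 \qquad\Longrightarrow\qquad \sum_{i=1}^{n+1} (d\,c_i)\, v_i = 0,\quad d\,c_i\in R\ \text{not all } 0,
\end{equation}
where $d\in R\setminus\{0\}$ is a common denominator of the $c_i$.)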
-
@PrahladVaidyanathan True. However, I have a question about the term "domain". According to this Wikipedia page, a domain is not required to be commutative, but we need multiplication in $R$ to be commutative in order to define its quotient field $Q(R)$. Is your "domain" commutative or not? – Sam Wong Dec 27 '22 at 07:00
1 Answer
Suppose that there are $n+1$ linearly independent elements $x_1, x_2, \ldots, x_{n+1}$ in $M=R^{n}$. Consider the submodule $F$ generated by these $n+1$ elements. Then $F\cong R^{n+1}$, since the $x_i$ are linearly independent generators of $F$ and hence form a basis. Since $F$ is a submodule of $M=R^{n}$, it follows that there is an injective $R$-linear map $R^{n+1}\to R^{n}$. But this is impossible!
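(To make that injection explicit: writing $e_1,\ldots,e_{n+1}$ for the standard basis of $R^{n+1}$, the composite
\begin{equation}
R^{n+1} \xrightarrow{\ \cong\ } F \hookrightarrow R^{n}, \qquad e_i \mapsto x_i,
\end{equation}
is $R$-linear and injective, because the first arrow is an isomorphism and the second is an inclusion.)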
That there is no injective $R$-linear map $R^{n+1}\to R^{n}$ follows from this (famous?) exercise in Atiyah & Macdonald's commutative algebra textbook (Exercise 2.11). See this MathOverflow thread. Let me copy the statement of the exercise and a beautiful solution (given by Balazs Strenner) for convenience.
Proposition. If $A$ is a nonzero commutative ring with $1$, and there is an injective $A$-linear map $A^{m}\to A^{n}$, then $m\leq n$.
Proof. Assume by contradiction that there is an injective map $\phi: A^m \to A^n$ with $m>n$. The first idea is to regard $A^n$ as a submodule of $A^m$, say the submodule generated by the first $n$ coordinates, so that $\phi$ becomes an endomorphism of the finitely generated $A$-module $A^m$ whose image lies in $A^n$. Then, by the Cayley-Hamilton Theorem (Proposition 2.4 in Atiyah & Macdonald), $\phi$ satisfies some polynomial equation \begin{equation} \phi^k + a_{k-1} \phi^{k-1} + \cdots + a_1 \phi + a_0 = 0. \end{equation} Using the injectivity of $\phi$, it is easy to see that if this polynomial has the minimal possible degree, then $a_0 \ne 0$. But then, applying this polynomial in $\phi$ to $(0,\ldots,0, 1)$, the last coordinate of the result is $a_0$, which is a contradiction, as it should be zero.
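(For concreteness, here is the last step written out; $e_m=(0,\ldots,0,1)$ denotes the last standard basis vector of $A^m$, and $k$ is the minimal degree above:
\begin{equation}
0 = \bigl(\phi^{k} + a_{k-1}\phi^{k-1} + \cdots + a_1\phi + a_0\bigr)(e_m) = \phi\Bigl(\bigl(\phi^{k-1} + a_{k-1}\phi^{k-2} + \cdots + a_1\bigr)(e_m)\Bigr) + a_0 e_m.
\end{equation}
The first summand lies in the image of $\phi$, hence in $A^{n}$, so its $m$-th coordinate is $0$ (recall $m>n$); the $m$-th coordinate of $a_0 e_m$ is $a_0$. Comparing $m$-th coordinates gives $a_0=0$, contradicting $a_0\ne 0$.)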
-
Thanks for your answer; I think I should read Atiyah's textbook for more details. – Xiang Yu Jul 12 '15 at 07:02
-
@XiangYu You are welcome! Yes, I'd highly recommend Atiyah & Macdonald's book for learning commutative algebra. And try doing most of the exercises; there are some gems among them. For example, already in Chapter 1, one of the exercises outlines a remarkable proof of the existence of algebraic closures (due to Emil Artin). While this particular exercise is not really about commutative algebra, I think such exercises are very enriching. – Prism Jul 12 '15 at 07:09
-
http://math.stackexchange.com/questions/574653/infinite-linear-independent-family-in-a-finitely-generated-a-module/575019#575019 – user26857 Jul 12 '15 at 11:00
-
@user26857: Indeed. I had already posted the link in the comments to the question. Orzech's result is pretty powerful. – Prism Jul 12 '15 at 23:18