5

I would like to find some information about the following propositions, and unfortunately I haven't been able to find any.

Let $a_1,\dots,a_n\in\mathbb{Z}$ with $\gcd(a_1,\dots,a_n)=1$. Then there exists a matrix $A\in M_{n\times n}(\mathbb{Z})$ with first row $(a_1,\dots, a_n)$ such that $\det A=1$.


Or in another case:

Let $F$ be a field and $f_1,\dots,f_n\in F[x_1,\dots, x_r]$ with $\gcd(f_1,\dots,f_n)=1$. Then there exists a matrix $A\in M_{n\times n}(F[x_1,\dots,x_r])$ with first row $(f_1,\dots, f_n)$ such that $\det A=1$.

Does somebody know something about this problem?

Thanks.

Note: I found the problem stated here but I haven't found any more info. In the link, it says:

This fundamental question generated an enormous amount of mathematics (giving birth to some new fields) and was finally settled almost simultaneously by D. Quillen and A. A. Suslin, independently. Now, there are fairly elementary proofs of this which require only some knowledge of polynomials and a good background in linear algebra. This could be an excellent project for someone who wants to learn some important and interesting mathematics.

Alex Mathers
  • 18,509
  • Do you mean Det A= \pm 1? – Martín Vacas Vignolo Mar 19 '16 at 01:15
  • Nope. But it's equivalent. If we find a matrix with determinant -1 we can do a permutation of rows to get a matrix with determinant 1.

    And with a permutation of rows we can also go from a matrix with determinant 1 to a matrix with determinant -1. Except in the case of $n=2$ where we would have to make a permutation of columns instead.

    – LeviathanTheEsper Mar 19 '16 at 01:20
  • But the first row is (a1...,an) or any permutation? – Martín Vacas Vignolo Mar 19 '16 at 01:22
  • 1
Hummm, my mistake. It would have to be exactly $(a_1,\cdots,a_n)$. But in the case of $n=2$, since $\gcd(a_1,a_2)=1$ there exist $\alpha_1,\alpha_2$ such that $a_1\alpha_1+a_2\alpha_2=1$, and this is the determinant of the matrix

    $$\begin{bmatrix}a_1& a_2\\ -\alpha_2 & \alpha_1\end{bmatrix}$$

    In any other case we can interchange rows, without interchanging columns.

    – LeviathanTheEsper Mar 19 '16 at 01:29
  • added jpeg from the Lam book to my answer – Will Jagy Mar 19 '16 at 04:45

3 Answers

3

Here is an inductive proof of the first proposition:

The proposition is trivially true for $n = 1$.

For $n = 2$, consider the matrix $$A = \begin{pmatrix} a_1 & a_2 \\ r & s \end{pmatrix}.$$ By Bézout's identity, there exist integers $r$ and $s$ such that $\det A = a_1s - a_2r = \gcd(a_1, a_2) = 1$. Hence, the proposition also holds for $n = 2$.
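The $n = 2$ construction is easy to carry out explicitly. Here is a small Python sketch (the function names are my own; it assumes positive inputs with $\gcd(a_1, a_2) = 1$, since handling signs would need an extra normalization):

```python
def extended_gcd(a, b):
    """Extended Euclid: return (g, x, y) with a*x + b*y == g == gcd(a, b)."""
    if b == 0:
        return (a, 1, 0)
    g, x, y = extended_gcd(b, a % b)
    return (g, y, x - (a // b) * y)

def complete_2x2(a1, a2):
    """Complete the row (a1, a2), with gcd 1, to a 2x2 integer matrix of determinant 1."""
    _, s, t = extended_gcd(a1, a2)  # a1*s + a2*t == 1
    return [[a1, a2], [-t, s]]      # det == a1*s - a2*(-t) == a1*s + a2*t == 1
```

For example, `complete_2x2(3, 5)` returns a matrix with first row $(3, 5)$ and determinant $1$.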

So suppose that the proposition holds for $n = m$. Let integers $a_1, a_2, \ldots, a_{m + 1}$ with $\gcd(a_1, a_2, \ldots, a_{m + 1}) = 1$ be given. Put $g = \gcd(a_1, a_2, \ldots, a_m)$ and $a_i = gb_i$ for $i = 1, 2, \ldots, m$, so that $\gcd(b_1, b_2, \ldots, b_m) = 1$. By the induction hypothesis, there is an $m \times m$ matrix $B$ with first row $b_1, b_2, \ldots, b_m$, integer entries, and determinant $1$.

Multiply the first row of $B$ by $g$ and place that new matrix in the upper-left block of an $(m + 1) \times (m + 1)$ matrix $A$. In the last row of $A$, put $rb_1, rb_2, \ldots, rb_m, s$, where $r$ and $s$ are integers to be determined. In the last column of $A$, put $a_{m + 1}, 0, 0, \ldots, 0, s$. So here is $A$: $$A = \begin{pmatrix} a_1 = g\color{teal}{b_1} & a_2 = g\color{teal}{b_2} & \cdots & a_m = g\color{teal}{b_m} & a_{m+1} \\ \color{teal}{b_{21}} & \color{teal}{b_{22}} & \cdots & \color{teal}{b_{2m}} & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ \color{teal}{b_{m1}} & \color{teal}{b_{m2}} & \cdots & \color{teal}{b_{mm}} & 0 \\ r\color{teal}{b_1} & r\color{teal}{b_2} & \cdots & r\color{teal}{b_m} & s \end{pmatrix}.$$

The determinant of the upper-left $m \times m$ block is $g\det B = g \cdot 1 = g$. The lower-left $m \times m$ block is $B$ except that the first row has been moved to the bottom and multiplied by $r$, so the determinant of that block is $(-1)^{m - 1}r\det B = (-1)^{m - 1}r$. Expanding the determinant of $A$ along its last column, we get $$\det A = (-1)^{1 + m + 1}a_{m + 1}(-1)^{m - 1}r + (-1)^{m + 1 + m + 1}sg = gs - a_{m + 1}r.$$

Now $\gcd(g, a_{m + 1}) = \gcd(\gcd(a_1, a_2, \ldots, a_m), a_{m + 1}) = \gcd(a_1, a_2, \ldots, a_m, a_{m + 1}) = 1$, so by Bézout's identity there exist integers $r$ and $s$ such that $\det A = gs - a_{m + 1}r = \gcd(g, a_{m + 1}) = 1$. Hence, the proposition holds for $n = m + 1$ whenever it holds for $n = m$. The induction principle guarantees that it holds for every positive integer $n$.
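The proof is constructive, so it can be run directly. Here is a sketch of the recursion in Python (the function names and the sign normalization in `bezout_one` are my own additions, there to guarantee the Bézout combination equals $+1$; for $n = 1$ it assumes $a_1 = 1$, since the determinant must equal $a_1$):

```python
from math import gcd
from functools import reduce

def extended_gcd(a, b):
    """Extended Euclid: return (g, x, y) with a*x + b*y == g (g may be negative)."""
    if b == 0:
        return (a, 1, 0)
    g, x, y = extended_gcd(b, a % b)
    return (g, y, x - (a // b) * y)

def bezout_one(a, b):
    """Return (s, t) with a*s + b*t == 1, assuming gcd(a, b) == 1."""
    g, s, t = extended_gcd(a, b)
    return (s, t) if g > 0 else (-s, -t)

def complete_unimodular(a):
    """Given integers a = [a1, ..., an] with gcd 1, return an n x n integer
    matrix with first row a and determinant 1, following the inductive proof."""
    n = len(a)
    if n == 1:
        return [[a[0]]]                       # assumes a[0] == 1
    if n == 2:
        s, t = bezout_one(a[0], a[1])         # a1*s + a2*t == 1
        return [[a[0], a[1]], [-t, s]]        # det == a1*s + a2*t == 1
    g = reduce(gcd, a[:-1])                   # gcd of the first n-1 entries
    b = [ai // g for ai in a[:-1]]            # gcd(b) == 1
    B = complete_unimodular(b)                # first row b, det 1, by induction
    s, t = bezout_one(g, a[-1])               # g*s + a[-1]*t == 1
    r = -t                                    # so that g*s - a[-1]*r == 1
    rows = [a[:]]                             # row 1: (g*b1, ..., g*b_m, a_n)
    rows += [row + [0] for row in B[1:]]      # middle rows of B, padded with 0
    rows.append([r * bi for bi in b] + [s])   # last row: (r*b1, ..., r*b_m, s)
    return rows
```

For instance, `complete_unimodular([6, 10, 15])` produces a $3 \times 3$ integer matrix with first row $(6, 10, 15)$ and determinant $1$.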

user0
  • 3,247
2

This is on page 13 of Newman, Integral Matrices. The section is called Completion to a unimodular matrix. Your fact is Theorem II.1.

In general, you can start with $k$ rows and complete with determinant $1$, as long as certain determinants of submatrices have gcd $1$. I will see if I can find that.

See also THEOREMS

And then Serre Quillen Suslin Vaserstein

In other words, all you need is Lang's Algebra. There is also a whole book on it by T.-Y. Lam.

[image: scan of the relevant page from Lam's book]

Will Jagy
  • 139,541
  • Do you think you could comment on my parenthetical in my comment / answer below? I got kind of confused. – Elle Najt Mar 19 '16 at 01:23
The article seems interesting. I get a little confused with terms like "the determinant divisor of $\alpha$", where $\alpha$ is some $m\times n$ matrix, but otherwise I think that's the solution to the problem, at least in a more particular case (which covers my first case), when the ring is principal (in the second case we only know it's Noetherian).

    Unfortunately I don't have access to the "Newman, Integral Matrices" book.

    – LeviathanTheEsper Mar 19 '16 at 01:41
  • @LeviathanTheEsper, see https://en.wikipedia.org/wiki/Quillen%E2%80%93Suslin_theorem – Will Jagy Mar 19 '16 at 04:13
1

This is more of an extended comment than a real answer:

You could probably use the characterization that $\gcd(f_1, \ldots, f_n) = 1$ iff there is an $F[x_1, \ldots, x_r]$-linear combination of the $f_i$ equal to $1$ - i.e. some polynomials $g_i$ so that you can write $\sum g_i f_i = 1$. (Same for $\mathbb{Z}$, where this is called Bézout's identity.)

Then presumably some playing around with the determinant will let you set it up so that the minors of the matrix you get by removing the first column have the values you want.

(This is asserting the existence of an $(n-1)$-plane with specific Plücker coordinates. Generally there would be some relations between the minors, the Plücker relations, but the Grassmannian of $(n-1)$-planes in $n$-space is a projective space $\mathbb{P}^{n-1}$, so there shouldn't be any in this case. I feel I am cheating a little bit here, because we are not working over a field, or anyway I don't feel confident in this intuition.)
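For what it's worth, the multi-term Bézout coefficients from the first paragraph are easy to compute over $\mathbb{Z}$ by iterating the extended Euclidean algorithm; a small Python sketch (function names are my own):

```python
def extended_gcd(a, b):
    """Extended Euclid: return (g, x, y) with a*x + b*y == g."""
    if b == 0:
        return (a, 1, 0)
    g, x, y = extended_gcd(b, a % b)
    return (g, y, x - (a // b) * y)

def bezout_coefficients(nums):
    """Return (g, coeffs) with sum(c * a for c, a in zip(coeffs, nums)) == g,
    where g is a gcd of nums, accumulated left to right."""
    g, coeffs = nums[0], [1]
    for a in nums[1:]:
        g, x, y = extended_gcd(g, a)
        coeffs = [c * x for c in coeffs] + [y]  # rescale old coefficients, append new one
    return g, coeffs
```

For example, `bezout_coefficients([6, 10, 15])` returns gcd $1$ together with coefficients $c_i$ witnessing $6c_1 + 10c_2 + 15c_3 = 1$.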

See also:

https://en.wikipedia.org/wiki/Hilbert's_Nullstellensatz

https://en.wikipedia.org/wiki/B%C3%A9zout's_identity#For_polynomials

Elle Najt
  • 20,740
  • Yes. That is the first thing I thought for both the integer and the polynomial ring cases.

    But beyond the case $n=2$ I haven't been able to solve it, neither by induction nor by using the general case (since changing the values of one of the columns will change the values of almost all the minors).

    – LeviathanTheEsper Mar 19 '16 at 01:33
  • Area, there is a simplified proof in Lang's Algebra; also a book on it by Lam. The conjecture goes back to Serre. https://en.wikipedia.org/wiki/Quillen%E2%80%93Suslin_theorem – Will Jagy Mar 19 '16 at 04:17
  • added jpeg from the Lam book to my answer – Will Jagy Mar 19 '16 at 04:45
  • @WillJagy Wow, thanks. The problem looks so innocent. – Elle Najt Mar 19 '16 at 04:48