
Usually, we define a polynomial as

$a_n x^n + \cdots + a_1 x + a_0$

where $x$ is called the indeterminate.

Would it be better to define it as

$a_n x^n + \cdots + a_1 x + a_0 x^0$

where $x^0$ denotes the identity element of the structure to which $x$ belongs.

For the study of polynomials themselves, I think these two definitions make no difference. But it matters when you treat a polynomial (expression) as a polynomial function. If we use the second definition, we can simply substitute some value (e.g. a square matrix) for every occurrence of $x$. With the first definition, we instead have to define the function in an "ad hoc" way: we must explicitly multiply the constant term $a_0$ by an identity element so that $a_1 x$ and $a_0$ become addable.
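To make the matrix case concrete, here is a small sketch of the "ad hoc" step the question describes: evaluating a polynomial at a square matrix, where the constant term $a_0$ must be promoted to $a_0 I$ (which is exactly $a_0 X^0$ under definition 2). The helper names and the example matrix are my own, chosen for illustration.

```python
def identity(n):
    """The n-by-n identity matrix, i.e. X^0 for any n-by-n matrix X."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def mat_add(A, B):
    n = len(A)
    return [[A[i][j] + B[i][j] for j in range(n)] for i in range(n)]

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def poly_eval(coeffs, X):
    """Evaluate a_0 + a_1 X + ... + a_n X^n at a square matrix X.

    coeffs = [a_0, a_1, ..., a_n].  Starting the running power at the
    identity matrix is precisely the a_0 -> a_0 * I promotion: under
    definition 2 it is just the term a_0 * X^0.
    """
    n = len(X)
    power = identity(n)                  # X^0
    result = [[0] * n for _ in range(n)]
    for a in coeffs:
        result = mat_add(result, [[a * e for e in row] for row in power])
        power = mat_mul(power, X)
    return result

A = [[1, 2], [0, 3]]
# p(x) = 2 + x + x^2, so p(A) = 2I + A + A^2
print(poly_eval([2, 1, 1], A))   # [[4, 10], [0, 14]]
```

Note that nothing here depends on $X$ being a matrix specifically; any structure with an addition, a multiplication, and a multiplicative identity would do.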

Is the second definition equivalent to the first one?

If yes, then is it true that the authors of those texts actually mean def. 2 when they define the polynomial using def. 1, and they just omit the identity element?

If not, why not? Would it be nicer to replace the non-addable $a_1x + a_0$ with $a_1x + a_0 x^0$ (even though we don't actually need to add these two terms together when studying polynomials themselves)?

Not an ID
  • It really doesn't matter. Quite often, when written in sigma notation, we write it as $\sum_{i=0}^n a_ix^i$, even if we otherwise write $a+bx+cx^2$. I often use this notation for one of the reasons we define $0^0=1$ :) – Thomas Andrews Aug 07 '14 at 03:33
  • Please give an example of what you mean by the second requires "adding an identity... to make addable" – Bill Dubuque Aug 07 '14 at 03:34
  • Did you mean $x^0$ instead of $x_0$ in that last line? – Thomas Andrews Aug 07 '14 at 03:35
  • You can also define a polynomial as a sequence $(a_0,a_1,\dots)$ where only finitely many $a_i$ are non-zero. Then $(1,0,0,\dots)$ is not $x^0$, where $x=(0,1,0,0,\dots)$, but rather $(1,0,0,0,0,\dots)$ is just the identity. – Thomas Andrews Aug 07 '14 at 03:37
  • @BillDubuque Suppose that $x$ is a matrix, then $a_1 x + a_0$ is not a valid expression, but if you "add" the identity matrix $I$ to $a_0$, you get $a_0I$, which makes $a_1 x + a_0 I$ a valid expression. – Not an ID Aug 07 '14 at 04:09
  • Congratulations, you've independently invented modules! http://en.wikipedia.org/wiki/Module_(mathematics) – Pseudonym Aug 07 '14 at 04:14
  • @NotanID See this answer on $R$-algebras. – Bill Dubuque Aug 07 '14 at 04:27
  • Why is it harder to add $a_0+b_0$ than it is to add $a_0x^0+b_0x^0$. You still haven't said. – Thomas Andrews Aug 07 '14 at 04:58
  • It doesn't matter since as you said yourself multiplication by $x^0$ doesn't change the element being multiplied. Nevertheless it does make a subtle pattern more obvious, as in people can understand how that constant fits into the general scheme :) – Sidharth Ghoshal Aug 07 '14 at 05:00
  • @frogeyedpeas But def. 2 will allow substitution, which can cause a problem. Please see my comment on Thomas Andrews's answer for the details. – Not an ID Aug 07 '14 at 05:02
  • @NotanID It seems to me that the issue is the formal domain of the function isn't well defined. I'm taking a computer-scientist assumption that every function is assumed to come with a list of things that can be considered arguments for it, and that $x^0$ denotes the identity element of the groups of all possible arguments for which the polynomial can be evaluated. The type mismatch issue you helped answer earlier could've been dodged if the asker realized that the polynomial only acted on Matrices and $x^0$ was implied to be a matrix – Sidharth Ghoshal Aug 07 '14 at 05:08
  • @ThomasAndrews No, not $a_0 + b_0$, but $a_1x + a_0$. You will never want or need to add $a_1x$ and $a_0$ together (to get a somewhat more simplified expression) when you treat the polynomial as an element of the polynomial ring; there is no problem there. But when you treat it as a function and plug a specific matrix, say $A$, into it, you get a problem, since $a_1A + a_0$ is not a valid expression; under the second definition you get $a_1A + a_0I$, which is valid. Sometimes whether or not I can perform the substitution is important (for example, see the link I mentioned in your answer's comments). – Not an ID Aug 07 '14 at 05:10
  • $a_0+a_1x$ is $a_0+a_1x$. What is the difference? $x$ is an indeterminate. – Thomas Andrews Aug 07 '14 at 10:58
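The substitution pitfall raised in the comments can be sketched concretely for a $2\times 2$ matrix. Naively substituting $A$ for $\lambda$ on both sides of $p(\lambda) = \det(\lambda I - A)$ would give the scalar $\det(A - A) = 0$ on the right, while substituting into the polynomial expression itself (with $a_0$ read as $a_0 I$, as in def. 2) gives the zero matrix, which is the Cayley–Hamilton theorem. The helper names and example matrix below are mine, for illustration only.

```python
def char_poly_2x2(A):
    """Coefficients [a_0, a_1, a_2] of p(t) = det(t*I - A) = t^2 - tr(A)*t + det(A)
    for a 2x2 matrix A."""
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [det, -tr, 1]

def p_of_A(coeffs, A):
    """p(A) with the constant term read as a_0 * A^0 = a_0 * I (def. 2)."""
    a0, a1, a2 = coeffs
    A2 = [[sum(A[i][k] * A[k][j] for k in range(2)) for j in range(2)]
          for i in range(2)]
    return [[a2 * A2[i][j] + a1 * A[i][j] + (a0 if i == j else 0)
             for j in range(2)] for i in range(2)]

A = [[1, 2], [0, 3]]
coeffs = char_poly_2x2(A)        # p(t) = t^2 - 4t + 3
# Substituting A into the polynomial expression yields the ZERO MATRIX
# (Cayley-Hamilton), not the scalar 0 from det(A - A) -- the two
# "substitutions" land in different structures.
print(p_of_A(coeffs, A))         # [[0, 0], [0, 0]]
```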

1 Answer


It really doesn't matter which way you define a polynomial. Another way is to consider all sequences:

$$(a_0,a_1,\dots,a_n,\dots)$$ where only finitely many $a_i$ are non-zero.

Then we add sequences point-wise, and we find their products by the Cauchy product.

Then $(1,0,0,0,\dots)$ is the multiplicative identity, and $(0,1,0,0,\dots)$ is $x$. So $1$ isn't really $x^0$, and $x^2$ just means $x\cdot x$.
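This sequence definition can be sketched in a few lines of code: a polynomial is a finite list of coefficients $(a_0, a_1, \dots)$, addition is point-wise, and multiplication is the Cauchy product $c_k = \sum_{i+j=k} a_i b_j$. The function names are my own.

```python
def poly_add(p, q):
    """Point-wise sum of two coefficient sequences (a_0, a_1, ...)."""
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))   # pad the shorter sequence with zeros
    q = q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def poly_mul(p, q):
    """Cauchy product: c_k = sum over i + j = k of a_i * b_j."""
    c = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            c[i + j] += a * b
    return c

one = [1]        # (1, 0, 0, ...): the multiplicative identity
x = [0, 1]       # (0, 1, 0, ...): the element called x

print(poly_mul(x, x))           # [0, 0, 1], i.e. x*x = x^2
print(poly_mul(one, [3, 5]))    # [3, 5]: multiplying by the identity
```

Note how the identity $(1,0,0,\dots)$ and the indeterminate $(0,1,0,\dots)$ are just two different sequences here; $x^2$ arises as the product $x \cdot x$, with no exponent notation built in.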

Thomas Andrews
  • For the study of the polynomial itself, it really doesn't matter. It is when you treat it as a function that a problem arises. If you think def. 2 is equivalent to def. 1, then you will run into trouble with the formula $p(\lambda) = \det(\lambda I - A)$, where $p$ is the characteristic polynomial of the matrix $A$: because substitution is allowed, we can substitute $A$ for the $\lambda$ on both sides, and then you get "zero (that is, the number $0$) $=$ ZERO (that is, the zero matrix)". (to be continued) – Not an ID Aug 07 '14 at 04:27
  • ... Please see this question for detail. – Not an ID Aug 07 '14 at 04:28
  • @NotanID No, there is no problem when treating it as a function, as long as you accept that $0^0=1$. – Thomas Andrews Aug 07 '14 at 04:56