What is the connection between polynomial dimensions and geometric dimensions, and their respective representation as matrix transformations in linear algebra?

Because they use matrix notation both for decomposing a geometric vector into its components in different dimensions, and for decomposing a polynomial into its components.

Both of these collections of components are then represented by lists of numbers in $\mathbb{R}^n$, right? And each linearly independent entry in a "vector" in $\mathbb{R}^n$, say $(1, 2, 3, 4, 5)$, corresponds to a different dimension, right?

I get it perfectly well that in the geometric case, each of those entries represents the amount of each basis vector you are adding together to create some other vector.

But then they go and use the same system $(1, 2, 3, 4, 5)$ for decomposing polynomials. So again, I guess it depends on the polynomials you choose as your basis, and the vector $(1, 2, 3, 4, 5)$ represents a scalar multiple of each basis polynomial, which you are then adding together to get the final polynomial "vector", right?

So if your basis is $\{1, x, x^2, x^3, x^4\}$, you would represent the polynomial $1 + x + x^2 + x^3 + x^4$ as $(1,1,1,1,1)$ with respect to that basis, right?
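To make that bookkeeping concrete, here is a minimal sketch of what I mean (plain Python, my own illustration; the helper name `evaluate` is just made up) that stores a polynomial as its coordinate vector with respect to that basis and evaluates it:

```python
# A polynomial in P_4, stored as its coordinate vector with respect to
# the monomial basis {1, x, x^2, x^3, x^4}.
coords = (1, 1, 1, 1, 1)  # stands for 1 + x + x^2 + x^3 + x^4

def evaluate(coords, x):
    """Evaluate the polynomial whose i-th coordinate multiplies x**i."""
    return sum(a * x**i for i, a in enumerate(coords))

print(evaluate(coords, 2))  # 1 + 2 + 4 + 8 + 16 = 31
```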

Now hear me out. If any of the basis vectors had $x$ raised to the same power, then they could be combined by simple addition and/or scalar multiplication (i.e., linear combination). Like you could combine $x$ and $2x$ to equal $3x$. This would make them linearly dependent because one is a multiple of the other or a combination of the others.

But in linear algebra, they do not allow this for $x^2$; i.e., you cannot say, "well, $x^2$ is just $x \cdot x$, so $x^2$ is a scalar multiple of $x$ and $x$, therefore a linear combination."

But this makes no sense, because $x$ is a number, too!

Linear independence means different dimensions, and they are saying that $x^2$ and $x$ live in different dimensions, just as the Cartesian basis vectors $(1,0)$ and $(0,1)$ live in different dimensions.

How on earth do they draw that equivalency? Makes absolutely no sense!

YuiTo Cheng
Dude
  • In this situation $x$ is not in the field of scalars, it is a vector. When you talk about scalar multiplication it refers specifically to elements in the field of scalars being multiplied to vectors. So, multiplying by $x$ is not scalar multiplication. – wgrenard Oct 14 '17 at 06:30
  • $x$ isn’t a number. It’s a formal symbol. When you evaluate a polynomial, you replace it with a number. So, as wgrenard says, $x$ is not an element of the scalar field of the vector space. – amd Oct 14 '17 at 07:07
  • Related, possibly helpful: https://math.stackexchange.com/questions/2185587/what-actually-is-a-polynomial/2185648#2185648 – Ethan Bolker Oct 14 '17 at 12:39

1 Answer

Let me put the terms "geometric space", "polynomial", "basis", and "dimension" into context and it will be clear what is going on.

Geometric space & Polynomial. The common concept here is that of a vector space. Just forget for a second the geometric perspective on vectors and vector spaces and look at it abstractly. The ingredients for a vector space are:

  • A set $V$, the "set of vectors".
  • An operator $+ \colon V \times V \rightarrow V$ that lets you add up vectors.
  • A scalar multiplication $\cdot \colon \mathbb{R} \times V \rightarrow V$ that lets you "scale" a vector $v \in V$ by a scalar $\lambda \in \mathbb{R}$ to $\lambda \cdot v$.

In order for the triple $(V, +, \cdot)$ to form a vector space, the operators $+$ and $\cdot$ need to satisfy some axioms that allow you to do things like $(\lambda + \mu)(u + v) = \lambda u + \lambda v + \mu u + \mu v$ for $u, v \in V$ and $\lambda, \mu \in \mathbb{R}$.
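As a sanity check, here is a minimal sketch (plain Python, with vectors as coordinate tuples; the helper names `vadd` and `smul` are mine) that verifies this identity on one concrete example:

```python
# Vectors as coordinate tuples; vadd and smul play the roles of "+" and "·".
def vadd(u, v):
    return tuple(a + b for a, b in zip(u, v))

def smul(lam, v):
    return tuple(lam * a for a in v)

u, v = (1.0, 2.0), (3.0, -1.0)
lam, mu = 2.0, 5.0

lhs = smul(lam + mu, vadd(u, v))
rhs = vadd(vadd(smul(lam, u), smul(mu, u)),
           vadd(smul(lam, v), smul(mu, v)))
assert lhs == rhs  # (λ + μ)(u + v) = λu + λv + μu + μv
print(lhs)         # (28.0, 7.0)
```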

You gave two examples for vector spaces:

  1. $(\mathbb{R}^n, +, \cdot)$, where $+$ denotes the addition of points (as position vectors) in $\mathbb{R}^n$ and $\cdot$ denotes the scalar multiplication of a real number $\lambda$ with a point (as position vector) in $\mathbb{R}^n$.

  2. $(P_n, +, \cdot)$, where $P_n$ denotes the set of polynomials of degree at most $n$, $+$ denotes the addition of polynomials, and $\cdot$ denotes the scalar multiplication of a real number with a polynomial.

Actually, you can interpret "polynomial" in two different ways here: as an "abstract polynomial", which is simply a finite sequence of "coefficients" $(a_0, \dots, a_n)$ with no "$x$" anywhere, or as a function (called a polynomial function) $\mathbb{R} \rightarrow \mathbb{R} \colon x \mapsto \sum_{i=0}^n a_i x^i$. In your setting, these two concepts are essentially the same, but maybe it helps to think of a polynomial as a function.
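To make the two readings concrete, here is a small sketch (plain Python, my own; the helper `as_function` is hypothetical) that turns an abstract polynomial, i.e. a coefficient sequence, into the corresponding polynomial function:

```python
def as_function(coeffs):
    """Turn the abstract polynomial (a_0, ..., a_n) into the
    polynomial function x -> a_0 + a_1*x + ... + a_n*x**n."""
    return lambda x: sum(a * x**i for i, a in enumerate(coeffs))

p = as_function((1, 0, 2))  # the abstract polynomial of 1 + 2x^2
print(p(3))                 # 1 + 2*9 = 19
```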

Basis & dimension. Say you found a finite set of vectors $v_1, \dots, v_m \in V$ with the properties that (i) they "span" the whole vector space $V$, in the sense that every vector is a linear combination $\sum_{i} \lambda_i v_i$ of them, and (ii) none of the $v_i$ is a linear combination of the others (that is, they are linearly independent). Then this set of vectors $v_1, \dots, v_m$ is called a basis of $V$. The fun thing here is that any basis of $V$ has the same number of vectors, and this number is called the dimension. Note: "a basis" but "the dimension".

A basis of $(\mathbb{R}^n, +, \cdot)$ is for instance $e_1, \dots, e_n$, where $e_i \in \mathbb{R}^n$ has all coordinates equal to zero except the $i$-th, which is one. So $e_2 = (0, 1, 0, \dots, 0)$. The dimension of $(\mathbb{R}^n, +, \cdot)$ is therefore $n$; any basis has $n$ vectors.
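As an illustration (using numpy, my own example, not part of the argument): stacking $e_1, \dots, e_n$ as rows gives the identity matrix, whose full rank $n$ confirms that these vectors are linearly independent and span $\mathbb{R}^n$:

```python
import numpy as np

n = 4
E = np.eye(n)  # row i (0-indexed) is the standard basis vector e_{i+1}
print(E[1])    # e_2 = [0. 1. 0. 0.]

# Full rank n means the rows span R^n and are linearly independent,
# so they form a basis and dim(R^n) = n.
print(np.linalg.matrix_rank(E))  # 4
```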

A basis of $(P_n, +, \cdot)$ would be $f_0, \dots, f_n$ with $f_i \colon \mathbb{R} \rightarrow \mathbb{R} \colon x \mapsto x^i$. In your question you used the short-hand notation $x^i$ for $f_i$, which I would like to avoid, to make a point (sic!) here.

A nice thing about this particular basis is that the polynomial $(a_0, \dots, a_n)$ (as abstract polynomial) resp. $x \mapsto a_0 + a_1 \cdot x + a_2 \cdot x^2 + \cdots + a_n x^n$ (as the polynomial function) can be easily decomposed as a linear combination of basis vectors, namely $a_0 f_0 + \cdots + a_n f_n$. Let us be precise about the notation here: When I write $a_0 + a_1 \cdot x + \cdots + a_n \cdot x^n$ then the $+$ refers to addition of real numbers. When I write $a_0 f_0 + \cdots + a_n f_n$ then the $+$ refers to addition of vectors, namely the ones in the vector space $(P_n, +, \cdot)$. For the fun of it one could say: The abstract polynomial $(a_0, \dots, a_n)$ has as coordinate vector $(a_0, \dots, a_n)$ for the ordered basis $f_0, \dots, f_n$ of the vector space $P_n$.
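A sketch of this decomposition (plain Python, my names): build the basis functions $f_i$, form the linear combination $a_0 f_0 + \cdots + a_n f_n$ as a function, and check pointwise that it agrees with the polynomial written out directly:

```python
def f(i):
    """The basis vector f_i: the polynomial function x -> x**i."""
    return lambda x: x ** i

def linear_combination(coeffs):
    """a_0 f_0 + ... + a_n f_n, where "+" is addition of functions."""
    return lambda x: sum(a * f(i)(x) for i, a in enumerate(coeffs))

a = (2, 0, 3)  # coordinate vector of 2 + 3x^2 in the basis f_0, f_1, f_2
p = linear_combination(a)
assert all(p(x) == 2 + 3 * x**2 for x in range(-5, 6))
```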

As a side note: $P_n$ has dimension $n+1$, just like $\mathbb{R}^{n+1}$ has.

Conclusion. In terms of linear algebra, the vectors (!) $f_1$ and $f_2$, which are your $x$ and $x^2$ in the question, are linearly independent, because there is no scalar $\lambda \in \mathbb{R}$ such that $\lambda f_1 = f_2$. In other words, you cannot scale the polynomial function $x \mapsto x$ in any way to get the polynomial function $x \mapsto x^2$. (From a calculus point of view: you cannot scale the graph of $x \mapsto x$ to obtain the graph of $x \mapsto x^2$.)

Another argument, but maybe harder to follow, would be the following. Consider the coefficient sequences (the abstract polynomials) of the two polynomials $f_1, f_2$. They are $(0, 1, 0, \dots)$ and $(0, 0, 1, 0, \dots)$. You cannot scale $(0, 1, 0, \dots)$ by scalar multiplication with $\lambda$ to get the vector $(0, 0, 1, 0, \dots)$. But I do not like this argument as much, because these coefficient sequences already look like the vector notation in $\mathbb{R}^n$.
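Both arguments can be illustrated numerically (a rough sketch only; checking a few sample points and integer scalars is evidence, not a proof):

```python
# First argument: if lambda * f_1 == f_2 as functions, the ratio
# f_2(x) / f_1(x) = x would have to be one constant lambda for all x != 0.
print({x**2 / x for x in (1, 2, 3)})  # {1.0, 2.0, 3.0}: no single lambda

# Second argument: no scalar multiple of (0, 1, 0) equals (0, 0, 1),
# since scaling never moves the nonzero entry to another slot.
f1, f2 = (0, 1, 0), (0, 0, 1)
print(any(tuple(lam * a for a in f1) == f2 for lam in range(-10, 11)))  # False
```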

Anyhow, in this sense your two polynomials "live in different dimensions", even though I would not phrase it like that, because in linear algebra the term "dimension" refers to the number of vectors in a basis, not to particular linearly independent directions of a vector space.

By the way, when you argue that $x^2 = x \cdot x$, you are not operating on vectors in a vector space; you are operating in the field of real numbers, which allows you to multiply its members. In a vector space there is no multiplication $u \cdot v \in V$. (Well, not quite true, see the cross product in $\mathbb{R}^3$, but that is not what you are looking for.)

S. Huber
  • Thank you! "In a vector space there is no multiplication $u \cdot v \in V$": this makes sense geometrically, as different dimensions never "cross paths" except at the origin, so multiplying/adding them would have no meaning. But is there some intuitive reason to "believe" this also holds true for polynomials? Is it because there's no "multiplying" functions together as they behave differently? I guess because both functions and vectors can be graphed on a Cartesian grid, it feels like there's a deeper connection there, but everyone simply dismisses it and says, "they're just different objects." – Dude Oct 14 '17 at 17:55
  • Also, you said you cannot multiply polynomials, but as far as I know, outside the artificial restrictions of linear algebra, you most definitely can multiply them; just add the exponents, so $x^2$ and $x^3$ would give you $x^5$, right? So why the artificial restriction in linear algebra? Why can't I just multiply one basis "vector" by another to get a polynomial of the desired degree? – Dude Oct 14 '17 at 19:04
  • Of course, polynomials can be multiplied with each other, e.g., as functions in the usual point-wise way.

    But then you leave the algebraic setting of a vector space. If you talk about vector-space notions (basis, dimension), then you have to use vector-space definitions. Function multiplication is not scalar multiplication, so you cannot use it in an argument about linear independence.

    – S. Huber Oct 14 '17 at 20:07
  • Why is it defined this way? What is the functional / conceptual reason for imposing this restriction for polynomials? – Dude Oct 15 '17 at 01:37
  • An algebra over a field adds multiplication to a vector space, like you wish. Those have more structure, but are less general: every algebra is a vector space, but not every vector space is an algebra. If you can prove a theorem using vector spaces only, then your theorem is more general, more powerful. It makes sense to know the essential needs for a theorem or observation in order to deepen the understanding. The setting of a vector space has proven to be powerful and practical. – S. Huber Oct 15 '17 at 10:21
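To make that last comment concrete, here is a small sketch (numpy, purely illustrative and not part of the thread) of the extra multiplication an algebra adds: on coefficient sequences it is convolution, and the degrees add:

```python
import numpy as np

# np.polymul convolves coefficient sequences (highest degree first):
x2 = [1, 0, 0]             # x^2
x3 = [1, 0, 0, 0]          # x^3
print(np.polymul(x2, x3))  # [1 0 0 0 0 0], i.e. x^5: the degrees add

# This product is extra "algebra" structure on top of the vector space;
# it is neither the vector addition nor the scalar multiplication.
```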