0

I am new to linear algebra, so I got a little confused when I discovered that matrices are vectors. Can someone share some insight with someone as fresh to this discipline as me?

  • 4
Being a vector is not an intrinsic property of an object. It is a "role", so to speak, that an object plays. If you have a set of objects, and two operations on it (addition and scalar multiplication), and if those two operations satisfy the axioms of a vector space, then we call that set together with those two operations "a vector space" and we call the elements "vectors". They are not "vectors" because they have some "vectorness" property; rather, they are vectors with respect to the other elements of the set, and to the two operations. –  Oct 08 '22 at 17:04
  • 1
Any object $x$ can be a vector in its own $1$-element set $\{x\}$ with respect to the addition defined as $x+x:=x$ and scalar multiplication defined as $\alpha\cdot x := x$. That element plays the role of a "zero vector" in this little "vector space". This is obviously a contrived example, but it teaches you that the important thing is not whether something "is a vector" but how it relates to other elements with respect to two pre-defined operations (addition and scalar multiplication). –  Oct 08 '22 at 17:08
  • 2
    The term “vector” has different meanings in different contexts. In some contexts a vector is defined to be an ordered $n$-tuple of real numbers, or perhaps an ordered $n$-tuple of numbers belonging to some field $F$ (such as $\mathbb R$ or $\mathbb C$). Alternatively, when working with any vector space (whose elements might be, say, functions or matrices or polynomials) the elements of that vector space might be called “vectors” even though they aren’t ordered $n$-tuples of numbers. – littleO Oct 08 '22 at 22:04
  • @littleO Interesting. Thanks for the comment! –  Oct 08 '22 at 23:53
  • 1
    While not the top answer, see my response here: https://math.stackexchange.com/a/1937478/171839 – Sean Roberson Oct 09 '22 at 01:00

3 Answers

1

In the context of matrix algebra, a matrix is a rectangular array of elements (usually numbers in a field). Certain operations are defined on matrices. The definition of matrix multiplication can seem unnatural and unmotivated at first. Matrices with a single row or a single column are called row vectors and column vectors, respectively. Thus, these special matrices can be called vectors in this context.
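For instance (the specific entries here are just an illustration):

$$\underbrace{\begin{bmatrix} 1 & 2 & 3 \end{bmatrix}}_{1\times 3 \text{ row vector}} \qquad\qquad \underbrace{\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}}_{3\times 1 \text{ column vector}}$$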

In the context of axiomatic linear algebra, a vector is an element of a space which has certain operations of vector addition and multiplication by scalars defined on it. This is a very general concept. In particular, there is a vector space of all $\,n\times m\,$ matrices with matrix addition and multiplication by scalars. In this context, the matrices are all called vectors. Thus, the question of the connection between matrices and vectors depends on the context.
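As a small sketch of that vector space structure in the $2\times 2$ case: every such matrix is a linear combination of four standard basis matrices,

$$\begin{bmatrix} a & b \\ c & d \end{bmatrix} = a\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} + b\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} + c\begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix} + d\begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix},$$

so this space has dimension $4$; in general, the space of $n\times m$ matrices has dimension $nm$.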

Somos
0

Chances are that from analytic geometry you have a concept of “vectors are groups of numbers written below or next to one another” or some such. That's often enough for most geometric applications.

And based on your question I'm assuming your linear algebra course is taking a more axiomatic approach. A vector is an element of a vector space, where a vector space is something that satisfies a bunch of axioms. I'd encourage you to read those axioms, but basically they say that you can add two vectors in a way that makes sense, that you can take a scalar from the underlying field (e.g. the real numbers) and multiply it with a vector to get a scaled vector, and that addition and this scalar-times-vector multiplication satisfy certain distributive laws as well.
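For reference, here is a compact sketch of those axioms, for all vectors $u,v,w$ and scalars $\alpha,\beta$ (textbooks group and name them in slightly different ways):

$$\begin{aligned}
&u+v=v+u, && (u+v)+w=u+(v+w),\\
&v+0=v \text{ for some zero vector } 0, && v+(-v)=0,\\
&\alpha(u+v)=\alpha u+\alpha v, && (\alpha+\beta)v=\alpha v+\beta v,\\
&\alpha(\beta v)=(\alpha\beta)v, && 1\,v=v.
\end{aligned}$$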

So by those axioms, matrices behave like vectors. You can add them, you can scale them, and you will find that all the axioms are indeed satisfied. This ignores the fact that a matrix tends to be more than a vector. A matrix has a certain shape (how many rows by how many columns) and certain associated operations (such as matrix-times-vector or matrix-times-matrix products) which are specific to matrices and not generally something a vector would do. So I would say that a matrix behaves like a vector space element, but with extra structure on top of that.

How do the two views of the world relate? On the one hand, you can show that the notion of “a vector is a bunch of numbers” satisfies the axioms of a vector space, if you use the usual operations on them which you're likely familiar with. So in that sense, the “bunch of numbers” view is a special case of the more general class of things that behave as vectors.
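Concretely, for $n$-tuples of real numbers the operations are defined componentwise, and checking each axiom is then routine:

$$(x_1,\dots,x_n)+(y_1,\dots,y_n)=(x_1+y_1,\dots,x_n+y_n),\qquad \alpha(x_1,\dots,x_n)=(\alpha x_1,\dots,\alpha x_n).$$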

But there is also a connection the other way round. You may be able to find a basis of your vector space, that is, a set of linearly independent vectors such that all other vectors are linear combinations of them. So if you have basis vectors $b_i$, then every element of your vector space can be written as $\alpha_1b_1+\alpha_2b_2+\cdots+\alpha_nb_n$, and every vector can be uniquely identified by the numbers $\alpha_i$. Thus, by writing down those numbers, you can characterize every vector uniquely. The numbers represent the vector; in a certain sense they are the vector. And now we've come full circle.
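As a small illustration, take the polynomials of degree at most $2$ with the basis $b_1=1$, $b_2=x$, $b_3=x^2$; every such polynomial then corresponds to a tuple of coordinates:

$$a_0+a_1x+a_2x^2 \;\longleftrightarrow\; (a_0,a_1,a_2)\in\mathbb{R}^3.$$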

Note that not every vector space has a finite basis. In particular, a vector space may be infinite-dimensional, in which case you can't find a finite number of basis vectors for it. For example, functions $\mathbb R\to\mathbb R$ can be seen as vectors in this sense. So the concept of a vector according to the axioms still entails more than the “bunch of numbers” view would suggest.
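One way to see the infinite-dimensionality: the monomials $1, x, x^2, x^3, \dots$ are linearly independent as functions $\mathbb R\to\mathbb R$, since

$$a_0+a_1x+\cdots+a_nx^n=0 \text{ for all } x\in\mathbb R \quad\Longrightarrow\quad a_0=a_1=\cdots=a_n=0,$$

so no finite set of vectors can span this space.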

On the one hand, dealing with numbers is very useful for many practical applications, so from that perspective I'd call the “bunch of numbers” view an application-oriented perspective. Conversely, the “fits the axioms” view focuses on the bare minimum of properties required to draw conclusions, which has the benefit of making those conclusions very widely applicable: maybe you can solve some problems for matrices by considering them as vector space elements. In general this is more the theoretical, pure-maths approach.

Personally I wouldn't attribute the different focus to linear algebra vs. analytic geometry. It's not that the fields actually have different definitions of what a vector is. But the focus, i.e. which of the properties are more important, probably differs. So you might see different emphasis, and unless you see the connections between the views, they may appear as different concepts.

MvG
0

$\begin{bmatrix} 2 & 0 \\ 0 & 2\end{bmatrix}+\begin{bmatrix}3 & 0 \\ 0 & 3 \end{bmatrix}=\begin{bmatrix}5 & 0 \\ 0 & 5 \end{bmatrix}$ and $3\times \begin{bmatrix} 2 & 0 \\ 0 & 2\end{bmatrix}=\begin{bmatrix} 6 & 0 \\ 0 & 6\end{bmatrix}$

I presented an addition of matrices and a multiplication of a matrix by a scalar. This is why, as explained to you, we say that matrices are "vectors". No more, no less. Note that these $2\times2$ matrices can be interpreted as dilation matrices in $\mathbb{R}^2$. We can also multiply matrices with each other, for example the matrices $\begin{bmatrix} 1 & 0 \\ 0 & -1\end{bmatrix}$ and $\begin{bmatrix} 1 & 1 \\ 1 & -1\end{bmatrix}$, which represent similarity transformations. And there it becomes more interesting, since the product is NOT commutative. The structure which then appears will be much more interesting to you than the simple vector-space structure of matrices. In short, a little patience... :)
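To verify the non-commutativity with these two matrices:

$$\begin{bmatrix} 1 & 0 \\ 0 & -1\end{bmatrix}\begin{bmatrix} 1 & 1 \\ 1 & -1\end{bmatrix}=\begin{bmatrix} 1 & 1 \\ -1 & 1\end{bmatrix},\qquad \begin{bmatrix} 1 & 1 \\ 1 & -1\end{bmatrix}\begin{bmatrix} 1 & 0 \\ 0 & -1\end{bmatrix}=\begin{bmatrix} 1 & -1 \\ 1 & 1\end{bmatrix}.$$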

Stéphane Jaouen