
There is a famous quote by Kaplansky on linear algebra:

"We [he and Halmos] share a philosophy about linear algebra: we think basis-free, we write basis-free, but when the chips are down we close the office door and compute with matrices like fury." (Irving Kaplansky, in Paul Halmos: Celebrating 50 Years of Mathematics.)

What does it mean to think and write basis-free? Are there any "basis-free" approaches to linear algebra?

Update: The comments below raise another question related to the above quote: how can one interpret studying systems of linear equations in a basis-free way?

For an $m\times n$ matrix $A$, its reduced row echelon form (used in Gauss–Jordan elimination) yields a canonical basis of $\mathbb{R}^n$: take the basis of the row space given by the nonzero rows and adjoin to it the basis of the null space read off from the free variables. (These two subspaces are orthogonal complements, so together the vectors do span $\mathbb{R}^n$.) So each matrix $A$ has a special basis associated with it which does something nice: it solves $Ax=0$. How can one talk of these things from Halmos' viewpoint?
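A concrete instance of this construction (my own toy example, not part of the original question): for

$$A=\begin{pmatrix}1&2\\2&4\end{pmatrix}\quad\xrightarrow{\;\text{RREF}\;}\quad\begin{pmatrix}1&2\\0&0\end{pmatrix},$$

the nonzero row gives the row-space basis $\{(1,2)\}$, the free variable $x_2$ gives the null-space basis $\{(-2,1)\}$, and together these form a basis of $\mathbb{R}^2$ with $A(-2,1)^{T}=0$.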

  • Notably, Axler's "Linear Algebra Done Right" attempts to go through linear algebra essentially without using matrices at all, that is, without focusing on the representation of a transformation with respect to a specific basis. – Ben Grossmann Nov 11 '16 at 12:20
  • Many (and perhaps most) of the important results of linear algebra can be thought of without ever mentioning the entries of a matrix, that is, without ever describing a transformation in terms of some basis of the underlying space. This is particularly important in functional analysis (linear algebra in infinite dimensions), where sensible bases are hard to come by. – Ben Grossmann Nov 11 '16 at 12:23
  • For a simple example, the answer to Why do Matrices work the way they do? is obvious if you know the basis-free ("geometric") viewpoint. This is a common question posed by students who do not understand the geometric viewpoint (i.e. abstract vector spaces). – Bill Dubuque Nov 11 '16 at 15:23
  • @Omnomnomnom: see the update above. Is it possible to study systems of linear equations without mentioning bases? –  Nov 12 '16 at 05:02
  • @shahab, well you can still talk about the kernel/null space of a linear operator and study its dimension, etc. – air Nov 12 '16 at 06:10

1 Answer


I think what is meant here is that the authors prefer to present linear algebra the way much of abstract algebra is presented: you have sets with a certain amount of structure (groups, rings, division rings, etc.) and maps that preserve this structure (group homomorphisms, ring homomorphisms, etc.). In linear algebra, the relevant objects are vector spaces and their elements, and the mappings are linear maps.
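To make "maps that preserve the structure" concrete (this is the standard definition, not spelled out in the answer): a map $T:V\to W$ between vector spaces over a field $F$ is linear when

$$T(u+v)=T(u)+T(v)\qquad\text{and}\qquad T(cv)=c\,T(v)\qquad\text{for all }u,v\in V,\ c\in F.$$

Nothing in this definition mentions a basis, which is exactly the sense in which the subject can be developed "basis-free".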

An illustrative example: one of the most important mappings in linear algebra, the determinant, can be defined in a "coordinate-free" way, in addition to the usual way involving the entries of a matrix.

For example, Apostol defines the determinant through alternating forms induced by a linear mapping $T$ from a vector space to itself. This essentially means giving an endomorphism is the same as giving this object, called the determinant, with the properties of the determinant you know (it is multilinear and alternating, and the space of such forms is one-dimensional).
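Spelled out (a standard way to state this definition; the notation here is mine, not Apostol's): if $\dim V=n$ and $D$ is any nonzero alternating $n$-linear form on $V$, then the endomorphism $T$ acts on such forms, and $\det T$ is the resulting scalar:

$$D(Tv_1,\dots,Tv_n)=(\det T)\,D(v_1,\dots,v_n)\qquad\text{for all }v_1,\dots,v_n\in V.$$

Because the space of alternating $n$-linear forms is one-dimensional, this scalar does not depend on the choice of $D$, and no basis is ever chosen.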

A personal note: coordinate-free was how I learned linear algebra, and I think it was to my detriment; I wish people like Kaplansky had kept their office door open at least! Matrices are important: they show up a lot, and I really think computations with them help one get a feel for linear algebra.

operatorerror
  • I am stuck on parsing "This essentially means giving an endomorphism is the same as giving this object, called the determinant, with the properties …". Do you mean something like "every endomorphism gives rise to an object, called the determinant, with the properties …"? It certainly does not seem to me that specifying the determinant is the same as specifying the endomorphism. – LSpice Apr 18 '21 at 21:36
  • @LSpice Yes, you are right; I definitely didn't use "same" precisely. I probably meant the one direction, as you say. Thanks! – operatorerror Apr 19 '21 at 01:51