
I am reading Friedberg, Insel, and Spence's Linear Algebra (4th Edition), and I believe I've found a typo, but I want to make sure there is not some subtlety I'm not getting.

This is a simple true-or-false question (Exercise 1(g) in Section 2.3) which I believe is True, but the back of the book says False.

Let $\mathsf{V},\mathsf{W}$ denote finite-dimensional vector spaces, let $\mathsf{V} \xrightarrow{\mathsf{T}} \mathsf{W}$ be a linear transformation, and let $\mathsf{L}_A$ denote left-multiplication by a matrix $A.$

True or False? $\mathsf{T} = \mathsf{L}_A$ for some matrix $A.$

If this is false, then an earlier theorem seems to contradict the answer, since it states that there is a unique such matrix.

Before I commit this to memory I just want to be certain.

  • Actually, the left multiplication by a matrix is associated to the pairs $(V,\mathcal B)$ and $(W,\mathcal C)$, where $\mathcal B,\ \mathcal C$ are bases for $V$ and $W$. In other words, the matrix depends on the bases. It is not intrinsic. – Bernard Jun 05 '19 at 17:46
  • So is it because the bases are not specified that this is false? – Marcus Helmer Jun 05 '19 at 17:47
  • I think so – there's not a single matrix associated to the linear transformation. – Bernard Jun 05 '19 at 17:49
  • They might be trying to get at the subtle matter of linear transformation versus matrix representation. Check out the top answer to https://math.stackexchange.com/questions/942894/difference-between-linear-transformation-and-its-matrix-representation – Math Helper Jun 05 '19 at 17:50
  • OK, then here's what I'm not understanding, $\mathsf{L}_A$ is also a linear transformation, not a matrix. So $A$ may change depending on the bases, but $A$ certainly exists no matter what bases we choose. – Marcus Helmer Jun 05 '19 at 17:54
  • But the linear transformation $L_A$ is only defined when you are working with vector spaces made up of tuples. Your $V$ and $W$ are not defined to be vector spaces of tuples. For example, if $V$ and $W$ were "polynomials with real coefficients of degree at most $2$", then you have lots of linear transformations from $V$ to $W$, but none of them are of the form $L_A$, because it never makes sense to left-multiply a polynomial by a matrix to get a polynomial. – Arturo Magidin Jun 05 '19 at 17:56
  • But every finite-dimensional vector space over a field is isomorphic to the vector space of tuples (of the same dimension) over the same field. That's the next section. – Marcus Helmer Jun 05 '19 at 18:00
  • But "isomorphic" is not the same thing as "identical to". The question is like saying "given two speakers, they always speak Russian to each other." What you are saying is "well, we can always translate whatever it is they are saying into Russian"... true, but not what the original statement said. (May I also add that books are not in the habit of giving you a trick question for which you need to read ahead as an exercise for the section you are in; if you think that the answer depends on stuff in the next section, then you are likely, though granted not necessarily, wrong) – Arturo Magidin Jun 05 '19 at 18:01
  • So then is that the subtlety the authors are going after? Because the theorem earlier that I said "contradicts" the true/false question does specifically refer to a linear transformation from $n$-tuples to $m$-tuples. – Marcus Helmer Jun 05 '19 at 18:05
  • @ArturoMagidin I think you are correct if you'd like to post that as answer. – Marcus Helmer Jun 05 '19 at 18:11

1 Answer


First, remember that vector spaces are abstract constructs; they need not be "made up" of tuples. Some vector spaces, specifically those of the form $\mathbb{F}^n$ (where $\mathbb{F}$ is a field), are made up of tuples, but not all vector spaces are. You have vector spaces of polynomials (either all of them, or up to a certain degree), of matrices, of functions (continuous, differentiable), of sequences, etc.

Now, if you are working with vector spaces of tuples (in this case, written as columns), say $\mathsf{V}=\mathbb{F}^n$, $\mathsf{W}=\mathbb{F}^m$, then:

  1. Every $m\times n$ matrix $A$ determines a linear transformation $L_A\colon \mathsf{V}\to\mathsf{W}$ by "left multiplication by $A$", namely $$L_A\left(\begin{array}{c}a_1\\\vdots\\a_n\end{array}\right) = A\left(\begin{array}{c}a_1\\\vdots\\a_n\end{array}\right).$$

  2. Given an arbitrary linear transformation $T\colon\mathsf{V}\to\mathsf{W}$, there exists an $m\times n$ matrix $B$ such that $T=L_B$. This matrix can be found by letting $\mathbf{w}_i=T(\mathbf{e}_i)$, where $\mathbf{e}_i$ is the $i$th standard basis vector, and letting $B$ be the matrix $B=\Bigl( \mathbf{w}_1|\mathbf{w}_2|\cdots|\mathbf{w}_n\Bigr)$ whose columns are the vectors $\mathbf{w}_i$. Presumably, you have already seen this; a small worked example follows this list.
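For instance (a quick illustration of item 2, with an example of my own rather than one from the book): if $T\colon\mathbb{R}^2\to\mathbb{R}^2$ is given by $$T\left(\begin{array}{c}a_1\\a_2\end{array}\right) = \left(\begin{array}{c}a_1+2a_2\\3a_1\end{array}\right),$$ then $\mathbf{w}_1=T(\mathbf{e}_1)=\left(\begin{array}{c}1\\3\end{array}\right)$ and $\mathbf{w}_2=T(\mathbf{e}_2)=\left(\begin{array}{c}2\\0\end{array}\right)$, so $$B=\left(\begin{array}{cc}1&2\\3&0\end{array}\right),$$ and multiplying out $B\left(\begin{array}{c}a_1\\a_2\end{array}\right)$ recovers exactly $T$, i.e. $T=L_B$.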

However, for an $m\times n$ matrix $A$, a linear transformation of the form $L_A$ only makes sense when the domain is $\mathbb{F}^n$ and the codomain is $\mathbb{F}^m$. Otherwise, you can't even "multiply $A$ by the vector".

Thus, for example, if $\mathsf{V}=\mathbf{P}_3(\mathbb{R})$, the vector space of polynomials with real coefficients of degree at most $3$ (plus the zero polynomial), and $\mathsf{W}=\mathbb{R}^2$, and $T\colon\mathsf{V}\to\mathsf{W}$ is given by $T(p(x)) = \left(\begin{array}{c}p(0)\\p'(1)\end{array}\right)$, then there is no matrix $A$ such that $T=L_A$, because you can't multiply a polynomial by a matrix to get a $2$-tuple. That's why the assertion in question is false.

That said, what is leading you astray is the following: given any finite dimensional vector spaces $\mathsf{V}$ and $\mathsf{W}$ over $\mathbb{F}$, and any linear transformation $T\colon\mathsf{V}\to\mathsf{W}$, there is a way to code the linear transformation $T$ using a basis $\beta$ for $\mathsf{V}$, a basis $\gamma$ for $\mathsf{W}$, and a matrix $A$. Namely: pick a basis $\beta$ for $\mathsf{V}$ and a basis $\gamma$ for $\mathsf{W}$. Given a vector $\mathbf{v}$ of $\mathsf{V}$, let $[\mathbf{v}]_{\beta}$ be the coordinate vector of $\mathbf{v}$ relative to $\beta$, and likewise let $[\mathbf{w}]_{\gamma}$ be the coordinate vector of $\mathbf{w}\in\mathsf{W}$ relative to $\gamma$. Then define $A$ to be the (unique) matrix with the property that $$A[\mathbf{v}]_{\beta} = [T(\mathbf{v})]_{\gamma}\quad\text{for all }\mathbf{v}\in\mathsf{V}.$$ This matrix is called the coordinate matrix of $T$ relative to $\beta$ and $\gamma$, which Friedberg, Insel, and Spence denote $[T]_{\beta}^{\gamma}$.
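To make this concrete with the example above (choosing the standard ordered bases, a choice of mine since nothing above fixes bases): take $\beta=\{1,x,x^2,x^3\}$ for $\mathsf{V}=\mathbf{P}_3(\mathbb{R})$ and $\gamma$ the standard basis of $\mathsf{W}=\mathbb{R}^2$. Then $T(1)=\left(\begin{array}{c}1\\0\end{array}\right)$, $T(x)=\left(\begin{array}{c}0\\1\end{array}\right)$, $T(x^2)=\left(\begin{array}{c}0\\2\end{array}\right)$, $T(x^3)=\left(\begin{array}{c}0\\3\end{array}\right)$, and these coordinate vectors are the columns of $$[T]_{\beta}^{\gamma} = \left(\begin{array}{cccc}1&0&0&0\\0&1&2&3\end{array}\right).$$ Indeed, for $p(x)=a_0+a_1x+a_2x^2+a_3x^3$ we get $$[T]_{\beta}^{\gamma}[p]_{\beta} = \left(\begin{array}{c}a_0\\a_1+2a_2+3a_3\end{array}\right) = \left(\begin{array}{c}p(0)\\p'(1)\end{array}\right) = [T(p)]_{\gamma}.$$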

What is also true is that the map $f$ that sends $\mathbf{v}$ to $[\mathbf{v}]_{\beta}$, and the map $g$ that sends $\mathbf{w}$ to $[\mathbf{w}]_{\gamma}$, are isomorphisms between $\mathsf{V}$ and $\mathbb{F}^n$ (where $\dim(\mathsf{V})=n$), and between $\mathsf{W}$ and $\mathbb{F}^m$ (where $\dim(\mathsf{W})=m$); and the matrix $A$ is such that the linear transformation $L_A$ fits into the commutative square: $\require{AMScd}$ $$\begin{CD} \mathsf{V} @>T >> \mathsf{W} \\ @VfV\cong V @V\cong V g V\\ \mathbb{F}^n @>L_A>>\mathbb{F}^m \end{CD}$$
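You can check commutativity of the square on the running example (again a computation of mine, not from the book): with $p(x)=2+x-x^3$, we have $f(p)=[p]_{\beta}=\left(\begin{array}{c}2\\1\\0\\-1\end{array}\right)$, so $$L_A(f(p)) = \left(\begin{array}{cccc}1&0&0&0\\0&1&2&3\end{array}\right)\left(\begin{array}{c}2\\1\\0\\-1\end{array}\right) = \left(\begin{array}{c}2\\-2\end{array}\right),$$ while going around the other way, $T(p)=\left(\begin{array}{c}p(0)\\p'(1)\end{array}\right)=\left(\begin{array}{c}2\\-2\end{array}\right)$ and $g$ is the identity on $\mathbb{R}^2$ since $\gamma$ is the standard basis. The two paths agree, as the diagram promises.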

But note that $T$ is not equal to $L_A$; it just corresponds to $L_A$, which is a different assertion.

As I noted in the comments, the True/False question is like saying "Given two speakers, they are talking Russian to one another." Your confusion lies in the fact that, whatever it is they are speaking, we can certainly translate what they are saying into Russian. But the fact that we can do that is not the same as asserting that they are speaking Russian. So the claim is false.

Arturo Magidin
  • I've added your array to my flashcard. Beautiful and succinct visual explanation. Thank you. – Marcus Helmer Jun 05 '19 at 18:29
  • @MarcusHelmer You'll find it in the textbook; in the fourth edition, it is at the top of p. 105, in Section 2.4, Invertibility and Isomorphisms. – Arturo Magidin Jun 05 '19 at 18:31
  • I know, but you typed up the TeX for me in a way that is easily compiled by MathJax. – Marcus Helmer Jun 05 '19 at 18:34
  • @MarcusHelmer: There are better ways to do this diagram if you are working off-line (specifically, the xy package does nice commutative diagrams for $\LaTeX$), but they don't work very well with MathJax, so that's the best I can do here. – Arturo Magidin Jun 05 '19 at 18:38
  • @ArturoMagidin: the following code works slightly better if you want (really not a crucial thing though): `$\require{AMScd}$ $$\begin{CD} \mathsf{V} @>T >> \mathsf{W} \\ @VfV\cong V @V\cong V g V\\ \mathbb{F}^n @>L_A>>\mathbb{F}^m \end{CD}$$` It gives: $\require{AMScd}$ $$\begin{CD} \mathsf{V} @>T >> \mathsf{W} \\ @VfV\cong V @V\cong V g V\\ \mathbb{F}^n @>L_A>>\mathbb{F}^m \end{CD}$$ (See also: https://math.meta.stackexchange.com/q/2324/9464) – Jun 05 '19 at 18:48
  • Really it works very well, look how beautiful that is http://imgur.com/gallery/2tAP4CT – Marcus Helmer Jun 05 '19 at 18:51
  • @Jack: Thanks! – Arturo Magidin Jun 05 '19 at 18:54