1

In Friedberg, Insel, and Spence's *Linear Algebra* (4th Edition), the left-multiplication transformation is defined as follows:

Let $A$ be an $m \times n$ matrix with entries from a field $F$. We denote by $L_A$ the mapping $L_A: F^n \rightarrow F^m$ defined by $L_A(x) = Ax$ (the matrix product of $A$ and $x$) for each column vector $x \in F^n$. We call $L_A$ a $\mathbf{\text{left-multiplication transformation}}$.

The answers [here][1] and [here][2] try to explain the left-multiplication transformation.

My questions are:

i) Could someone explain this definition in more layman's terms? For some reason the definition is not very clear to me, and the two links above didn't help much either.

ii) Why do we need a new definition of the left-multiplication transformation for $\mathbb{R}^n$? We have already proved that "all" linear transformations can be associated with a matrix, say $A$, once ordered bases are fixed. So I can't see the utility of the new definition.

iii) From the definition it seems the linear map is "contingent" on $A$ ($A$ is defined independently of any linear map). What guarantees that the dimension of the matrix associated with a linear map, $[T]_\beta^\gamma$, defined over $\mathbb{R}$, and the dimension of $A$ will be the same?

Thanks!

[1]: What exactly is a left-multiplication transformation?
[2]: Linear Transformations and Left-multiplication Matrix

Beta
  • 311
  • What do you mean by the “dimension of a linear map” and the “dimension of $A$”? Those expressions don’t make sense. – littleO Aug 25 '21 at 07:27
  • @littleO: Let $T: V \rightarrow W$ be a linear map and $\beta$ and $\gamma$ be ordered bases of the respective vector spaces. Then $[T]_\beta^\gamma$ is what I'm calling the "dimension of a linear map". – Beta Aug 25 '21 at 07:33
  • That also doesn't seem to make sense. $[T]_\beta^\gamma$ is a matrix. Why would you call it the "dimension" of a linear map? Dimension is not the correct word. – littleO Aug 25 '21 at 07:34
  • @littleO: I updated my question based on your suggestion. Hopefully this time my question is more accurate. Thanks! – Beta Aug 25 '21 at 07:38
  • That's an improvement. It's still unclear what you mean by the "dimension" of a matrix. Do you mean the shape of the matrix? What is the "dimension" of the matrix $\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}$, for example? See the comments here. "A matrix has no 'dimension'. It has a 'size'." (I prefer the term "shape", personally.) – littleO Aug 25 '21 at 07:40
  • @littleO: Yes. So, say $A$'s dimension is $k \times l$, and the dimension of the matrix associated with the linear map $T$, $[T]_\beta^\gamma$, is $m \times n$. Why is $k = m$ and $l = n$? This is the best way I can simplify my question. Thanks! – Beta Aug 25 '21 at 07:44
  • Essentially, left versus right multiplication boils down to whether the linear transformation $AB$ means "First apply $A$, then $B$" (right multiplication, because it would be natural to write $xAB$) or "First apply $B$ then $A$" (left multiplication, because then it would be natural to write $ABx$). – Arthur Aug 25 '21 at 08:46

1 Answer

1

Have a look at this old answer of mine, the diagram in particular. This should hopefully be something familiar to you.

The idea is that we wish to describe transformations from an abstract $n$-dimensional space $V$ to an abstract $m$-dimensional space $W$ in more familiar, computable terms. If we fix a basis $\beta$ for $V$, this gives us an isomorphism between $V$ and the space $F^n$, which takes an abstract vector $v \in V$ and transforms it into the column vector $[v]_\beta \in F^n$. This turns the mysterious, abstract, possibly difficult-to-work-with space $V$ into a familiar space of column vectors. Addition in $V$ corresponds to adding these column vectors, and similarly for scalar multiplication. We can completely understand $V$ by looking only at coordinate vectors instead.
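To spell this out: if $\beta = \{v_1, \dots, v_n\}$ is the chosen ordered basis, then each $v \in V$ is a unique linear combination of the basis vectors, and the coordinate vector just records the coefficients:
$$v = a_1 v_1 + \cdots + a_n v_n \quad \Longleftrightarrow \quad [v]_\beta = \begin{pmatrix} a_1 \\ \vdots \\ a_n \end{pmatrix} \in F^n.$$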

Similarly, fixing a basis $\gamma$ for $W$ gives us an isomorphism $w \mapsto [w]_\gamma$ from $W$ to $F^m$. In much the same way, we can understand the abstract vector space $W$ concretely in terms of column vectors.

This also means that linear transformations from $V$ to $W$, which again can be quite abstract, can be concretely understood as linear transformations between $F^n$ and $F^m$ (once bases $\beta$ and $\gamma$ are fixed).
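In symbols: writing $\phi_\beta : V \to F^n$ and $\phi_\gamma : W \to F^m$ for these two coordinate isomorphisms (the names $\phi_\beta$, $\phi_\gamma$ are just my shorthand here), the concrete counterpart of $T : V \to W$ is the composite
$$S = \phi_\gamma \circ T \circ \phi_\beta^{-1} : F^n \to F^m, \qquad \text{i.e.} \qquad S([v]_\beta) = [T(v)]_\gamma \text{ for all } v \in V.$$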

The nice thing is that linear transformations between $F^n$ and $F^m$ can be expressed as multiplication by unique $m \times n$ matrices. This is what this definition is setting up. This step is important: we need not only to establish a correspondence between linear maps $T : V \to W$ and linear maps $S : F^n \to F^m$, but between linear maps $T : V \to W$ and $m \times n$ matrices. Both connections matter.
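Concretely (this is just the standard construction, spelled out): if $e_1, \dots, e_n$ are the standard basis vectors of $F^n$ and $S : F^n \to F^m$ is linear, then
$$A = \big(\, S(e_1) \;\big|\; S(e_2) \;\big|\; \cdots \;\big|\; S(e_n) \,\big)$$
is an $m \times n$ matrix whose $j$-th column is $S(e_j)$, and it is the only matrix satisfying $Ax = S(x)$ for all $x \in F^n$; this is where the uniqueness comes from.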

There need to be two directions to this: we need to show that every linear map from $F^n$ to $F^m$ can be expressed as multiplication by an $m \times n$ matrix, and that multiplication by an $m \times n$ matrix is always a linear map from $F^n$ to $F^m$. The latter is what is about to be established. Without showing that $L_A : F^n \to F^m$ is linear, all we would know is that linear maps between $V$ and $W$ correspond to some $m \times n$ matrices. What if certain $m \times n$ matrices turned out to be out of bounds?

They're not. As it turns out, $L_A$ is linear, just by standard distributivity and associativity properties of matrices, e.g. $$L_A(x + y) = A(x + y) = Ax + Ay = L_A(x) + L_A(y).$$ This and the scalar homogeneity argument imply that $L_A$ is always a linear map.
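Spelled out, the scalar homogeneity argument is just as short:
$$L_A(cx) = A(cx) = c(Ax) = c\,L_A(x)$$
for any scalar $c \in F$ and any $x \in F^n$.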

Here is an example to show you how this definition works. Suppose we pick arbitrarily a matrix like $$A = \begin{pmatrix} 1 & -1 \\ 0 & 0 \\ 2 & -2\end{pmatrix}.$$ Then, $A$ is $3 \times 2$, and so $L_A$ should be a linear map from $\Bbb{R}^2$ to $\Bbb{R}^3$. By definition, $$L_A\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 1 & -1 \\ 0 & 0 \\ 2 & -2\end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} x - y \\ 0 \\ 2x - 2y\end{pmatrix}.$$ Hopefully you can see that this is a linear transformation, and if you were to take the standard matrix for this linear transformation, you would simply get $A$. You can do this with any $A$, helping prove that matrix multiplication is equivalent to general linear transformations between finite-dimensional spaces.
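If it helps to see this numerically, here is a small Python/NumPy sketch (my own add-on, not from Friedberg, Insel, and Spence) that builds this particular $L_A$, checks additivity and homogeneity on a couple of vectors, and recovers $A$ column by column from $L_A(e_1)$ and $L_A(e_2)$:

```python
import numpy as np

# The matrix A from the example above (3 x 2), so L_A maps R^2 -> R^3.
A = np.array([[1, -1],
              [0,  0],
              [2, -2]], dtype=float)

def L_A(x):
    """Left-multiplication transformation: x in R^2 -> Ax in R^3."""
    return A @ x

# A couple of test vectors in R^2 and a scalar (just an illustration).
x = np.array([3.0, 1.0])
y = np.array([-2.0, 5.0])
c = 4.0

# Additivity: L_A(x + y) == L_A(x) + L_A(y)
print(np.allclose(L_A(x + y), L_A(x) + L_A(y)))  # True

# Homogeneity: L_A(c x) == c L_A(x)
print(np.allclose(L_A(c * x), c * L_A(x)))       # True

# The standard matrix of L_A has j-th column L_A(e_j), and it is A itself.
e1, e2 = np.eye(2)
recovered = np.column_stack([L_A(e1), L_A(e2)])
print(np.allclose(recovered, A))                 # True
```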

Theo Bendit
  • 50,900
  • Thanks for your brilliant answer! This is very useful. But there are a couple of statements here that I didn't understand. i) "If we fix a basis $\beta$ for $V$, this gives us an isomorphism between $V$ and the space $F^n$": $F$ is a field, say the real numbers, and suppose $V$ is a vector space of polynomials. Then how does fixing a basis establish a relation between $V$ and $F^n$? – Beta Aug 25 '21 at 08:17
  • ii) "we need not only to establish a correspondence between linear maps $T:V \rightarrow W$ and linear maps $S:F^n \rightarrow F^m$, but between linear maps $T:V \rightarrow W$ and $m \times n$ matrices": We already established relationship between $T$ and matrix representation $m \times n$. So, why we need to establish it? – Beta Aug 25 '21 at 08:20
  • Where we "establish a correspondence between linear maps $T:V \rightarrow W$ and linear maps $S:F^n \rightarrow F^m$" in this definition? Sorry too many questions! But thanks a lot again! – Beta Aug 25 '21 at 08:21
  • 1
    @Beta Regarding your first question, we establish a relation by way of coordinate vectors. If you fixed the ordered basis $(x, 1, x^2)$ for $P_2(\Bbb{R})$ (yes, I chose a bad order just to be annoying!), then this yields a map taking $ax^2 + bx + c$ to its column vector $\begin{pmatrix} b \\ c \\ a\end{pmatrix}$. This map is an isomorphism, meaning that it is linear and invertible, so it respects addition and scalar multiplication, as does its inverse. Choosing a different basis will produce a different correspondence. By fixing a basis, we can understand $P_2(\Bbb{R})$ as $\Bbb{R}^3$. – Theo Bendit Aug 25 '21 at 08:22
  • Perfect!!! Thanks a ton! – Beta Aug 25 '21 at 08:23
  • 1
    ii) That's kind of skipping a step. If we understand $V$ as $F^n$ and $W$ as $F^m$, then we can naturally understand $T : V \to W$ as a map from $F^n$ to $F^m$. These linear maps can be understood as matrix multiplication. I find it useful to make this distinction, because there are many ways to go from $V$ and $W$ to $F^n$ and $F^m$, but only one way to make a map $F^n \to F^m$ into a matrix. Plus, the method for stepping from an abstract linear map to a concrete linear map is quite different from stepping from a concrete linear map to a matrix. – Theo Bendit Aug 25 '21 at 08:26
  • Thanks! This make sense too now. – Beta Aug 25 '21 at 08:28
  • Do you still want me to answer (iii)? I can, but I think if you understand the other two, your doubts about (iii) should be clearing now. – Theo Bendit Aug 25 '21 at 08:28
  • 1
    No :). It's perfectly clear now. Thank you Theo for clearing my doubts. I was struggling with it for quite some time. – Beta Aug 25 '21 at 08:30