
I found the following problem in Golan's Linear Algebra book.

Let $\alpha,\beta:V \to W$ be two linear transformations between two vector spaces $V$ and $W$ defined over the same field $F$. If for each $v\in V$ there exists a scalar $c_v\in F$ (depending on $v$) such that $\alpha(v)=c_v \beta(v)$, then prove that there exists a scalar $c\in F$ such that $\alpha=c \beta$.

I first tried to define the map $f:V \to F$ by $f(v)=c_v$, guessing that it would be linear, but I found that it fails to be linear. My next attempt was to show that $f$ is constant, but I am unable to justify this. I am unsure how to approach this problem; a slight hint would be appreciated.

user149418
  • 2,386
  • $f(v)$ isn't well-defined if $\beta(v) = 0$, so your approach will require a slight modification. Factor both $\alpha$ and $\beta$ through $V/\ker \beta$, so you can assume $\beta$ is injective. Then replace $W$ with $\operatorname{im} \beta$, so you can assume $\beta$ is bijective. Next compose with $\beta^{-1}$ on both sides, so you can in fact assume $\beta = \operatorname{id}$. Finally, assume $v,v' \ne 0$ and prove that $c_v = c_{v'}$ by distinguishing two cases, according to whether $v$ and $v'$ are linearly independent. – user208259 Jan 24 '15 at 12:15
  • @user208259: What is meant by factor both $\alpha$ and $\beta$ through $V/\ker {\beta}$? Can you explain more explicitly? – user149418 Jan 24 '15 at 14:57

2 Answers

2

This problem can be broken into two steps.

The first step is to define a linear operator $\phi$ on the subspace $\def\Im{\operatorname{Im}}W'=\Im(\beta)\subseteq W$, which can be done as follows. Clearly the requirement $\alpha(v)=c_v \beta(v)$ for all $v$ implies $\ker\beta\subseteq\ker\alpha$, so $\alpha$ factors through a unique map $\overline\alpha:V/\ker\beta\to W$. Also $\Im(\alpha)\subseteq\Im(\beta)=W'$, so one can define $\phi$ to be the composite map $W'\to V/\ker\beta\to W'\subseteq W$, where the first map is the inverse of the isomorphism $V/\ker\beta\cong W'$ given by the first isomorphism theorem, and the second map is given by$~\overline\alpha$.
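For completeness, the well-definedness of $\overline\alpha$ in this step is a one-line check (this is just the standard factorization argument spelled out): if $v+\ker\beta = v'+\ker\beta$, then $$v - v' \in \ker\beta \subseteq \ker\alpha \quad\Longrightarrow\quad \alpha(v) = \alpha(v'),$$ so $\overline\alpha(v+\ker\beta):=\alpha(v)$ does not depend on the chosen representative, and it is linear because $\alpha$ is.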

Now the hypothesis gives that every nonzero vector of $W'$ is an eigenvector of$~\phi$ (to be precise, $\beta(v)$ is an eigenvector for the eigenvalue$~c_v$ whenever $\beta(v)\neq0$), and the second step is to show that this implies that $\phi$ is a multiple of the identity (that is, all the $c_v$ are equal). This step is dealt with here. One easy way to see it uses the general fact that a sum of distinct eigenspaces is always direct: if there were at least two (nonzero) eigenspaces, then their sum would contain non-eigenvectors. So there is at most one eigenspace, and all of $W'$ must be that eigenspace.
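Not part of the proof, but the directness argument is easy to check numerically on a hypothetical $2\times 2$ example (the matrix, with eigenvalues $2$ and $3$, is chosen only for illustration):

```python
import numpy as np

# Hypothetical 2x2 operator with two distinct eigenvalues, 2 and 3.
phi = np.diag([2.0, 3.0])
w1 = np.array([1.0, 0.0])   # eigenvector for eigenvalue 2
w2 = np.array([0.0, 1.0])   # eigenvector for eigenvalue 3
w = w1 + w2                 # a vector in the sum of the two eigenspaces

# phi @ w = (2, 3) is not a scalar multiple of w = (1, 1): the 2x2
# collinearity determinant is nonzero.
print(np.linalg.det(np.column_stack([phi @ w, w])))
```

A nonzero determinant means $\phi(w)$ and $w$ are not collinear, so $w$ is not an eigenvector, matching the claim that the sum of two distinct eigenspaces contains non-eigenvectors.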

1

Here's a way to handle the problem:

First, consider the case that $\beta = 0$. Then $\alpha(v) = c_v \beta(v) = 0$ for every $v$, so $\alpha = 0$ and any scalar $c$ works. So, we assume that this is not the case.

Then, there exists a basis $\{v_j\}_{j \in J}$ of the kernel of $\beta$. Extend this to a basis $\{v_j\}_{j \in I}$ of $V$, so that $J \subsetneq I$. We note that $\{\beta(v_j)\}_{j \in I \setminus J}$ forms a basis of the image of $\beta$.

Define $c_j = c_{v_j}$. By the above argument, $\alpha$ and $\beta$ are both $0$ on $\ker(\beta)$. So, fix a $k \in I \setminus J$. We note that for $j \in I \setminus J$ with $j \neq k$, $$ \alpha(v_j + v_k) = c_j \beta v_j + c_k \beta v_k = c_j(\beta (v_j + v_k)) + (c_k - c_j) \beta v_k $$ However, by our assumption, $\alpha(v_j + v_k)$ is a multiple of $\beta(v_j + v_k) = \beta v_j + \beta v_k$. Since $\beta v_j$ and $\beta v_k$ are linearly independent (they are part of a basis of the image), so are $\beta(v_j + v_k)$ and $\beta v_k$; comparing coefficients, this necessarily implies that $(c_k - c_j) = 0$.
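The coefficient comparison in this step can be checked symbolically; in the sketch below, $e_1, e_2$ are hypothetical stand-ins for the linearly independent vectors $\beta v_j, \beta v_k$ (SymPy is used only as a sanity check, not as part of the proof):

```python
import sympy as sp

c_j, c_k, t = sp.symbols('c_j c_k t')
e1 = sp.Matrix([1, 0])   # stand-in for beta(v_j)
e2 = sp.Matrix([0, 1])   # stand-in for beta(v_k), independent of e1

lhs = c_j * e1 + c_k * e2        # alpha(v_j + v_k)
rhs = t * (e1 + e2)              # some multiple of beta(v_j + v_k)

# Comparing coefficients of the independent vectors forces t = c_j and c_k = c_j.
sol = sp.solve(list(lhs - rhs), [t, c_k], dict=True)
print(sol)
```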

Moreover, if $j \in J$, we have $$ \alpha(v_j) = c_j \beta v_j = c_j 0 = 0 = c_k 0 = c_k \beta v_j $$

Thus, for any $j \in I$, $\alpha(v_j) = c_k \beta(v_j)$. Defining $c = c_k$, we note that for all $j \in I$, $\alpha(v_j) = c \beta(v_j)$.

Thus, $(\alpha - c\beta)v_j = 0$ for any $j \in I$. Since a linear transformation is determined by how it acts on a basis, we conclude that $\alpha - c \beta = 0$. That is, $\alpha = c\beta$, as desired.
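As a final sanity check (not a proof, just a brute-force verification in the smallest case), one can test the statement exhaustively over the field $\mathbb{F}_2$ with $V = W = \mathbb{F}_2^2$:

```python
import itertools
import numpy as np

# All vectors and all 2x2 matrices over F_2 (arithmetic taken mod 2).
vectors = [np.array(v) for v in itertools.product([0, 1], repeat=2)]
matrices = [np.array(m).reshape(2, 2) for m in itertools.product([0, 1], repeat=4)]

def pointwise(A, B):
    # For every v there is some scalar c_v in {0, 1} with A v = c_v B v.
    return all(any(np.array_equal((A @ v) % 2, (c * (B @ v)) % 2) for c in (0, 1))
               for v in vectors)

def globally(A, B):
    # A single scalar c works for the whole map: A = c B.
    return any(np.array_equal(A % 2, (c * B) % 2) for c in (0, 1))

# The pointwise hypothesis implies the global conclusion for all 256 pairs.
assert all(globally(A, B) for A in matrices for B in matrices if pointwise(A, B))
print("verified over F_2")
```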

user149418
  • 2,386
Ben Grossmann
  • 225,327
  • Why is $(c_k - c_j) = 0$? – user149418 Jan 24 '15 at 18:38
  • I made a mistake there... I assumed that $\beta v_k$ is linearly independent of $(v_j + v_k)$, which I can't assume here. I'll try to fix this. – Ben Grossmann Jan 24 '15 at 18:44
  • See my edit. – Ben Grossmann Jan 24 '15 at 18:58
  • This is somewhat along the lines of what "factoring through the quotient" means. Basically, $f$ can only be uniquely determined if $\beta$ is invertible. We can also determine $f$ for the induced map of $\beta$ on $V/\ker(\beta)$, since this induced map is invertible. – Ben Grossmann Jan 24 '15 at 19:00
  • I meant "injective", not "invertible" in the above. It will probably be a good exercise to try to reframe the proof I've presented in those terms. A more explicit explanation of "factoring": note that there exists a map $\tilde \beta$ (which is injective!) such that $\beta = \tilde \beta \circ \pi$, where $\pi$ is the projection from $V$ to $V / \ker \beta$. We can also find an $\tilde \alpha$ such that $\alpha = \tilde \alpha \circ \pi$. – Ben Grossmann Jan 24 '15 at 19:04
  • @Omnomnomnom: Why does $\{v_j\}_{j\in I\setminus J}$ form a basis for the image of $\beta$? – user149418 Jan 24 '15 at 19:15
  • Whoops, typo. Does that make sense now? – Ben Grossmann Jan 24 '15 at 19:22
  • @Omnomnomnom: I think some subset of $\{\beta(v_j)\}_{j\in I\setminus J}$ is a basis for the image of $\beta$. – user149418 Jan 24 '15 at 19:50
  • The vectors must be linearly independent, or else we're left with another vector in the kernel, contradicting our definition of $J$. – Ben Grossmann Jan 24 '15 at 19:52
  • @Omnomnomnom: Oops, I understand. If for some $j,l\in I\setminus J$, $\beta(v_j)$ and $\beta(v_l)$ are linearly dependent, then we obtain a non-zero linear combination of some basis vectors which is absurd. Thanks. – user149418 Jan 24 '15 at 20:03
  • @Omnomnomnom: There is a typo in the third line from last. $\alpha(v_j)$ should be equal to $c_k \beta(v_j)$, and so $\alpha(v_j)=c\beta(v_j),\forall j$. – user149418 Jan 24 '15 at 20:10
  • Ah, good catch. That's right. – Ben Grossmann Jan 25 '15 at 01:26