In Werner Greub's *Linear Algebra*, page 95, question 2 reads:

Assume that $\phi$ is a linear transformation $E\to E$ having the same matrix relative to every basis $x_v$. Prove that $\phi = \lambda i$, where $\lambda$ is a scalar and $i$ is the identity map.

Let $A$ be the matrix representation of $\phi$ with respect to the basis $x_v$, let $B$ be its matrix with respect to the basis $y_v$, and let $C$ be the matrix of the basis transformation $x_v \to y_v$. I have derived that

$$AC = CA = CB = BC,$$ but after that I am stuck.
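To spell out where this comes from (assuming the convention $B=C^{-1}AC$ for the change of basis; Greub's convention may swap $C$ and $C^{-1}$):

$$B=C^{-1}AC \;\Longrightarrow\; CB=AC, \qquad A=B \;\Longrightarrow\; CA=CB \ \text{ and } \ AC=BC,$$

so all four products coincide.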

Honestly, I don't have a method for showing the result, so I tried a few things to get a feeling for what is going on, but, as I said, it went nowhere.

So how can we show this result? I would appreciate a hint, but a direct answer is fine too.

Edit:

We are working with a given $\phi$ such that its matrix representation $M(\phi; x_v, x_u)$ is the same for every basis $x_v$.

– Our
  • Hint: You already know that your matrix $A$ commutes with all invertible matrices. Show that it commutes with all matrices. (Sub-hint: the matrices commuting with $A$ form a vector space.) Conclude that you can extend your base field. Use loup blanc's argument. (There might be a simpler way, but I don't see it right now.) – Pierre-Yves Gaillard Jun 27 '17 at 15:25
  • @ Pierre-Yves Gaillard, how do you extend the base field? See also the EDIT in my answer. – loup blanc Jun 27 '17 at 16:50
  • @loupblanc - I agree that your argument is much better than mine. (My argument was: if your matrix commutes with all matrices with coefficients in $K$, it commutes with all matrices with coefficients in any extension of $K$, for instance in $K(X)$. Then you can use your first argument.) (Detail: I wasn't notified about your comment, perhaps because you put a space between the @ and my name.) – Pierre-Yves Gaillard Jun 27 '17 at 17:24
  • @Pierre-YvesGaillard Sir, please tell me how we know that the given matrix $A$ commutes with all invertible matrices? – Akash Patalwanshi Apr 22 '19 at 13:05
  • 1
    @AkashPatalwanshi - If $\phi$ is an endomorphism of an $n$-dimensional vector space $V$ over a field $K$, and if $A$ is the matrix of $\phi$ relative to a certain basis of $V$, then the matrix of $\phi$ relative to another basis will be of the form $BAB^{-1}$, where $B$ is an invertible $n$ by $n$ matrix with coefficients in $K$. Moreover the equalities $BAB^{-1}=A$ and $BA=AB$ are equivalent. – Pierre-Yves Gaillard Apr 22 '19 at 16:20
  • @Pierre-YvesGaillard Sir, first of all thanks for the reply. In your reply to me, the matrix $B$ is the transition matrix (change-of-basis matrix) from one basis to another basis of the given vector space; "it is not any arbitrary invertible matrix". So how can we conclude that the given matrix commutes with an arbitrary invertible matrix? – Akash Patalwanshi Apr 22 '19 at 16:41
  • 1
    @AkashPatalwanshi - Let us identify a basis of $V$ with the corresponding isomorphism $K^n\to V$, and let us also identify $n$ by $n$ matrices and endomorphisms of $K^n$. If $x:K^n\xrightarrow\sim V$ and $y:K^n\xrightarrow\sim V$ are two basis, then the transition matrix is $T:=x^{-1}\circ y$. You can write this as $y=x\circ T$. If you can start with any $x$ and any $T$, and define $y$ by the above equality, you'll get a transition matrix equal to $T$. – Pierre-Yves Gaillard Apr 22 '19 at 17:13
  • 1
    Here are some good answers of the same question (after you've observed $\phi$ commutes with everything: https://math.stackexchange.com/q/27808/293177 – Al.G. Dec 01 '20 at 17:52
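A sketch of the point made in the last few comments (my paraphrase, not verbatim from the thread): given any $T\in GL_n(K)$ and any basis $x$, the formula $y=x\circ T$ defines a basis whose transition matrix from $x$ is exactly $T$, so the hypothesis forces

$$T^{-1}AT=A, \quad\text{i.e.}\quad AT=TA, \qquad\text{for every } T\in GL_n(K).$$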

4 Answers


The following proof is valid over any field $K$ that has at least $3$ elements.

Assume that $\phi$ is not a scalar multiple of the identity. Then there is an $x$ such that $\phi(x)\notin Kx$; complete $x,\phi(x)$ to a basis $x,\phi(x),e_3,\cdots,e_n$ of $E$. Then for every $\alpha\in K\setminus \{0\}$,

$x,\alpha\phi(x),e_3,\cdots,e_n$ is also a basis of $E$.

In such a basis, the first column of the matrix of $\phi$ is $[0,\dfrac{1}{\alpha},0,\cdots,0]^T$. Since the matrix must be the same in every basis, $\dfrac{1}{\alpha}$ cannot depend on $\alpha$; as $K$ has at least $3$ elements, there are two distinct nonzero choices of $\alpha$, a contradiction.
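The existence of such an $x$ is the standard fact that a map fixing every line is scalar; spelled out for completeness: if $\phi(x)=\lambda_x x$ for every $x\neq 0$, then for linearly independent $x,y$,

$$\lambda_{x+y}(x+y)=\phi(x+y)=\lambda_x x+\lambda_y y \;\Longrightarrow\; \lambda_x=\lambda_{x+y}=\lambda_y,$$

and $\lambda_{\mu x}=\lambda_x$ trivially when $y=\mu x$; so $\phi$ would be a scalar multiple of the identity after all.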

EDIT 1. A solution valid over any field $K$.

Let $A=[a_{i,j}]$ be a representative of $\phi$ and let $(E_{i,j})$ be the canonical basis of $M_n(K)$. As Pierre-Yves Gaillard wrote, for every $P\in GL_n(K)$, $P^{-1}AP=A$, that is $PA=AP$.

Method 1. In particular, for every $k\not= l$ we have $A(I_n+E_{k,l})=(I_n+E_{k,l})A$, i.e. $AE_{k,l}=E_{k,l}A$; this implies that $a_{l,k}=0$ and $a_{k,k}=a_{l,l}$ for every $k\not= l$. Finally, $A$ is a scalar matrix.
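For completeness, here is the entry computation behind this (my filling-in):

$$(AE_{k,l})_{i,j}=a_{i,k}\,\delta_{j,l},\qquad (E_{k,l}A)_{i,j}=\delta_{i,k}\,a_{l,j}.$$

Comparing the $(k,l)$ entries gives $a_{k,k}=a_{l,l}$, and comparing the $(l,l)$ entries (with $l\neq k$) gives $a_{l,k}=0$.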

EDIT 2. Method 2. We can also use the fact that (over any field) any matrix is the sum of two invertible matrices; cf. user1551's answer in

Real square matrix as a sum of two invertible matrices
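Unwinding Method 2 (my reading of the argument): if every $M\in M_n(K)$ can be written as $M=P+Q$ with $P,Q$ invertible, then

$$AM=AP+AQ=PA+QA=MA,$$

so $A$ commutes with all of $M_n(K)$; taking $M=E_{k,l}$ and applying the computation from Method 1 shows again that $A$ is scalar.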

  • Hi! You may want to look at my comment to the question. – Pierre-Yves Gaillard Jun 27 '17 at 15:27
  • +1. I think you agree that the correct statement is: $$(\forall k\ne l)\ \Big(A(I_n+E_{k,l})=(I_n+E_{k,l})A\Big)\implies(\forall k\ne l)\ (a_{l,k}=0,a_{k,k}=a_{l,l}),$$ not $$(\forall k\ne l)\ \Big(A(I_n+E_{k,l})=(I_n+E_{k,l})A\implies a_{l,k}=0,a_{k,k}=a_{l,l}\Big).$$ (I'm sure that's what you meant.) – Pierre-Yves Gaillard Jun 27 '17 at 17:15
  • By the way, what is the problem in my answer? I missed the discussion in the deleted answer. – Our Jun 28 '17 at 12:36

What happens if you have two bases $(e_1,e_2,\ldots,e_n)$ and $(-e_1,e_2,\ldots,e_n)$?
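One way to unwind this hint (my note; it needs $-1\neq 1$, i.e. $\operatorname{char} K\neq 2$): the transition matrix is $D=\operatorname{diag}(-1,1,\dots,1)=D^{-1}$, so the hypothesis forces

$$(D^{-1}AD)_{i,j}=d_i\,a_{i,j}\,d_j=a_{i,j},$$

which gives $a_{1,j}=-a_{1,j}=0$ for $j\neq 1$ and $a_{i,1}=0$ for $i\neq 1$. Negating each basis vector in turn kills every off-diagonal entry, and permuting the basis then equalizes the diagonal.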


Let $V$ be a finite dimensional vector space over a field $K$, and let $a$ be an endomorphism of $V$ commuting with all automorphisms.

It suffices to show that $a$ is scalar.

Let $C$ be the set of endomorphisms of $V$ commuting with $a$.

Clearly $C$ is a linear subspace of $\operatorname{End}_K(V)$ containing the automorphisms.

If $b$ is a nilpotent endomorphism, then $\operatorname{id}_V+b$ is an automorphism, and $b=(\operatorname{id}_V+b)-\operatorname{id}_V$ is in $C$.

This implies successively that $C$ contains all the nilpotent endomorphisms, that $a$ preserves the kernel and the image of each nilpotent endomorphism, that $a$ preserves each linear subspace, and that any nonzero vector is an eigenvector (in particular $a$ is diagonalizable). Since the sum of two eigenvectors corresponding to different eigenvalues cannot be an eigenvector, we see that $a$ has exactly one eigenvalue.
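For the step "$a$ preserves each linear subspace," here is one way to fill in the details (assuming $\dim V\geq 2$; the case $\dim V=1$ is trivial): it suffices to preserve every line. Given $v\neq 0$, pick a nonzero linear form $f$ with $f(v)=0$ and set $b(w)=f(w)\,v$. Then $b^2=0$, so $b$ is nilpotent, hence $b\in C$, and $\operatorname{im}(b)=Kv$; therefore

$$a(Kv)=\operatorname{im}(a\circ b)=\operatorname{im}(b\circ a)\subseteq\operatorname{im}(b)=Kv,$$

i.e. every nonzero $v$ is an eigenvector of $a$.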

  • 1
    well done; this proof doesn't use any basis. See also EDIT 2 in my answer. –  Jun 28 '17 at 13:55
  • Your proof contains some topics that haven't been covered in the book, so I will accept your answer in a month. – Our Jun 28 '17 at 15:27

Someone gave this answer when the question was first asked, but for some reason that user has since deleted their answer. Since I used the hint they gave me, I am giving it again.

Hint:

Consider the basis $x_v$ and its permutations.

Note that we are dealing with ordered bases.

Solution:

For arbitrary $\sigma$ and $v$, let $x_\sigma$ and $x_v$ each be taken as the first element of the ordered basis. Then

$$\phi (x_\sigma) = \sum_\lambda \beta_\sigma^\lambda x_\lambda$$

$$\phi (x_v) = \sum_u \alpha_v^u x_u,$$ so $\alpha_v^u = \beta_\sigma^u \quad \forall u$. Since $v$ and $\sigma$ were arbitrary, the matrix representation is $M(\phi) = (\gamma_{i,j})$, where $\gamma_{i,j} = \alpha$ for all $i,j$. Thus, this directly implies that $\phi(x) = \alpha x \quad \forall x$.
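In matrix terms (my note): if $P_\sigma$ is the permutation matrix of $\sigma$, the matrix of $\phi$ in the permuted basis is $P_\sigma^{-1}AP_\sigma$, so the hypothesis gives

$$P_\sigma^{-1}AP_\sigma=A, \quad\text{i.e.}\quad a_{\sigma(i),\sigma(j)}=a_{i,j} \ \text{ for all } i,j \text{ and all } \sigma\in S_n.$$

This forces all diagonal entries to be equal and all off-diagonal entries to be equal, but not that the off-diagonal entries vanish; see the EDIT below.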

EDIT (by loup blanc). The above proof does not hold when $n=2$ and a representative of $\phi$ is $A=\begin{pmatrix}0&1\\1&0\end{pmatrix}$. Indeed, there is only one non-trivial permutation of the elements of the basis, $\sigma=(1,2)$, and its associated permutation matrix is $A$ itself! Hence, under this permutation, $A$ is transformed into $A^{-1}AA=A$; thus, there is no contradiction.

– Our