
Question: Other than the zero map, what linear map has the same matrix $A_{E,F}$ with respect to all $E$ and $F$?

For a linear map $T:\mathbb{R}^n \rightarrow \mathbb{R}^n$, given a basis $E$ for the domain and a basis $F$ for the codomain, I can find a unique corresponding matrix $A_{E,F}$, which in general depends on $E$ and $F$.

Note that for any $E$ and $F$, the matrix corresponding to the zero map is always the zero matrix, since the map sends all vectors in $E$ to $\mathbf{0}$, and $\mathbf{0}$ can only be represented by $(0,...,0)$ with respect to any basis $F$.
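To make the setup concrete, here is a minimal numerical sketch (assuming numpy; `matrix_of` is an illustrative helper, not standard notation, computing $A_{E,F} = F^{-1}TE$ with basis vectors stored as columns). It shows both the dependence on $E, F$ and the basis-independence of the zero map:

```python
import numpy as np

def matrix_of(T, E, F):
    """A_{E,F} = F^{-1} T E, where the columns of E are the domain basis
    and the columns of F are the codomain basis."""
    return np.linalg.solve(F, T @ E)

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3))          # a generic nonzero linear map
E, F = np.eye(3), np.eye(3)              # canonical bases
G = rng.standard_normal((3, 3))          # random bases (invertible with
H = rng.standard_normal((3, 3))          # probability 1)

print(np.allclose(matrix_of(T, E, F), matrix_of(T, G, H)))  # False in general
Z = np.zeros((3, 3))                     # the zero map
print(np.allclose(matrix_of(Z, G, H), Z))                   # True for any bases
```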

Eric Chan

4 Answers


If $T \ne 0$, then there exists $v \in \mathbb R^n \setminus \{0\}$ such that $w = T(v) \ne 0$. There exists a basis $E =\{e_1,\ldots,e_n\}$ of $\mathbb R^n$ such that $e_1 = v$ and a basis $F =\{f_1,\ldots,f_n\}$ of $\mathbb R^n$ such that $f_1 = w$.

Write $A_{EF} = (a_{ij})$, where $T(e_j) = \sum_{i=1}^n a_{ij}f_i$. Then $f_1 = w = T(v) = T(e_1) = \sum_{i=1}^n a_{i1}f_i$, which implies $a_{11} = 1$ and $a_{i1} = 0$ for $i > 1$.

Now take the basis $F' = \{f'_1,\ldots,f'_n\}$ of $\mathbb R^n$ with $f'_j = \frac 1 2 f_j$ and write $A_{EF'} = (a'_{ij})$. Then $2f'_1 = w = T(v) = T(e_1) = \sum_{i=1}^n a'_{i1}f'_i$, which implies $a'_{11} = 2$ and $a'_{i1} = 0$ for $i > 1$.

Thus $A_{EF} \ne A_{EF'}$.
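Here is a small numerical illustration of this construction (a sketch assuming numpy; the helper `matrix_of` computes $A_{E,F} = F^{-1}TE$ under the column convention used above):

```python
import numpy as np

def matrix_of(T, E, F):
    return np.linalg.solve(F, T @ E)     # A_{E,F} = F^{-1} T E

T = np.array([[2., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])             # a nonzero (even singular) map
v = np.array([1., 0., 0.])
w = T @ v                                # w = T(v) is nonzero

E = np.eye(3)                            # a basis with e_1 = v
F = np.column_stack([w, [0., 1., 0.], [0., 0., 1.]])   # a basis with f_1 = w
Fp = F / 2                               # the rescaled basis F', f'_j = f_j / 2

print(matrix_of(T, E, F)[:, 0])          # [1, 0, 0]: a_11 = 1
print(matrix_of(T, E, Fp)[:, 0])         # [2, 0, 0]: a'_11 = 2
```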

Paul Frost

It is only the zero map.

For any linear map $T:\mathbb{R}^n \rightarrow \mathbb{R}^n$, given a basis $E$ for the domain and a basis $F$ for the codomain, we can find a unique corresponding matrix $A_{E,F}$, which in general depends on $E$ and $F$.

We want $A_{G,H} = A_{E,F}$ for any other bases $G$ and $H$ of the domain and codomain.

Let $M_E$ be the matrix whose columns are the vectors of $E$ written in the canonical basis. Note that $M_E$ must be invertible, and every invertible matrix is $M_E$ for some basis $E$. Let $A$ be the matrix of the linear map in the canonical basis.

Then the change-of-basis formula gives $A_{E,F} = M_F^{-1} A M_E$, since $[v]_E = M_E^{-1}v$ and $[Tv]_F = M_F^{-1}Tv$.

That is, we want $A$ to be such that, for all invertible matrices $M_E, M_F, M_G, M_H$,
$M_F^{-1} A M_E = M_H^{-1} A M_G$
i.e. $M_F^{-1} A M_E - M_H^{-1} A M_G = O$

Suppose $M_E = 4I$, $M_F = 3I$, $M_G = 2I$ and $M_H = I$, so that $M_F^{-1} A M_E = \frac{4}{3}A$ and $M_H^{-1} A M_G = 2A$.

That is, $M_F^{-1} A M_E - M_H^{-1} A M_G = \frac{4}{3}A - 2A = -\frac{2}{3}A = O$

That holds iff $A = O$.

Thus it must be the zero map.
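As a sanity check, here is a small numerical sketch of the formula and the scalar computation above (assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))          # the map in the canonical basis
I = np.eye(3)

def rep(A, M_E, M_F):
    return np.linalg.solve(M_F, A @ M_E)     # A_{E,F} = M_F^{-1} A M_E

lhs = rep(A, 4 * I, 3 * I)               # (4/3) A
rhs = rep(A, 2 * I, I)                   # 2 A
print(np.allclose(lhs, 4 / 3 * A))       # True
print(np.allclose(rhs, 2 * A))           # True
# lhs - rhs = -(2/3) A, which is the zero matrix only when A = O.
print(np.allclose(lhs, rhs))             # False, since A != O
```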

whoisit

Let $A$ be the matrix of $T$ with the columns of the identity matrix $I$ as the basis on both the domain and the range.

Then $A/2$ is the matrix of $T$ with the same basis on the domain and the columns of $2I$ as the basis on the range.

So $A/2 = A$, and hence $A = 0$.
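A quick numerical check of this rescaling (a minimal sketch, assuming numpy):

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])                 # matrix of T in the standard basis
A_rescaled = np.linalg.solve(2 * np.eye(2), A)   # A_{I,2I} = (2I)^{-1} A
print(np.allclose(A_rescaled, A / 2))    # True: same map, halved matrix
```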

  • But doesn't that only work for a bijective map $T$ (so that $T$ could be represented by the identity matrix, which is invertible)? What if $T$ is not bijective? – Eric Chan Oct 18 '22 at 11:03
  • No, the columns of the identity matrix are the basis. The matrix $A$ may be singular. Essentially, $A = A_{I,tI} = A_{I,I}/t$, and let $t \to \infty$. – Arin Chaudhuri Oct 18 '22 at 11:13

The other answers are fine, but all assume that the ground field is $\mathbb{R}$. This assumption makes sense in light of the OP's formulation of the problem, but I wanted to provide the answer for all fields.

To that end, note that if $V$ and $W$ are both $1$-dimensional vector spaces over $\mathbb{Z}/2\mathbb{Z}$, then the unique non-trivial linear map $V\rightarrow W$ has matrix $[1]$. Indeed, $V$ and $W$ each have a unique basis. So, there are new possibilities not appearing in the other answers.

However, the final theorem is this:

Suppose $V$ and $W$ are vector spaces over the same ground field $k$ and let $f:V\rightarrow W$ be any linear transformation. Then the matrix of $f$ is independent of the choice of basis for $V$ and $W$ iff $f$ is identically $0$, or $k = \mathbb{Z}/2\mathbb{Z}$, $V$ and $W$ are of dimension $1$ over $k$, and $f$ is the unique non-trivial linear transformation.

Let's prove this. The "if" direction is easy: if $f=0$, then the argument given by the OP for $k=\mathbb{R}$ works just fine, and the remaining case $k = \mathbb{Z}/2\mathbb{Z}$, $\dim V=\dim W = 1$ was discussed above.

So, let's focus on the "only if" direction. Here is the setup. Suppose $V$ and $W$ are both vector spaces over a field $k$ and that $f:V\rightarrow W$ is a non-zero linear transformation which has the same matrix with respect to every choice of bases. Our goal is to show that $k =\mathbb{Z}/2\mathbb{Z}$, and that $\dim V = \dim W = 1$.

To that end, fix $0\neq v\in V$ with $f(v)\neq 0$. If we extend $\{v\}$ to a basis of $V$ and extend $\{f(v)\}$ to a basis of $W$, then with respect to these bases the matrix of $f$ has first column $[1,0,\ldots,0]^t$.

Now, assume for a contradiction that $k$ contains an element $x$ which is distinct from $0$ and $1$. Then since $x\neq 0$, $x^{-1}\in k$. So, we can modify our basis of $W$ by replacing $f(v)$ with $x^{-1}f(v)$. With respect to this modified basis, $f$ now has a matrix whose first column is $[x,0,\ldots,0]^t$. Since $x\neq 1$, this is distinct from our starting matrix, giving a contradiction. Therefore, $k$ consists of just the elements $0$ and $1$. That is, $k = \mathbb{Z}/2\mathbb{Z}$.

If $\dim V \geq 2$, then any permutation of the basis $\{v, \ldots\}$ we chose for $V$ must give the same matrix. But permuting this basis has the effect of permuting the columns of the matrix, so all the columns must be equal. This means that $f$ takes the same value on all the basis elements. If $u$ is a basis element other than $v$, we may replace $v$ with $u+v$ and still have a basis. But $f(u+v) = f(u) + f(v) = 2f(v) = 0$ since $k$ has characteristic $2$, so now the first column of the matrix of $f$ is all $0$s. This is a contradiction.

A similar argument applies if $\dim W \geq 2$, except that a permutation of the basis $\{f(v),\ldots\}$ corresponds to permuting the rows of the matrix. Hence all rows have to match. But we already know that if $\dim W\geq 2$, the first column of the matrix of $f$ has the form $[1,0,\ldots,0]^t$, so the first and second rows of this matrix don't match.
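Since $\mathbb{Z}/2\mathbb{Z}$ is finite, the low-dimensional claims can also be checked by brute force. Here is a small sketch (assuming numpy; the helpers `bases` and `rep` are my own, and the adjugate shortcut for inversion mod $2$ relies on $\det F$ being odd). It confirms that dimension $1$ has a single basis, while in dimension $2$ a nonzero map already picks up two different matrices:

```python
import numpy as np
from itertools import product

def bases(n):
    """All ordered bases of (Z/2Z)^n, as matrices with basis vectors in columns."""
    vecs = list(product([0, 1], repeat=n))
    for cols in product(vecs, repeat=n):
        M = np.array(cols, dtype=int).T
        if round(np.linalg.det(M)) % 2 == 1:   # invertible over Z/2Z
            yield M

def rep(A, E, F):
    """A_{E,F} = F^{-1} A E over Z/2Z; since det F is odd, the adjugate
    of F equals F^{-1} mod 2."""
    F_adj = np.round(np.linalg.inv(F) * np.linalg.det(F)).astype(int)
    return ((F_adj % 2) @ A @ E) % 2

print(len(list(bases(1))))               # 1: dimension 1 has a unique basis

A = np.array([[1, 0],
              [0, 0]])                   # a nonzero map on (Z/2Z)^2
mats = {rep(A, E, F).tobytes() for E in bases(2) for F in bases(2)}
print(len(mats) > 1)                     # True: the matrix depends on the bases
```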