-1

Determine whether the set of all matrices $A_{n\times n}$ such that the equality $AB=BA$ holds for every matrix $B_{n\times n}$ is a vector space under the usual matrix addition and scalar multiplication.

I know that every diagonal matrix is in $A$, and the $0$ matrix is in $A$ as well. They are all closed under addition and multiplication by a scalar, so it seems like it is a vector space. Is there any exception I'm missing that contradicts that statement?

amWhy
  • 209,954
  • But to tell that, I have to prove that those diagonal matrices are the only ones that fall under that category. How can I prove that there are no others? The intuition makes sense, but I don't know how to approach proving it. – knight5478 Jan 05 '23 at 11:36
  • 1
    https://math.stackexchange.com/questions/181761/linear-transformation-t-such-that-ts-st If you are asking to prove the fact I proposed, you can check this link. – poeplva19 Jan 05 '23 at 11:45
  • @DietrichBurde what do you mean? It's written after the bolded text "the group:" – knight5478 Jan 05 '23 at 11:50
  • @poeplva19 thank you! – knight5478 Jan 05 '23 at 11:51
  • @DietrichBurde Every vector space is an Abelian group. – poeplva19 Jan 05 '23 at 11:55
  • @DietrichBurde You are maybe considering the matrix multiplication as the operator to form a group structure? Here the operator is matrix addition. – poeplva19 Jan 05 '23 at 11:58
  • @DietrichBurde I said that because "only a vector space" is not correct. And the set is a group with addition regardless, it wouldn't make any difference if it said set instead of group or not. – poeplva19 Jan 05 '23 at 12:06
  • No, a set of matrices with some condition need not be a group at all (the neutral element could be missing). One has to prove this really first. Consider the set of matrices with left upper entry equal to $1$. This does not form a group under addition. – Dietrich Burde Jan 05 '23 at 12:54
  • 2
    @DietrichBurde I think that the asker is not familiar with group theory and merely used the term "group" as an informal synonym for "set". – Ben Grossmann Jan 05 '23 at 19:16
  • @BenGrossmann yes, pretty much. thank you. – knight5478 Jan 06 '23 at 09:49

4 Answers

2

You were not asked to (and did not) prove that this set is the set of scalar $n\times n$ matrices.

You were asked to prove (but also did not) that it is a vector subspace (what you showed is that the set of diagonal matrices is a vector space, but this does not help).

Here is a direct proof that this set, which we shall name $E$ rather than $A,$ is a subspace, let's say of $M_n(\Bbb R)$ if that is what you implicitly meant. $$E=\bigcap_{B\in M_n(\Bbb R)}E_B,\text{ where }E_B=\{A\in M_n(\Bbb R)\mid AB=BA\}$$ hence it suffices to prove that each $E_B$ is a vector subspace.

$0\in E_B$ and $\forall A,C\in E_B\quad\forall\lambda\in\Bbb R\quad A+\lambda C\in E_B,$ since $(A+\lambda C)B-B(A+\lambda C)=(AB-BA)+\lambda(CB-BC)=0+\lambda\cdot0=0.$
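This closure computation is easy to sanity-check numerically. A minimal sketch in Python with NumPy (the choice of $B$, the scalar matrices $A,C$, and $\lambda$ are arbitrary; scalar matrices are used because they are guaranteed to lie in $E_B$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))  # an arbitrary matrix B

# Two elements of E_B: scalar matrices commute with every B.
A = 2.0 * np.eye(n)
C = -3.0 * np.eye(n)
lam = 1.5

# (A + lam*C)B - B(A + lam*C) = (AB - BA) + lam*(CB - BC) = 0
D = (A + lam * C) @ B - B @ (A + lam * C)
print(np.allclose(D, 0))  # True: A + lam*C is again in E_B
```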

Anne Bauval
  • 34,650
  • 1
    yes, that's exactly what I needed, thank you! – knight5478 Jan 06 '23 at 12:42
  • 1
    When they said "They are all closed under addition and scalar multiplication" I thought they meant the set that was given in the question. Thought the diagonal thing was just mentioned to say $0$ was in the set. Sorry for that. – poeplva19 Jan 07 '23 at 16:26
  • @poeplva19 there seems to have been moreover some confusion between diagonal matrices and scalar ones. – Anne Bauval Jan 07 '23 at 17:44
1

You have to look for particular matrices B which allow you to get information on A.

Taking as $B$ the matrix with $B_{1,1}=1$ and zeros everywhere else, you can notice that $AB$ is the matrix whose first column equals that of $A$ and whose other entries are $0$, while $BA$ is the matrix whose first row equals that of $A$ and whose other entries are $0$. So, the first row and the first column of $A$ must be $0$ except for the diagonal element. Repeating this reasoning for every $i=1,\dots,n$ with the matrices $B$ of the type $B_{i,i}=1$ and all other entries equal to $0$, we get that $A$ must be diagonal.

Now, let $(a_1,\dots,a_n)$ be the diagonal of $A$. We know that $AB$ is the matrix whose $i$-th row is $a_iB_i$, where $B_i$ is the $i$-th row of $B$. Regarding $BA$, its $i$-th column is $a_iB^i$, where $B^i$ is the $i$-th column of $B$; so its rows are given by the component-wise product of $(a_1,\dots,a_n)$ and the rows of $B$. So we have: $$\forall \ i=1,\dots,n \quad (a_1b_{i,1},\dots,a_nb_{i,n}) = (BA)_i = (AB)_i = (a_ib_{i,1},\dots,a_ib_{i,n})$$ and this must hold for every matrix $B$; since for each pair $i,j$ we may choose $B$ with $b_{i,j}\neq 0$, this implies $a_i=a_j$ for all $i,j=1,\dots,n$.

So, we can conclude that the matrices you are looking for are those of the form $\lambda I$ for $\lambda \in \mathbb{R}$, where $I$ is the identity matrix. This is clearly a vector space, since it is just the span of the matrix $I$.
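The two directions of this argument can be illustrated numerically. A small sketch in Python with NumPy (the particular matrices are arbitrary choices): a scalar matrix commutes with a random $B$, while a non-scalar diagonal matrix generally does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
B = rng.standard_normal((n, n))  # a generic matrix B

# A scalar matrix lambda*I commutes with every B ...
S = 2.5 * np.eye(n)
print(np.allclose(S @ B, B @ S))   # True

# ... but a non-scalar diagonal matrix does not commute with a generic B:
# (D@B)[i,j] = d_i * b_ij while (B@D)[i,j] = d_j * b_ij.
D = np.diag([1.0, 2.0, 3.0])
print(np.allclose(D @ B, B @ D))   # False
```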

SilvioM
  • 1,298
1

Let $E_{ii}$ be an $n \times n $ matrix with $1$ at location $(i,i)$ and $0$ elsewhere. Then $E_{ii} A = A E_{ii} \implies [\underline{0},...,\underline{0},a_{i,1:n},\underline{0},..,\underline{0}]^T = [\underline{0},...,\underline{0},a_{1:n,i},\underline{0},..,\underline{0}]$.

Hence $a_{ij} = 0$ except when $i=j$, so $A$ is diagonal.

Assume now that $A$ is diagonal, and let $\underline{1}$ be the vector of $n$ ones. Then:

$[\underline{1},\underline{0},...,\underline{0}] A = A[\underline{1},\underline{0},...,\underline{0}] \implies a_{11} \underline{1} = \text{diag}(A)$.

This implies $A = a_{11} I$.

So your set is $\{c I : c \in \mathbb{C}, I \text{ is } n \times n \text{ identity matrix }\}$. This is clearly a vector space.
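Both steps of this answer can be checked with a short computation. A sketch in Python with NumPy (the concrete matrices are arbitrary illustrative choices): first, that $E_{11}A$ keeps only the first row of $A$ while $AE_{11}$ keeps only the first column, so equality fails for a non-diagonal $A$; second, that for a diagonal $A$ the all-ones first column forces all diagonal entries to equal $a_{11}$.

```python
import numpy as np

n = 4

# Step 1: commuting with E_11 forces row 1 and column 1 of A to agree.
A = np.arange(1.0, n * n + 1.0).reshape(n, n)  # a generic non-diagonal matrix
E = np.zeros((n, n)); E[0, 0] = 1.0            # E_11
# E_11 A keeps only row 1 of A; A E_11 keeps only column 1 of A.
print(np.allclose(E @ A, A @ E))               # False: this A is not diagonal

# Step 2: for diagonal A, take B with first column all ones, rest zeros.
D = np.diag([1.0, 2.0, 3.0, 4.0])
B = np.zeros((n, n)); B[:, 0] = 1.0
print((D @ B)[:, 0])  # first column of DB is diag(D)
print((B @ D)[:, 0])  # first column of BD is d_11 * ones
# DB = BD would force diag(D) = d_11 * ones, i.e. D = d_11 * I.
```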

Balaji sb
  • 4,357
0

I was not aware that I misused the term "group" where I should have used the term "set" (I translated the question from my native language).

Eventually, what I had to do was to show that for every $u,v\in A$ and all scalars $\alpha,\beta \in \mathbb{R}$, all of the following axioms are satisfied:

Associativity of vector addition: $u + (v + w) = (u + v) + w$

Commutativity of vector addition: $u + v = v + u$

Identity element of vector addition: there exists an element $0 \in V$ such that $v + 0 = v$ for every $v \in V$

Inverse elements of vector addition: for every $v \in V$, there exists an element $-v \in V$, called the additive inverse of $v$, such that $v + (-v) = 0$

Compatibility of scalar multiplication with field multiplication: $a(bv) = (ab)v$

Identity element of scalar multiplication: $1v = v$, where $1$ denotes the multiplicative identity in $F$

Distributivity of scalar multiplication with respect to vector addition: $a(u + v) = au + av$

Distributivity of scalar multiplication with respect to field addition: $(a + b)v = av + bv$
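These axioms can be spot-checked numerically for the set $\{cI : c \in \mathbb{R}\}$ identified in the other answers. A minimal sketch in Python with NumPy (the helper `elem` and the specific values of $u$, $v$, $\alpha$, $\beta$ are illustrative choices):

```python
import numpy as np

n = 3
I = np.eye(n)

def elem(c):
    # an element of the set {c I : c real}
    return c * I

u, v = elem(2.0), elem(-0.5)
alpha, beta = 3.0, 4.0

# Closure under the linear combinations used in the subspace test:
w = alpha * u + beta * v
print(np.allclose(w, elem(w[0, 0])))    # True: still of the form c I

# A few of the axioms, checked numerically:
print(np.allclose(u + v, v + u))                            # commutativity
print(np.allclose(alpha * (u + v), alpha * u + alpha * v))  # distributivity
print(np.allclose(u + (-u), elem(0.0)))                     # additive inverse
```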