
Suppose we have a vector space $V = (K, +, \cdot)$. Let $B$ be a basis for $V$. Now we take an arbitrary square matrix $S \neq 0$.

$BS$ is just a linear combination of $B$. Thus $BS$ should be a new basis. Am I right?

user642796

2 Answers


Your question is open to several interpretations. (At least in its current revision.) One could write quite a lot about each of them.

You are using the notation $BS$, which (as far as I can tell) is most frequently used to denote a matrix product. But you also write that $B$ is a basis, and a basis is not a matrix. So we could dismiss your question as nonsensical, or we could try to see whether it can be interpreted in a way that makes sense.

Matrix interpretation:

Let us assume that you are working with the vector space $V=F^n$ over a field $F$. Then the vectors of this vector space are $n$-tuples.

If we have some basis $\vec b_1,\dots,\vec b_n$, then we can put the vectors of this basis into a matrix $B$. Now it makes sense to form the product $BS$ of the two $n\times n$ matrices, and your question can be interpreted as asking whether the columns of the matrix $BS$ again form a basis of $F^n$. (I chose column vectors rather than row vectors, since the wording of your question seems to indicate that you are used to working with column vectors.)

The notion of an invertible matrix is useful here. If $B$ is an $n\times n$ matrix over a field $F$, then the following conditions are equivalent:

  • The matrix $B$ is invertible, i.e., there exists a matrix $A$ such that $AB=BA=I$.
  • The determinant of $B$ is non-zero, i.e., $\det B\ne0$.
  • The matrix $B$ has full rank, i.e., $\operatorname{rank}(B)=n$.
  • The rows of the matrix $B$ form a basis of $F^n$.
  • The columns of the matrix $B$ form a basis of $F^n$.
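These equivalences are easy to check numerically. Below is a small sketch using NumPy with an arbitrary example matrix $B$ (the matrix is my own illustration, not from the question):

```python
import numpy as np

# An illustrative invertible 3x3 matrix (identity plus a shear).
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

det_nonzero = not np.isclose(np.linalg.det(B), 0.0)   # det(B) != 0
full_rank = np.linalg.matrix_rank(B) == B.shape[0]    # rank(B) = n
A = np.linalg.inv(B)                                  # an inverse A exists
inverse_ok = np.allclose(A @ B, np.eye(3)) and np.allclose(B @ A, np.eye(3))

print(det_nonzero, full_rank, inverse_ok)  # True True True
```

All three conditions agree, as the theorem says they must; for a singular matrix all three would fail together.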

You can find a few more equivalent conditions in the Wikipedia article on invertible matrices. With this in mind, your question can be understood as:

When is the product of an invertible matrix $B$ and a square matrix $S$ again an invertible matrix?

The answer is that this is true if and only if $S$ is invertible. Indeed, if $S$ is invertible, then $BS$ is a product of two invertible matrices and hence invertible. Conversely, if $BS$ is invertible, then $$S=B^{-1}(BS)$$ is a product of two invertible matrices, so $S$ is invertible.
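A quick numerical illustration of both directions (a sketch using NumPy with randomly generated example matrices, which are my own illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))               # generically invertible
assert not np.isclose(np.linalg.det(B), 0.0)

S_inv = rng.standard_normal((4, 4))           # generically invertible
S_sing = np.ones((4, 4))                      # rank 1, hence singular

# BS is invertible exactly when S is:
print(np.isclose(np.linalg.det(B @ S_inv), 0.0))   # False: BS invertible
print(np.isclose(np.linalg.det(B @ S_sing), 0.0))  # True: BS singular
# And S can be recovered as B^{-1}(BS):
print(np.allclose(np.linalg.inv(B) @ (B @ S_inv), S_inv))  # True
```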

You can find several posts on this site showing that a product of invertible matrices is invertible (or some equivalent statement).

Linear combination of columns:

From the formulation of your question it seems that you have made a useful observation about matrix multiplication: the columns of the matrix $BS$ are linear combinations of the columns of $B$, and the coefficients of these linear combinations are given by the entries of $S$. (A similar observation holds for the rows.)
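This column observation can be checked directly. Below is a small sketch with NumPy, using arbitrary example matrices of my own choosing:

```python
import numpy as np

B = np.array([[1, 2],
              [3, 4]], dtype=float)
S = np.array([[5, 6],
              [7, 8]], dtype=float)

P = B @ S
# Column k of BS equals sum_i S[i, k] * (column i of B):
for k in range(2):
    combo = sum(S[i, k] * B[:, i] for i in range(2))
    assert np.allclose(P[:, k], combo)
print("each column of BS is a linear combination of the columns of B")
```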

With this in mind we can turn your question into one that makes sense for every finite-dimensional vector space $V$, not only $V=F^n$.

Suppose $\vec b_1,\dots,\vec b_n$ is a basis of $V$. Let $S$ be an $n\times n$-matrix. Let us define vectors $\vec c_1,\dots,\vec c_n$ by $$\vec c_k = \sum_{i=1}^n s_{ik} \vec b_i.$$ For which matrices $S$ is $\vec c_1,\dots,\vec c_n$ a basis?

If $V=F^n$, then this is precisely the question about matrices, since the vectors we described here are precisely the columns of the matrix $BS$. However, this formulation makes sense for any basis.

The answer to this question is again that this is true if and only if $S$ is invertible.
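As a sanity check, one can build the vectors $\vec c_k$ from a concrete basis and compare an invertible $S$ with a singular one. A sketch in NumPy, with $V=\mathbb R^3$ and example matrices of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
b = [rng.standard_normal(n) for _ in range(n)]  # a (generic) basis of R^3

def new_vectors(S):
    # c_k = sum_i S[i, k] * b_i
    return [sum(S[i, k] * b[i] for i in range(n)) for k in range(n)]

S_inv = np.eye(n) + np.diag([1.0, 1.0], k=1)    # upper triangular, det = 1
S_sing = np.ones((n, n))                        # rank 1, singular

rank = lambda vecs: np.linalg.matrix_rank(np.column_stack(vecs))
print(rank(new_vectors(S_inv)))   # 3: the c_k still form a basis
print(rank(new_vectors(S_sing)))  # 1: not a basis
```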

For more about this, you can have a look at matrix representing change of basis.

Linear combination of rows:

We could ask a similar question as above, but put the vectors of the basis into the rows of a matrix. This does not change the answer, but in this case we can make a simple argument: we are basically asking whether the linear map $\vec x\mapsto \vec x S$ transforms a basis into a basis. (If we denote by $\vec b_1,\dots,\vec b_n$ the rows of $B$, then the rows of $BS$ are $\vec b_1S,\dots,\vec b_nS$.)

The linear map $\vec x\mapsto \vec xS$ maps a basis to a basis if and only if it is a linear isomorphism. An equivalent condition is that $S$ is an invertible matrix.
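The row version can also be verified numerically. A sketch in NumPy, again with example matrices of my own choosing:

```python
import numpy as np

B = np.array([[1.0, 0.0],
              [2.0, 1.0]])   # rows b_1, b_2 form a basis of R^2
S = np.array([[0.0, 1.0],
              [1.0, 1.0]])   # invertible: det = -1

P = B @ S
for i in range(2):
    assert np.allclose(P[i, :], B[i, :] @ S)  # row i of BS equals b_i S
print(np.linalg.matrix_rank(P))  # 2: the rows b_i S again form a basis
```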


To get a new basis, the matrix $S$ must have nonzero determinant.

Leox