
A $3 \times 3$ matrix $B$ has eigenvalues $0$, $1$, and $2$. Find the rank of $B$.

I understand that $0$ being an eigenvalue implies that the rank of $B$ is less than $3$.

The solution is here (right at the top). It says that the rank of $B$ is $2$ because the number of distinct nonzero eigenvalues is $2$.

This thread says that the only information the rank gives is about the eigenvalue $0$ and its corresponding eigenvectors.

What am I missing?


Edit
What I am really looking for is an explicit answer to this:

"Is the rank of a matrix equal to the number of distinct nonzero eigenvalues of that matrix?"
Yes/No/Yes for some special cases

user26857

3 Answers


Take for example the matrix $A= \begin{bmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 0 \end{bmatrix}$; its rank is obviously $2$, and its eigenvalues are distinct: $0, 1, 2$.
There is a theorem which says that if the eigenvalues are distinct then their eigenvectors must be linearly independent, and in that case the rank of the matrix is $n-1$ if one of these eigenvalues is zero.
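This can be checked numerically; a minimal sketch using NumPy (the library choice is mine, not part of the answer):

```python
import numpy as np

# The example matrix from the answer: eigenvalues 1, 2, 0 on the diagonal.
A = np.diag([1.0, 2.0, 0.0])

rank = int(np.linalg.matrix_rank(A))
eigvals = sorted(np.linalg.eigvals(A).real)

print(rank)      # 2
print(eigvals)   # [0.0, 1.0, 2.0]
```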

Edit after question edit

To answer more generally we need Jordan forms.

Let $A$ be an $n \times n$ matrix with $n-k$ nonzero eigenvalues and $k$ zero eigenvalues, both counted with multiplicity (if all eigenvalues are distinct there are $n-k$ distinct nonzero values; if some are repeated, their multiplicities sum to $n-k$).

Express $A$ through similarity with the Jordan normal form: $A=PJP^{-1}$. The matrix $J$ can be written in block form $J= \begin{bmatrix} J_n & 0 \\ 0 & J_z \end{bmatrix}$, where $J_n$ is the square part of the Jordan matrix with the $n-k$ nonzero values (the nonzero eigenvalues) on its diagonal, and $J_z$ is the square part with the $k$ zero values on its diagonal.

Since the diagonal of the upper-triangular matrix $J_n$ contains only nonzero values, its determinant (the product of those values) is nonzero, so $J_n$ has full rank, i.e. $n-k$.

The rank of $J_z$ depends on the detailed form of this matrix: it can range from $0$ (when $J_z=0$) to $k-1$ (see the examples in the comments). It can't be $k$ because $J_z$ is singular.

Therefore the rank of $J$ can be anywhere from $n-k$ to $n-1$.
Similarity preserves rank, so the rank of $A$ can likewise be anywhere from $n-k$ to $n-1$.
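Both extremes of this bound can be realized; a sketch for $n = 5$, $k = 2$ (NumPy is my choice of tool here, not part of the original answer):

```python
import numpy as np

# J_n carries the n - k = 3 nonzero eigenvalues;
# J_z is the k x k = 2 x 2 part with zero eigenvalues.
J_n = np.diag([1.0, 2.0, 3.0])

J_z_cases = [
    np.zeros((2, 2)),                     # J_z = 0       -> rank(J) = n - k = 3
    np.array([[0.0, 1.0], [0.0, 0.0]]),   # nilpotent J_z -> rank(J) = n - 1 = 4
]

ranks = []
for J_z in J_z_cases:
    J = np.block([[J_n, np.zeros((3, 2))],
                  [np.zeros((2, 3)), J_z]])
    ranks.append(int(np.linalg.matrix_rank(J)))

print(ranks)   # [3, 4]
```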

Widawensen
  • Thank you for answering! If an eigenvalue is zero, I know that $A$ has rank $< 3$, but how do I know for sure it's $2$ and not $1$, or in general $n-2$ or $n-3$, etc.? – jumpmonkey Jun 29 '17 at 11:56
  • Because it has two distinct non-zero eigenvalues in this case. – Widawensen Jun 29 '17 at 11:59
  • I think I see it. This was my problem: how do I know for sure there is only one independent eigenvector with the eigenvalue $0$? For example, if there were two vectors with eigenvalue $0$, the rank of $A$ would be $1$. Since these are vectors in $\mathbb{R}^3$, there can be at most 3 independent vectors, and we have 3 distinct eigenvalues giving 3 independent eigenvectors. So I can be certain that the eigenvalue $0$ has only one vector. – jumpmonkey Jun 30 '17 at 08:01
  • have I got it right? – jumpmonkey Jun 30 '17 at 08:02
  • @jumpmonkey Yes, 0 in this case has only one eigenvector. There is no other possibility. – Widawensen Jun 30 '17 at 08:06
  • If I remember correctly, for the eigenvalues 0, 0, 1, 1, 1 you gave two example matrices with different ranks. So the answer to the main question is: we can't tell the rank just from the number (algebraic multiplicity) of zero or nonzero eigenvalues. Right? – jumpmonkey Jun 30 '17 at 08:11
  • @jumpmonkey Unfortunately the multiplicity of the eigenvalues doesn't tell everything; the problem is more complicated. I could once again give two examples of matrices with exactly the same eigenvalues and multiplicities but different ranks. This is closely related to the so-called Jordan forms of matrices, but explaining that in this thread would be a very substantial extension of the question... maybe you should ask again. I'll post you examples. – Widawensen Jun 30 '17 at 08:20
  • I will leave it at this. Thank you very much for your help. – jumpmonkey Jun 30 '17 at 08:26
  • $A= \begin{bmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix}$, $\ \ \ \text{rank}\ 3$. $ \ \ \ \ $

    $B= \begin{bmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix}$, $\ \ \text{rank}\ 4$

    • they have the same characteristic polynomial $\lambda^2(\lambda -1)^3$.

    Experiment with other matrices of this type at http://wims.unice.fr/~wims/en_tool~linear~matrix.en.phtml.

    – Widawensen Jun 30 '17 at 08:27
  • Is this approach wrong: We know $0$ is an eigenvalue but multiple eigenvectors could be associated with it. The matrix can have a maximum of $n$ linearly independent eigenvectors (2 of them already associated with the distinct eigenvalues and so only one left to associate with $0$). This means the kernel must have dimension of 1 leading to rank of 2. This is what the other answer seems to say. I ask because your answer seems to go into greater depth. Is something missing in this approach? – sprajagopal Jan 21 '21 at 14:54
  • @sprajagopal If you have a single zero eigenvalue, certainly the kernel must be one-dimensional; however, if zero is an eigenvalue with greater multiplicity, the rank depends on the form of the Jordan block - that is the answer to the additional part of the question after the edit. In the case where the multiplicity of the zero eigenvalue equals 1, your reasoning is sufficient. – Widawensen Jan 21 '21 at 15:52
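The two $5 \times 5$ example matrices from the comments above can also be verified directly; a NumPy sketch (my illustration, not from the thread):

```python
import numpy as np

# Same characteristic polynomial lambda^2 (lambda - 1)^3, hence the same
# eigenvalues with multiplicities, but different ranks.
A = np.diag([1.0, 1.0, 1.0, 0.0, 0.0])

B = A.copy()
B[3, 4] = 1.0   # a 2 x 2 nilpotent Jordan block for the zero eigenvalue

rank_A = int(np.linalg.matrix_rank(A))
rank_B = int(np.linalg.matrix_rank(B))
print(rank_A, rank_B)   # 3 4
```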

If the linear transformation has eigenvectors with distinct eigenvalues, then they are linearly independent (check it, it's easy). Therefore we know that $\lambda_1 u$, $\lambda_2 v \in \operatorname{Im}(T)$, where $u$ and $v$ are the eigenvectors. Since they are nonzero and linearly independent (the fact that they are multiplied by constants does not change this), we know that $\dim(\operatorname{Im}(T)) \geq 2$.

  • Thanks for answering! It looks like you are talking about the dimension and rank of the span of eigen vectors, and not the matrix itself? (please check the edit to my question) – jumpmonkey Jun 29 '17 at 11:54
  • No, the rank is the dimension of the image of a linear transformation (what you call a matrix); therefore, when I find the vectors $\lambda_1 u$ and $\lambda_2 v$ in the image, I have found two linearly independent vectors in it. This means that the dimension of the image (the rank) is at least two. But because zero is an eigenvalue, we know that the kernel has dimension one. From the rank-nullity theorem ($\dim(V) = \dim(\operatorname{Ker}(T)) + \dim(\operatorname{Im}(T))$), we conclude rank $= 2$. – Francisco Maion Jun 29 '17 at 12:33
  • I was wondering how the kernel of $T$ is 1-dimensional... We know $0$ is an eigenvalue but multiple eigenvectors could be associated with it. I guess $T$ can have a maximum of $n$ linearly independent eigenvectors (2 of them already associated with the distinct eigenvalues and so only one left to associate with $0$) – sprajagopal Jan 21 '21 at 14:51
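The rank-nullity accounting discussed in this answer can be checked numerically; a minimal sketch, assuming NumPy and a diagonal representative of the matrix in question:

```python
import numpy as np

# dim(Ker T) + dim(Im T) = dim(V) = 3 for a 3 x 3 matrix with
# eigenvalues 0, 1, 2 (taken diagonal here for simplicity).
T = np.diag([0.0, 1.0, 2.0])

rank = int(np.linalg.matrix_rank(T))                # dim(Im T)
singular_values = np.linalg.svd(T, compute_uv=False)
nullity = int(sum(s < 1e-12 for s in singular_values))  # dim(Ker T)

print(rank, nullity, rank + nullity)   # 2 1 3
```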

Because the eigenvalues are distinct, the eigenvectors are linearly independent, and they span an eigenbasis. In that basis the matrix is $$ \begin{bmatrix} 0 &0 &0 \\ 0 &1 &0 \\ 0 &0 &2 \\ \end{bmatrix} $$ and the rank is 2. The rank is the dimension of the space spanned by the images of all vectors in the domain.

Violapterin
  • Thanks for answering! I'm not sure what 'image' or 'current space' means, but it looks like you are talking about the dimension and rank of the span of eigenvectors, and not the matrix itself? (please check the edit to my question) – jumpmonkey Jun 29 '17 at 11:51