22

For a matrix $A$, prove that if $\det(A)=0,$ then at least one eigenvalue of $A$ must be zero, and provide an example.

At first, I tried using the identity that the product of eigenvalues is the determinant of the matrix, so it follows that at least one must be zero for the determinant to be zero. Is this correct? Could I also prove it by using $(A-\lambda I)X=0$, for some $X\neq 0?$

If $\lambda=0,$ then we have $AX=0$, but I can't say $\det(A)\cdot \det(X)=0$ because $X$ is a vector rather than a square matrix and doesn't have a determinant. How would I continue?
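
For instance, the matrix $A=\begin{pmatrix}1&2\\2&4\end{pmatrix}$ has $\det(A)=1\cdot 4-2\cdot 2=0$ and eigenvalues $0$ and $5$, so the claim checks out in that case, but I would like a general argument.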

edward_d
  • 465

6 Answers

42

Let $p(x)=\det(A-xI)$ be the characteristic polynomial of $A$. Then $p(0)=\det(A)=0$, hence $0$ is a root of $p$ and therefore an eigenvalue of $A$.
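
As a concrete illustration (the matrix here is chosen only for the example), take $A=\begin{pmatrix}1&2\\2&4\end{pmatrix}$. Then
$$p(x)=\det\begin{pmatrix}1-x&2\\2&4-x\end{pmatrix}=(1-x)(4-x)-4=x^2-5x=x(x-5),$$
so $p(0)=0$ and the eigenvalues of $A$ are $0$ and $5$.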

Fred
  • 77,394
  • 2
    This feels a bit circular. How do you know that a root of $p$ is an eigenvalue of $A$? – JiK Jun 26 '18 at 05:47
  • 1
    @JiK: $A-\lambda I$ is singular iff $\det (A-\lambda I ) = 0$. – copper.hat Jun 26 '18 at 06:21
  • @copper.hat I'm not sure I follow. Are you saying that the proposed approach is to start from the fact that $\lambda$ is an eigenvalue iff $A-\lambda I$ is singular, then say that this is equivalent to $\det (A - \lambda I) = 0$, then define the characteristic polynomial, and finally see that $0$ is its root? That's a bit convoluted, and the way this answer is written doesn't really explain what follows from what. – JiK Jun 26 '18 at 06:35
  • 1
    I reconstruct the implied argument like this, @JiK: the characteristic polynomial of $A$, defined by $p(x) = \det(A - xI)$, is a polynomial with the eigenvalues of $A$ for its roots. $p(0) = \det(A - 0) = \det(A)$, so when $\det(A) = 0$, $0$ is a root of the characteristic polynomial, and therefore an eigenvalue of $A$. I agree that the answer as written leaves that a bit DIY. – John Bollinger Jun 26 '18 at 19:41
  • 2
    @JohnBollinger The usual argument that the characteristic polynomial has the eigenvalues as roots uses precisely the fact that if $\det(M)=0$, then $M$ has a kernel, since the kernel of $A-xI$ is just the eigenvectors with eigenvalue $x$. Unless you had some other proof of that fact in mind, that argument is circular. – Milo Brandt Jun 27 '18 at 01:17
  • Sure, @MiloBrandt, but it goes the other way, too: the proposition that $\det(M) = 0$ implies that $M$ has a kernel follows from the proposition that the eigenvalues of $M$ are the roots of its characteristic polynomial. These propositions are equivalent, so you have not shown any circularity. Inasmuch as the OP's proposition is a specific case of the more general result of eigenvalues as char. polynomial roots, however, if we do not take the latter as already proven, then going through it is rather roundabout. – John Bollinger Jun 27 '18 at 14:21
40

Here is an elementary way:

$\det(A) = 0 \Rightarrow$ the columns of $A =(c_1 \ldots c_n)$ are linearly dependent $\Rightarrow$ there is a non-zero vector $v = (v_1 \ldots v_n)^T$ such that $v_1c_1 + \cdots + v_n c_n = \vec{0} \Rightarrow Av = \vec{0} = 0\cdot v \Rightarrow 0$ is an eigenvalue of $A$, with eigenvector $v$.
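
For instance (the matrix here is only illustrative), take $A=\begin{pmatrix}1&3\\2&6\end{pmatrix}$: the second column is $3$ times the first, so $v=(3,-1)^T$ gives $3c_1-c_2=\vec{0}$, hence $Av=\vec{0}=0\cdot v$ and $v$ is an eigenvector of $A$ with eigenvalue $0$.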

Bernard
  • 175,478
  • 3
    It might be useful to add that v is an eigenvector of A. Basic linear algebra, I know, but appropriate given the question. – MSalters Jun 26 '18 at 13:44
16

I do not know what you know about the determinant and how you think of it, but the determinant of a square matrix $A$ is zero iff the matrix is not invertible, and that is equivalent to the kernel being non-trivial, which means that $Ax=0$ for some $x\ne0$. Since $Ax=0=0\cdot x$, this says precisely that $0$ is an eigenvalue of $A$, with eigenvector $x$.

Carsten S
  • 8,726
10

Since $A$ is a matrix over a field and $\det A$ equals the product of the eigenvalues of $A$, the claim follows from the property of a field that $ab=0 \Rightarrow$ either $a=0$ or $b=0$: if the product of the eigenvalues is $0$, then at least one of them must be $0$.
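
As an illustration (with a matrix chosen just for this example), the upper triangular matrix $A=\begin{pmatrix}0&1&2\\0&2&3\\0&0&5\end{pmatrix}$ has eigenvalues $0,2,5$ (its diagonal entries); their product is $0\cdot 2\cdot 5=0=\det A$, and since the product vanishes, one of the factors, here the eigenvalue $0$, must vanish.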

Amit
  • 676
9

The determinant of the matrix $A$ is also the determinant of the endomorphism $\mathbb{R}^n \rightarrow \mathbb{R}^n$ (or more generally $k^n \rightarrow k^n$) defined by multiplication by $A$. To say that $A$ has determinant $0$ is to say that this endomorphism is not injective; in particular, some non-zero vector $x$ is sent to $0$, i.e. $Ax=0=0\cdot x$, so $0$ is an eigenvalue of $A$.
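
Concretely (with an illustrative matrix), for $A=\begin{pmatrix}1&2\\2&4\end{pmatrix}$ the map $x\mapsto Ax$ sends both $(0,0)^T$ and $(-2,1)^T$ to $\vec{0}$, so it is not injective, and $(-2,1)^T$ is an eigenvector of $A$ with eigenvalue $0$.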

Suzet
  • 5,482
  • 8
    Serious question: What level of math are you at and what level of math do you think this answer is helpful for? I ask because I remember OP's question from a sophomore undergrad class in linear algebra and assume that's where they are. – user1717828 Jun 25 '18 at 10:20
  • 5
    Well, isn't my answer quite elementary? The first time I learned about determinant was in my first year of Bachelor degree (in France), and it was first defined for endomorphisms with respect to a basis. Then, we defined it for matrices exactly the way I stated in my answer, and we deduced all the computational properties of the determinant. So to my viewpoint, I just used the very definition. Also now, the OP has been given five different answers using different notions. One of them at least (if not all of them) surely will correspond to his level of understanding. – Suzet Jun 25 '18 at 10:27
  • 5
    @Suzet That's interesting. In the US, I think it's pretty common for math students to first work with vectors, matrices, and determinants around ages 13-18. At that point the students might associate the word "function" with determining the value of $y$ given a value of $x$ or drawing a function on an $x$-$y$ graph, but not many would likely know the words "endomorphism" or "injective". – aschepler Jun 26 '18 at 01:07
  • 2
    Though I wouldn't guess this question actually falls in that category. That first visit is typically just a brief piece of a more general year introducing various ideas and tools in algebra, and would touch on things like adding, multiplying, linearity, and inverses, but possibly wouldn't get to eigenvalues, and the additional skills to prove theorems might not be expected at that point, depending. – aschepler Jun 26 '18 at 01:15
  • I see! In France, age 13-18, one will never hear about determinants or matrices. Vectors may be introduced in a visual way only as "arrows" in a 2 or 3-dimensional space. It's only after high school that real, serious maths is taught to students who choose to specialize in it. In this context, $\det$ is introduced by discussing the notion of $n$-linear alternating form on a finite dimensional vector space. Eigenvalues, eigenvectors and reduction of endomorphisms/matrices are only taught in the second year of the Bachelor degree. – Suzet Jun 26 '18 at 01:17
  • 1
    TBH this is the first time I've encountered the term "endomorphism". It's a pretty trivial concept to understand, but probably that also explains why I didn't encounter it. And it seems that this answer omits why a zero determinant of the endomorphism means that the endomorphism isn't injective, so this just rewords the question in terms of vector spaces. – MSalters Jun 26 '18 at 13:56
  • 1
    @user1717828 Although I think that Suzet's answer may be worded in a way that is likely a little beyond the OP, there is no reason why this excellent and succinct answer cannot stand beside the other excellent, more accessible ones here. Indeed, once a more "elementary" answer has been given, I don't think the collection of answers can be complete without reference to the powerful, elegant and simple idea that $\det: GL(N, \mathbb{R})\to\mathbb{R}$ is the unique homomorphism between $GL(N, \mathbb{R})$ and $(\mathbb{R},\times)$. That really is an essential part of a mathematician's toolkit, no? – Selene Routley Jun 27 '18 at 08:13
5

You are correct that the product of the eigenvalues, counted with appropriate multiplicities, is the determinant.
This readily follows from the Jordan normal form.

Once this is known, the problem is solved, as a product in the base field is zero iff one of the factors is $0$.
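
To see the multiplicities at work (with a matrix chosen just for illustration): $A=\begin{pmatrix}2&-1\\4&-2\end{pmatrix}$ satisfies $A^2=0$, so its Jordan normal form is the single block $\begin{pmatrix}0&1\\0&0\end{pmatrix}$; the eigenvalue $0$ occurs with multiplicity $2$, and the product of the eigenvalues is $0\cdot 0=0=\det(A)$.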

Berci
  • 90,745