1

I have this matrix here and I have to check whether it's invertible. Do I need to use the Laplace method for this? For a simple 2×2 matrix that looks like \begin{bmatrix}a&b\\c&d\end{bmatrix}, the determinant would be ad - bc; if it doesn't equal 0, then the matrix is invertible. But how do I approach bigger matrices?

\begin{bmatrix}0&1&1&1\\1&0&1&1\\1&1&0&1\\1&1&1&0\end{bmatrix}

Bernard
  • 175,478
etoRatio
  • 247
  • 1
    You can use your favorite method to calculate the determinant, or you can just try to calculate the inverse. Eventually you will have to calculate the inverse anyway, so this might spare you calculating the determinant... It is still helpful to know whether the inverse exists in the first place... – Cornman Jul 25 '21 at 20:18
  • My question says that I have to check whether it's invertible first, then proceed with solving the rest. So I guess I have to. – etoRatio Jul 25 '21 at 20:20
  • Well, then use the Laplace method, or any other method you know. I would go with Laplace. – Cornman Jul 25 '21 at 20:20

3 Answers

1

As mentioned in the comments, the Laplace (cofactor) expansion is a good way to hand-calculate determinants of small matrices.

https://en.m.wikipedia.org/wiki/Laplace_expansion

Here, expanding along the first row, we can ignore the first term since its entry is zero.
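Concretely, expanding along the first row (with cofactor signs $(-1)^{1+j}$) and then evaluating each $3 \times 3$ minor by a further expansion, the calculation sketches out as

$$\det\begin{bmatrix}0&1&1&1\\1&0&1&1\\1&1&0&1\\1&1&1&0\end{bmatrix} = -\det\begin{bmatrix}1&1&1\\1&0&1\\1&1&0\end{bmatrix} + \det\begin{bmatrix}1&0&1\\1&1&1\\1&1&0\end{bmatrix} - \det\begin{bmatrix}1&0&1\\1&1&0\\1&1&1\end{bmatrix} = -(1) + (-1) - (1) = -3 \neq 0,$$

so the matrix is invertible.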

Annika
  • 6,873
  • 1
  • 9
  • 20
1

Computing the determinant of the $4 \times 4$ matrix would certainly work, but for large enough matrices (and I would consider $4 \times 4$ just large enough), it is quicker to row reduce the matrix. The matrix is invertible if and only if you can row reduce it to an upper triangular matrix (row-echelon form) with non-zero entries on the diagonal. In other words, there is a pivot in each column.
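For this particular matrix, one possible row reduction (a sketch; other sequences of row operations work just as well) is

$$\begin{bmatrix}0&1&1&1\\1&0&1&1\\1&1&0&1\\1&1&1&0\end{bmatrix} \xrightarrow{R_1\leftrightarrow R_2} \begin{bmatrix}1&0&1&1\\0&1&1&1\\1&1&0&1\\1&1&1&0\end{bmatrix} \xrightarrow[R_4-R_1]{R_3-R_1} \begin{bmatrix}1&0&1&1\\0&1&1&1\\0&1&-1&0\\0&1&0&-1\end{bmatrix} \xrightarrow[R_4-R_2]{R_3-R_2} \begin{bmatrix}1&0&1&1\\0&1&1&1\\0&0&-2&-1\\0&0&-1&-2\end{bmatrix} \xrightarrow{R_4-\frac{1}{2}R_3} \begin{bmatrix}1&0&1&1\\0&1&1&1\\0&0&-2&-1\\0&0&0&-\frac{3}{2}\end{bmatrix},$$

which has four non-zero pivots ($1$, $1$, $-2$, $-\tfrac{3}{2}$), so the matrix is invertible.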


As an aside that you don't need to understand at the moment: with a slightly more advanced eye, you can quickly see that this matrix is indeed invertible, as it is $J - I$, where $J$ is the matrix of $1$s. The matrix of $1$s has eigenvalues $0$ and $4$ ($0$ and $n$ more generally, when working with $n \times n$ matrices); see here. Since $Jv = \lambda v$ implies $(J - I)v = (\lambda - 1)v$, the eigenvalues of $J - I$ are $-1$ and $3$. As these don't include $0$, the matrix is invertible.

Theo Bendit
  • 50,900
0

The columns of your matrix are linearly independent (easy enough to check with the definition of linear independence). Any matrix with linearly independent columns is injective, and every injective square matrix is also surjective, hence invertible (this follows from the rank-nullity theorem).
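That independence check can be sketched in one line (writing $M$ for your matrix, a name not used above): if $x = (x_1, x_2, x_3, x_4)^T$ satisfies $Mx = 0$ and $s = x_1 + x_2 + x_3 + x_4$, then each row of $Mx = 0$ reads

$$s - x_i = 0 \quad \text{for } i = 1, \dots, 4,$$

so every $x_i$ equals $s$; summing the four equations gives $4s - s = 3s = 0$, hence $s = 0$ and $x = 0$.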

Alternatively, if $J$ is the four by four matrix with all entries $1$, then your matrix is $J-I$, where $I$ is the four by four identity matrix. The eigenvalues of $J$ are $4$ once and $0$ three times, so the eigenvalues of $J-I$ are $3$ once and $-1$ three times. Since the determinant of a matrix is the product of its eigenvalues, $\det(J-I)=3\cdot(-1)^3=-3\neq 0$, hence $J-I$ is invertible.

C Squared
  • 3,648
  • 1
  • 9
  • 32