Questions tagged [diagonalization]

For questions about matrix diagonalization. Diagonalization is the process of finding a corresponding diagonal matrix for a diagonalizable matrix or linear map. This tag is NOT for diagonalization arguments common to logic and set theory.

A square matrix $A$ is diagonalizable if there is an invertible matrix $P$ such that $P^{-1}AP$ is a diagonal matrix. One can view $P$ as a change-of-basis matrix: if $A$ is viewed as the matrix of a linear map $T$ from a vector space to itself with respect to some basis, it is equivalent to say that there exists an ordered basis in which the matrix of $T$ is diagonal. The resulting diagonal matrix has the eigenvalues of the corresponding linear transformation along its diagonal. A square matrix that is not diagonalizable is called defective.
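For instance (a small worked example, with the entries chosen purely for illustration): $$A=\begin{pmatrix}1&2\\0&3\end{pmatrix},\qquad P=\begin{pmatrix}1&1\\0&1\end{pmatrix},\qquad P^{-1}AP=\begin{pmatrix}1&0\\0&3\end{pmatrix},$$ so the columns of $P$ are eigenvectors of $A$ for the eigenvalues $1$ and $3$, which appear along the diagonal of $P^{-1}AP$.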

Not every matrix is diagonalizable over $\mathbb{R}$ (i.e. allowing only real matrices $P$). For example, $$\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$$ has characteristic polynomial $t^2+1$ and hence no real eigenvalues, so it is not diagonalizable over $\mathbb{R}$.
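Over $\mathbb{C}$, however, this matrix is diagonalizable, since its eigenvalues are $\pm i$: $$\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}=P\begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}P^{-1},\qquad P=\begin{pmatrix} 1 & 1 \\ -i & i \end{pmatrix}.$$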

Diagonalization can be used to compute the powers of a matrix $A$ efficiently, provided the matrix is diagonalizable.
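Indeed, if $P^{-1}AP=D$, then $A=PDP^{-1}$, and the powers of $A$ reduce to entrywise powers of the diagonal entries: $$A^k=\left(PDP^{-1}\right)^k=PD^kP^{-1},\qquad D^k=\operatorname{diag}\!\left(\lambda_1^k,\ldots,\lambda_n^k\right).$$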

Diagonalization procedure:

Let $A$ be the $n\times n$ matrix that you want to diagonalize (if possible).

  • Find the characteristic polynomial $p(t)$ of $A$.
  • Find the eigenvalues $\lambda$ of $A$ and their algebraic multiplicities from the characteristic polynomial $p(t)$.
  • For each eigenvalue $\lambda$ of $A$, find a basis of the eigenspace $E_\lambda$. If there is an eigenvalue $\lambda$ whose geometric multiplicity $\dim(E_\lambda)$ is less than its algebraic multiplicity, then the matrix $A$ is not diagonalizable. Otherwise, $A$ is diagonalizable; proceed to the next step.

  • Combining the basis vectors of all the eigenspaces, we obtain $n$ linearly independent eigenvectors $v_1, v_2, \ldots, v_n$.

  • Define the nonsingular matrix $$P=[v_1\quad v_2\quad …\quad v_n]$$
  • Define the diagonal matrix $D$, whose $(i,i)$-entry is the eigenvalue $λ$ such that the $i^{th}$ column vector $v_i$ is in the eigenspace $E_λ$.
  • Then the matrix $A$ is diagonalized as $$P^{-1}AP=D.$$ A short numerical sketch of this procedure is given below.
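As a rough illustration only, here is a minimal numerical sketch of the procedure in Python with NumPy. It does not follow the steps literally: instead of working with the characteristic polynomial symbolically, it takes the eigenvalues and eigenvectors from `numpy.linalg.eig` and uses a rank check with a tolerance as a numerical stand-in for comparing geometric and algebraic multiplicities. The function name `diagonalize` and the tolerance `tol` are illustrative choices, not a standard API.

```python
import numpy as np

def diagonalize(A, tol=1e-10):
    """Try to diagonalize a square matrix A numerically.

    Returns (P, D) with A ~ P @ D @ inv(P) if A appears diagonalizable,
    or None if the computed eigenvectors do not span the whole space.
    """
    eigenvalues, eigenvectors = np.linalg.eig(A)  # columns are candidate eigenvectors v_1, ..., v_n
    n = A.shape[0]
    # A is diagonalizable exactly when it has n linearly independent
    # eigenvectors, i.e. when the eigenvector matrix has full rank.
    if np.linalg.matrix_rank(eigenvectors, tol=tol) < n:
        return None
    P = eigenvectors           # P = [v_1  v_2  ...  v_n]
    D = np.diag(eigenvalues)   # eigenvalues along the diagonal
    return P, D

# Example: the 2x2 matrix from the worked example above
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
result = diagonalize(A)
if result is not None:
    P, D = result
    print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```

The tolerance in the rank check matters: for a defective matrix, `eig` still returns $n$ eigenvectors, but they are linearly dependent up to roundoff, which is what the rank test is intended to detect.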

References:

Diagonal Matrix on Wikipedia

Matrix Diagonalization on Wolfram MathWorld

2484 questions
6 votes, 2 answers

What is the computational complexity of diagonalizing a covariance matrix?

I am considering using an algorithm called Covariance Matrix Adaptation Evolution Strategy (CMA-ES) for global optimization. Part of the algorithm involves taking a square root of a covariance matrix, which requires that the matrix be diagonalized.…
null83 • 61
5 votes, 1 answer

Diagonalising a matrix comprising blocks of diagonal matrices

I would like to diagonalise a matrix $M$ of the form $$ M = \left(\begin{matrix} M_{11}&\ldots&M_{1n}\\ \vdots&\ddots&\vdots\\ M_{n1}&\ldots&M_{nn} \end{matrix}\right), $$ where each element $M_{ij}$ is a square diagonal matrix itself and, in…
3 votes, 2 answers

Bijection invalidating Cantor's diagonalization argument?

I am a software engineer without a math degree, so I am planning to learn something today. Take this bijection between the naturals and reals. (This is a valid bijection, no?) ...03020 => 0.02030... ...11111 => 0.11111... ...51413 =>…
3 votes, 1 answer

Which are there more of: diagonalizable matrices or non-diagonalizable matrices?

Are there more diagonalizable matrices or non-diagonalizable matrices?
3 votes, 3 answers

Proof: If $P^2=P$, then $P$ is diagonalizable

$\textbf{Definition:}$ Let $P$ be a matrix. We say $P$ is a diagonalizable matrix iff $\exists$ an invertible matrix $A$: $\exists$ a diagonal matrix $D$: $A^{-1}PA=D$. I asked a question earlier as to how to prove the following proposition. I am on a…
W. G. • 1,766
2 votes, 2 answers

Matrix diagonalization and operators

Let $V=\Bbb F^{m\times n}$ and $T: V\to V$ be given by $T(B)=P^{-1}BP$ for any $B$ in $V$, where $P$ is an invertible matrix. Prove that if $A$ is an eigenvector of $T$ with eigenvalue $\lambda$ and $A$ is a diagonalizable matrix, then $\lambda=1$. I know…
Shirly • 279
2 votes, 1 answer

If $A$ is a row vector $1\times n$, is $A^tA$ always diagonalizable?

If $A$ is a row vector $1\times n$, then is $$A^tA$$ always diagonalizable? Multiplying the transpose of $A$ by $A$ gives an $n\times n$ matrix, and I was able to prove that this matrix is diagonalizable over $\mathbb R$, but I can't prove it over finite fields or…
2 votes, 3 answers

Proving the following matrix is diagonalizable

I'm asked to prove that the matrix $A\in M_{n}(\mathbb C)$ that satisfies $A^8+A^2=I$ is diagonalizable. I've tried looking at the equation $x^8+x^2-1=0$ and determining whether $M_A$ has any repeated roots, but this got me nowhere. Afterwards, I…
Math101 • 4,568
2 votes, 1 answer

Diagonalise a sparse (symmetric) matrix with elements only on some diagonals

Is there an analytical way or a good approximation or any other mathematical method to diagonalise a sparse (symmetric) matrix with elements only on some diagonals? For example $$ \begin{bmatrix} B & 0 & 0 & A & 0\\ 0 & B & 0 & 0 & A\\ 0 & 0 & B & 0…
2 votes, 1 answer

Is A diagonalizable over $\mathbb{C}$?

Consider the matrix $$A=\left(\begin{array}{cccc} 1 &1 &-3 &0 \\ -1 &-1 &1 &-2 \\ -1 &-1 &-1 &-2 \\ 1 &1 &2 &3 \\ \end{array}\right)$$ The characteristic polynomial of $A$ is $\chi _A(x)=x^2(x-1)^2.$ Is $A$ diagonalizable over…
1 vote, 0 answers

Rational canonical forms of a matrix

Let $A$ be a $4\times 4$ matrix over the reals with $A^3-A^2+A=0$. That means that its minimal polynomial is $x$, $x^2-x+1$, or $x^3-x^2+x$. In the first case $A=0$. The second case is…
Kal S. • 3,781
1 vote, 2 answers

Proving that $\dim(\mathrm{span}({I_n,A,A^2,...})) \leq n$

Let $A$ be an $n\times n$ matrix. Prove that $\dim(\mathrm{span}({I_n,A,A^2,...})) \leq n$. I'm at a total loss here... Can someone help me get started?
1 vote, 2 answers

How to find the values of a triangular matrix if it is diagonalizable

If this matrix is diagonalizable then what are the values of $a_1$ through $a_6$? $$\begin{bmatrix} 3&a_1&a_2&a_3\\ 0&3&a_4&a_5\\ 0&0&3&a_6\\ 0&0&0&3 \end{bmatrix} $$ I understand that the eigenvalues are $3$, but how do you solve for the variables?
1 vote, 1 answer

Diagonalization of complex Hermitian matrices

I thought $S$ is self-adjoint because $S^* = (A^*A)^* = A^*(A^*)^* = A^*A = S$. Therefore, the spectral theorem implies there is an orthonormal basis $\{e_1, \ldots, e_n\}$ of eigenvectors of $S$. But I guess I'm far from the solution. I would be very happy if you have…
A O • 51
1 vote, 1 answer

Reversing matrix diagonalisation

Somewhere in a proof I used the fact that for a vector $a\in\mathbb R^n$, the matrix $a a^T$ is symmetric and can thus be diagonalised into $a a^T = Q^T D Q$, with $Q$ orthogonal and $D$ diagonal. I needed this to prove that, given $x\in \mathbb…