
If I have an equation of the form

$${\lambda ^2}{I_N} + \lambda {M_1} + {M_2} = {0_N}$$

where $I_N$ is the identity matrix of order $N$, $M_1$ and $M_2$ are $N \times N$ matrices, and $\lambda \in \mathbb{C}$ is a complex number.

What are the mathematical tools or frameworks for solving this kind of equation?

Dan
    Looking at the equation for one coefficient gives you two possible values for $\lambda$. You can test both of them to see if they solve the equation. – Quimey Jan 08 '13 at 14:18
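
A minimal numerical sketch of the test Quimey describes, assuming NumPy; the matrices `M1` and `M2` below are illustrative examples, not taken from the question:

```python
import numpy as np

# Illustrative example matrices (not from the question).
M1 = np.array([[-3.0, 0.0], [0.0, -3.0]])
M2 = np.array([[2.0, 0.0], [0.0, 2.0]])
N = M1.shape[0]
I = np.eye(N)

# Pick one diagonal coefficient: lambda^2 + lambda*M1[i,i] + M2[i,i] = 0.
i = 0
candidates = np.roots([1.0, M1[i, i], M2[i, i]])  # the two possible values of lambda

# Test whether either candidate solves the full matrix equation.
for lam in candidates:
    residual = lam**2 * I + lam * M1 + M2
    if np.allclose(residual, np.zeros((N, N))):
        print(f"lambda = {lam} solves the equation")
```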

3 Answers


For the equation to have a solution, your matrices $M_1$ and $M_2$ must necessarily commute, and this is only one of the requirements. To see this, consider the case of a diagonalizable $M_2$, so that $PM_2P^{-1} = D$ for some invertible $P$ and diagonal $D$:

\begin{align} P\left(\lambda^2 I_N + \lambda M_1 + M_2\right)P^{-1} &= P\,0_N\,P^{-1} \\ \lambda^2 I_N + \lambda PM_1P^{-1} + D &= 0_N. \end{align}

Here we can see that $PM_1P^{-1}$ must itself be diagonal, i.e. $M_1$ must be diagonalized by the same $P$ as $M_2$ (since otherwise the non-zero off-diagonal entries of $\lambda PM_1P^{-1}$ could not be cancelled in the sum), and moreover its eigenvalues must be such that a single $\lambda$ simultaneously solves the quadratic in every diagonal entry.

tl;dr: solve for $\lambda_1$ and $\lambda_2$ at any desired coordinate, then check whether either one works globally. If neither does, there is no solution.
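
A small numerical sketch of the necessary conditions described in this answer, assuming NumPy and a diagonalizable $M_2$; the matrices are illustrative only:

```python
import numpy as np

# Illustrative example matrices (not from the question).
M1 = np.array([[-3.0, 0.0], [0.0, -4.0]])
M2 = np.array([[2.0, 0.0], [0.0, 3.0]])

# Necessary condition 1: M1 and M2 must commute.
commute = np.allclose(M1 @ M2, M2 @ M1)

# Diagonalize M2: the columns of V are eigenvectors, so V^{-1} M2 V is diagonal
# (this is P M2 P^{-1} = D with P = V^{-1} in the notation of the answer).
eigvals, V = np.linalg.eig(M2)
D = np.linalg.inv(V) @ M2 @ V
A = np.linalg.inv(V) @ M1 @ V  # must also come out diagonal

same_eigenbasis = np.allclose(A, np.diag(np.diag(A)))
print("M1, M2 commute:", commute)
print("M1 diagonal in M2's eigenbasis:", same_eigenbasis)

# With both conditions met, a solution exists iff a single lambda solves
# lambda^2 + lambda*A[k,k] + D[k,k] = 0 for every k.
```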

adam W
  • I've verified that this equation has no solution, because $M_1$ and $M_2$ do not commute and do not share the same spectrum. Thanks a lot. – Dan Jan 10 '13 at 10:17
  • @Babak Thanks, and let me know when to delete my other comments on your question; I will look up and refresh my memory on the array syntax in the meantime. – adam W Jan 16 '13 at 16:04
  • You pointed out something very important to me, and I want you to accept my thanks. Let the comments stay; they contain some useful points I didn't know. ;-) – Mikasa Jan 16 '13 at 16:08

If the equality is to hold, then for each pair of entries $ \{M_1\}_{ij} $ and $ \{M_2\}_{ij} $ we need $$ \lambda^2 \delta_{ij} + \lambda \{M_1\}_{ij} + \{M_2\}_{ij} = 0, $$

where $\delta_{ij}$ is the Kronecker delta and $ 1 \leq i,j \leq N $.

There's probably a better way of investigating this; I'm having a think about that now. Certainly, from the $i \neq j$ case (whenever $\{M_1\}_{ij} \neq 0$) we are left with an equation that is linear in $\lambda$, so at most one value of $\lambda$ can be a solution.
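
A short sketch of how the entrywise equations pin $\lambda$ down, assuming NumPy; the matrices are illustrative only:

```python
import numpy as np

# Illustrative matrices with a non-zero off-diagonal entry (not from the question).
M1 = np.array([[-2.0, 1.0], [0.0, -2.0]])
M2 = np.array([[1.0, -1.0], [0.0, 1.0]])
N = M1.shape[0]

# For i != j the entrywise equation is lambda*M1[i,j] + M2[i,j] = 0,
# which is linear in lambda, so any such entry with M1[i,j] != 0
# determines at most one candidate.
lam = None
for i in range(N):
    for j in range(N):
        if i != j and M1[i, j] != 0:
            lam = -M2[i, j] / M1[i, j]
            break
    if lam is not None:
        break

if lam is not None:
    residual = lam**2 * np.eye(N) + lam * M1 + M2
    print("candidate lambda:", lam, "works:", np.allclose(residual, 0))
```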

Andrew D

Indeed, by adam W's answer, $M_1$ is without loss of generality in Jordan canonical form, so that $M_1 = \bigoplus_{k=1}^M (\mu_k I_{n_k} + N_{n_k})$ for $\mu_k$ the eigenvalues (counted with the relevant notion of multiplicity) and $N_{n_k}$ the appropriate nilpotent matrices. Then $$\lambda^2 I_N + \lambda M_1 = \bigoplus_{k=1}^M ((\lambda^2 + \lambda\mu_k) I_{n_k} + \lambda N_{n_k}),$$ so that $M_2$ must necessarily have the analogous block diagonal form $$M_2 = \bigoplus_{k=1}^M (\alpha_k I_{n_k} + \beta N_{n_k})$$ for some constants $\alpha_k$ and $\beta$. Hence, when the dust settles, you're left with the system of quadratic equations $$\lambda^2 + \mu_k \lambda + \alpha_k = 0$$ together with the additional equation $\lambda = -\beta$ whenever $M_1$ (and hence also $M_2$) is not diagonal. I think this should be correct?
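
A rough SymPy sketch of this reduction, under the assumption that `Matrix.jordan_form()` supplies the Jordan basis of $M_1$; the matrices are illustrative, and the final loop simply verifies each surviving candidate against the full equation (which also enforces the $\lambda = -\beta$ constraint coming from the nilpotent part):

```python
import sympy as sp

# Illustrative matrices (not from the question).
M1 = sp.Matrix([[-3, 1], [0, -3]])
M2 = sp.Matrix([[2, -1], [0, 2]])
N = M1.shape[0]

# Put M1 into Jordan form: J = P^{-1} * M1 * P.
P, J = M1.jordan_form()
B = P.inv() * M2 * P  # M2 expressed in the same basis

lam = sp.symbols('lambda')

# Candidate lambdas: common roots of the diagonal quadratics
# lambda^2 + J[k,k]*lambda + B[k,k] = 0.
candidates = None
for k in range(N):
    roots = set(sp.solve(lam**2 + J[k, k]*lam + B[k, k], lam))
    candidates = roots if candidates is None else candidates & roots

# Verify each surviving candidate against the full matrix equation.
for val in candidates:
    residual = val**2 * sp.eye(N) + val * M1 + M2
    if residual.applyfunc(sp.simplify) == sp.zeros(N, N):
        print("lambda =", val, "solves the equation")
```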