I have a block matrix $M = L - I$, where $L$ takes the form $$ L= \begin{bmatrix} 0 & L_2& \ldots & L_M \\ L_1 &0 &\ldots & L_M \\ \vdots & \vdots & \ddots & \vdots \\ L_1 & L_2& \ldots& 0 \end{bmatrix} $$ and $I$ is an appropriately sized identity matrix. The $L_m$ are all negative semi-definite, each with one eigenvalue $\lambda_1 = -\frac{1}{2}$ and another eigenvalue $-\frac{1}{2} < \lambda_2 < 0$ (edit: the exact value of $\lambda_2$ depends on the block, so it may differ from one $L_m$ to another). Some of the $L_m$ may not be full rank and thus have additional eigenvalues $\lambda_3 = 0$, which reduce the multiplicity of $\lambda_2$.
I can see numerically that the largest positive eigenvalue of $L$ is $\frac{1}{2}$, and thus the largest eigenvalue of $M$ (largest in value, not in absolute value) is $\frac{1}{2} - 1 = -\frac{1}{2}$.
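For what it's worth, in the special case where all the blocks coincide, say $L_m = L_0$ for every $m$, the observation can be checked directly: there $L = (J - I_M) \otimes L_0$, where $J$ is the $M \times M$ all-ones matrix, so the eigenvalues of $L$ are the pairwise products of the eigenvalues of $J - I_M$ (which are $M-1$ and $-1$) with those of $L_0$, and the largest such product is $(-1)\cdot(-\frac{1}{2}) = \frac{1}{2}$. A quick numerical sketch of this special case (the choices $\lambda_2 = -0.3$, block size $2$, and $M = 4$ are arbitrary, not from the question):

```python
import numpy as np

# Special-case sketch: all blocks equal, L_m = L0 (an assumption, not the
# general setting of the question).  Then L = (J - I) kron L0, where J is
# the M x M all-ones matrix, and eig(L) consists of all pairwise products
# of eig(J - I) = {M-1, -1} with eig(L0) = {-1/2, lam2}.
lam2 = -0.3                                        # arbitrary value in (-1/2, 0)
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((2, 2)))   # random orthogonal basis
L0 = Q @ np.diag([-0.5, lam2]) @ Q.T               # symmetric, eigenvalues -1/2 and lam2

M_blocks = 4                                       # number of diagonal blocks
J = np.ones((M_blocks, M_blocks))
L = np.kron(J - np.eye(M_blocks), L0)              # the block matrix from the question

eigs = np.linalg.eigvalsh(L)
print(eigs.max())   # products {M-1, -1} x {-1/2, lam2}; max is (-1)*(-1/2) = 0.5
```

With distinct blocks this Kronecker factorization no longer applies, which is exactly where the question starts.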
Is there a way to prove this numerical result based on the known eigenvalues of the $L_m$ and the structure of $L$?
Thank you!
Edit: $\lambda_i$ for $i = 1, 2, 3$ are the only eigenvalues of the $L_m$. $\lambda_3 = 0$ may not be an eigenvalue of every $L_m$, but those $L_m$ that do have $\lambda_3$ as an eigenvalue are not invertible, since some of their rows and the corresponding columns are entirely zero.