$M$ can be diagonalized iff the minimal polynomial $m$ for $M$ splits completely into linear, non-repeated factors $m(\lambda)=(\lambda-\lambda_1)(\lambda-\lambda_2)\cdots(\lambda-\lambda_N)$, where $\lambda_1,\lambda_2,\cdots,\lambda_N$ are the distinct eigenvalues of $M$. The usual proof of this involves the unique polynomials $p_{k}$ of degree $N-1$ such that $p_{k}(\lambda_{j})=\delta_{j,k}$, namely the Lagrange polynomials $p_k(\lambda)=\prod_{j\ne k}(\lambda-\lambda_j)/(\lambda_k-\lambda_j)$. Then $\sum_{k=1}^{N}p_{k}\equiv 1$ because $\sum_{k=1}^{N}p_{k}-1$ is a polynomial of degree at most $N-1$ that vanishes in $N$ places. Therefore,
$$
I = p_1(M)+p_2(M)+\cdots+p_N(M).
$$
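For instance, when $N=2$,
$$
p_1(\lambda)=\frac{\lambda-\lambda_2}{\lambda_1-\lambda_2},\qquad
p_2(\lambda)=\frac{\lambda-\lambda_1}{\lambda_2-\lambda_1},\qquad
p_1(\lambda)+p_2(\lambda)=\frac{(\lambda-\lambda_2)-(\lambda-\lambda_1)}{\lambda_1-\lambda_2}=1.
$$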
Furthermore, $p_j(M)p_k(M)=0$ for $j \ne k$ because $m$ divides $p_j p_k$: every root $\lambda_i$ of $m$ is a root of $p_j$ or of $p_k$, and the roots of $m$ are simple. Therefore each $p_j(M)$ is a projection matrix; to see this, multiply the above identity by $p_j(M)$:
$$
p_j(M)=p_j(M)\sum_{k=1}^{N}p_k(M)=p_j(M)^{2}.
$$
Furthermore $(M-\lambda_k I)p_k(M)=0$, because $(\lambda-\lambda_k)p_k(\lambda)$ vanishes at every root of $m$ and is therefore divisible by $m$. Multiplying the identity $I=\sum_{k=1}^{N}p_k(M)$ by $M$ then gives
$$
M = \lambda_1 p_1(M)+\lambda_2 p_2(M)+\cdots+ \lambda_N p_N(M).
$$
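As a sanity check, here is a minimal numerical sketch of the one-matrix construction in numpy; the matrix $M$, its eigenvalue list, and the helper name `lagrange_projection` are illustrative choices, not part of the argument.

```python
import numpy as np

# Illustrative example: M is diagonalizable with distinct eigenvalues 2 and 3.
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])
eigenvalues = [2.0, 3.0]

def lagrange_projection(M, eigenvalues, k):
    """p_k(M), where p_k is the Lagrange polynomial with p_k(lambda_j) = delta_jk."""
    n = M.shape[0]
    P = np.eye(n)
    for j, lam in enumerate(eigenvalues):
        if j != k:
            P = P @ (M - lam * np.eye(n)) / (eigenvalues[k] - lam)
    return P

projections = [lagrange_projection(M, eigenvalues, k)
               for k in range(len(eigenvalues))]

# The identities derived above:
assert np.allclose(sum(projections), np.eye(2))                 # sum is I
assert all(np.allclose(P @ P, P) for P in projections)          # each is idempotent
assert np.allclose(projections[0] @ projections[1], 0)          # mutual products vanish
assert np.allclose(sum(lam * P for lam, P in zip(eigenvalues, projections)), M)
```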
If $M_1,M_2,M_3,\cdots,M_J$ are commuting diagonalizable matrices, you can perform the above construction for each $M_j$ in order to obtain distinct eigenvalues $\lambda_{j,1},\lambda_{j,2},\cdots,\lambda_{j,K_{j}}$ and polynomials $p_{j,1},p_{j,2},\cdots,p_{j,K_j}$ for each $1 \le j \le J$. Because the $M_j$ commute, so do all of the $p_{j,k}(M_j)$, since each is a polynomial in one of the $M_j$. Now form all of the distinct products
$$
P_{k_1,k_2,\cdots,k_J}=p_{1,k_1}(M_1)p_{2,k_2}(M_2)\cdots p_{J,k_J}(M_J).
$$
The sum of all such products is $I$, because multiplying together the $J$ identities $I=\sum_{k=1}^{K_j}p_{j,k}(M_j)$ and expanding gives exactly this sum; and every such $P$ is a projection, because the commuting factors may be rearranged to give $P^{2}=\prod_{j=1}^{J}p_{j,k_j}(M_j)^{2}=P$. Discard the products that turn out to be $0$. Because the order of the factors may be rearranged without changing $P$, it follows that
$$
(M_{j}-\lambda_{j,k_{j}}I)P_{k_1,k_2,\cdots,k_J}=0,\;\;\; 1 \le j \le J.
$$
So there are non-zero projections $Q_{1},Q_{2},\cdots,Q_{r}$ whose sum is $I$, whose pairwise products are $0$, and such that every $M_{j}$ acts as a scalar multiple of the identity on the range of each $Q_{k}$. Choose a basis for the range of each $Q_{k}$; since the $Q_{k}$ sum to $I$ and annihilate one another, the whole space is the direct sum of these ranges. Combining these bases therefore produces a basis with respect to which each $M_j$ has a diagonal representation.
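Here is a sketch of the full construction for two commuting matrices, under the same caveats: the matrices $M_1, M_2$, the hard-coded eigenvalue lists, and the use of an SVD to pick a basis for each range are all illustrative choices.

```python
import numpy as np

# Two commuting diagonalizable matrices built from a shared (assumed
# invertible) change of basis S; the eigenvalues are chosen for the demo.
rng = np.random.default_rng(0)
S = rng.standard_normal((3, 3))
S_inv = np.linalg.inv(S)
M1 = S @ np.diag([1.0, 1.0, 2.0]) @ S_inv
M2 = S @ np.diag([5.0, 7.0, 7.0]) @ S_inv

def spectral_projections(M, eigenvalues):
    """The p_k(M) from the construction above, one per distinct eigenvalue."""
    n = M.shape[0]
    projections = []
    for k, lam_k in enumerate(eigenvalues):
        P = np.eye(n)
        for j, lam_j in enumerate(eigenvalues):
            if j != k:
                P = P @ (M - lam_j * np.eye(n)) / (lam_k - lam_j)
        projections.append(P)
    return projections

# Form the products P_{k1,k2}, discard the zero ones, and take a basis of
# each nonzero range (the leading left singular vectors span the range of Q).
columns = []
for P1 in spectral_projections(M1, [1.0, 2.0]):
    for P2 in spectral_projections(M2, [5.0, 7.0]):
        Q = P1 @ P2
        if np.allclose(Q, 0):
            continue
        r = np.linalg.matrix_rank(Q, tol=1e-10)
        u, _, _ = np.linalg.svd(Q)
        columns.append(u[:, :r])

B = np.hstack(columns)                 # the combined basis, as columns
for M in (M1, M2):
    D = np.linalg.inv(B) @ M @ B
    assert np.allclose(D, np.diag(np.diag(D)))   # diagonal in this basis
```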