For an optimization problem of the form
$$ \min_{x \geq 0} f(x), $$ where $f: \mathbb{R}^n \rightarrow \mathbb{R}$ is a convex function, we write the KKT conditions by breaking the constraint $x \geq 0$, where $x=[x_1,x_2,\cdots,x_n]^T \in \mathbb{R}^n$, into the componentwise constraints $$ x_i \geq 0, \quad i = 1,\dots,n. $$ Associating a dual variable $\mu_i \geq 0$ with each constraint, the Lagrangian is $L(x,\mu) = f(x) - \sum_{i=1}^n \mu_i x_i$, with gradient $$ \nabla_x L(x,\mu)=\nabla f(x)-\sum_{i=1}^n\mu_i e_i = \nabla f(x)-\mu, $$ where $e_i$ denotes the $i$-th standard basis vector and $\mu=[\mu_1,\mu_2,\cdots,\mu_n]^T$.
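For reference, stationarity of the Lagrangian is only one part of the KKT system; the full set of conditions for this componentwise problem also includes primal feasibility, dual feasibility, and complementary slackness:
$$ \nabla f(x) - \mu = 0, \qquad x_i \geq 0, \qquad \mu_i \geq 0, \qquad \mu_i x_i = 0, \quad i = 1,\dots,n. $$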
Now suppose we have the following
$$ \min_{0\preceq M \preceq I} f(M) $$ where $M \in \mathbb{R}^{n \times n}$ is a symmetric positive semi-definite matrix with eigenvalues $\lambda_i(M)$ and corresponding orthonormal eigenvectors $v_i$, $i = 1,\dots,n$.
Why is the KKT stationarity condition for this problem
$$ \nabla_M L(M,\gamma,w)=\nabla f(M)+\sum_{i=1}^n\gamma_iv_iv_i^T-\sum_{i=1}^n w_iv_iv_i^T, $$ where $\gamma_i \geq 0$ and $w_i \geq 0$ are the dual variables associated with the constraints $M \preceq I$ and $M \succeq 0$, respectively?
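A sketch of where this form presumably comes from (assuming the standard semidefinite Lagrangian with matrix dual variables $\Gamma$ for $M \preceq I$ and $W$ for $M \succeq 0$): the dual variable of a semidefinite constraint is itself a positive semi-definite matrix, so
$$ L(M,\Gamma,W) = f(M) + \langle \Gamma,\, M - I \rangle - \langle W,\, M \rangle, \qquad \Gamma \succeq 0,\; W \succeq 0, $$
$$ \nabla_M L = \nabla f(M) + \Gamma - W. $$
Complementary slackness, $\Gamma(I-M) = 0$ and $WM = 0$, forces $\Gamma$ and $W$ to commute with $M$, so at optimality they are diagonal in $M$'s eigenbasis and can be expanded as
$$ \Gamma = \sum_{i=1}^n \gamma_i v_i v_i^T, \qquad W = \sum_{i=1}^n w_i v_i v_i^T, \qquad \gamma_i, w_i \geq 0, $$
which recovers the stated stationarity condition.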