Suppose we have the following optimization problem
$$ \min_{0\preceq M \preceq I} y^TMy $$ where $y \in \mathbb{R}^n$ and $M \in \mathbb{R}^{n \times n}$ is a positive semi-definite matrix. Notice that the optimization variable is a matrix.
Is there an algebraic way to handle this in terms of $M$?
When we write it in standard form we have the following:
$$ \min_M y^TMy $$ $$ \text{s.t.}\,\,\,\,\,\, {g_1(M)=-M \preceq 0 } $$ $$ \phantom{\text{s.t.}\,\,\,\,\,\,} {g_2(M)=M-I \preceq 0 } $$ These are matrix inequalities. If the constraints were vector inequalities, this would be straightforward, but what do we do when they are in matrix form?
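My guess (and this is an assumption on my part, not something I have seen stated for this problem) is that each scalar multiplier should become a PSD matrix multiplier $\Lambda_i \succeq 0$, paired with the constraint through the trace inner product $\langle A,B\rangle = \operatorname{tr}(A^TB)$:

$$ \mathcal{L}(M,\Lambda_1,\Lambda_2) = y^TMy + \langle \Lambda_1, g_1(M)\rangle + \langle \Lambda_2, g_2(M)\rangle, \qquad \Lambda_1,\Lambda_2 \succeq 0, $$

with complementary slackness becoming $\langle \Lambda_i, g_i(M)\rangle = 0$. Is this the right generalization?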
I want to write the first-order optimality conditions using the Lagrangian in terms of the gradients of $g_1(M)$ and $g_2(M)$, but since the constraints are matrix-valued, their "gradients" with respect to $M$ are simply $-I$ and $I$ (as linear maps), and it is not clear how to pair them with multipliers. The following answer explains it using a different view:
What is the KKT condition for constraint $M \preceq I$?
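To make my question concrete, here is a small numerical sketch of what I think the matrix-multiplier KKT conditions should look like (this is my own attempt, assuming a trace-inner-product Lagrangian with the constraints written as $-M \preceq 0$ and $M - I \preceq 0$; all variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
y = rng.standard_normal(n)

# The objective y^T M y = <y y^T, M> is linear in M, so its gradient
# with respect to M (under the trace inner product) is the matrix y y^T.
G = np.outer(y, y)

# Candidate optimum: M* = 0, since <y y^T, M> >= 0 whenever M is PSD.
M_star = np.zeros((n, n))

# Matrix KKT multipliers (one PSD matrix per matrix inequality).
# Stationarity would read: y y^T - Lambda1 + Lambda2 = 0.
Lambda2 = np.zeros((n, n))   # the constraint M - I <= 0 is inactive at M* = 0
Lambda1 = G + Lambda2        # forced by stationarity; equals y y^T, which is PSD

# Dual feasibility: Lambda1 should be PSD (eigenvalues >= 0 up to noise).
assert np.min(np.linalg.eigvalsh(Lambda1)) >= -1e-12

# Complementary slackness with the trace inner product:
# <Lambda1, -M*> = 0 and <Lambda2, M* - I> = 0.
assert abs(np.trace(Lambda1 @ (-M_star))) < 1e-12
assert abs(np.trace(Lambda2 @ (M_star - np.eye(n)))) < 1e-12
print("matrix KKT conditions hold at M* = 0")
```

The numbers work out, but I would like to know whether this is the correct general formulation rather than a coincidence for this particular problem.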
I want a method that ties these two views together, and I want to understand the general case, handled directly in terms of $M$.
I would appreciate any reference that addresses this issue.