Let me expand on my earlier comment and @RodrigodeAzevedo's reference.
Your original problem is
\begin{align*}
\min_{s_1, \dots, s_\ell, \lambda, \mathbf{K}} & \quad \mathbf{y}^T \mathbf{K}^{-1} \mathbf{y}\\
\text{subject to} & \quad \mathbf{K} = s_1 \mathbf{K}_1 + \dotsb + s_\ell \mathbf{K}_\ell + \lambda \mathbf{I}_n\\
& \quad s_1, \dots, s_\ell \geq 0\\
& \quad s_1 + \dotsb + s_\ell = 1\\
& \quad \lambda > 0\\
& \quad \mathbf{K} \succ 0\\
\end{align*}
If you just want to find a numerical solution to the problem, then CVX or CVXPY can do the job: use the matrix_frac function (reference here) as your objective, write out all of the above constraints, and the solver handles the rest. For guidance on using CVX or CVXPY, start with their documentation, which provides plenty of examples. If you need further help with the coding itself, post your question on Stack Overflow.
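For concreteness, here is a minimal CVXPY sketch of the direct matrix_frac formulation. The data (y and the kernel matrices $\mathbf{K}_i$) are synthetic placeholders, and the strict constraint $\lambda > 0$ is approximated by $\lambda \geq \varepsilon$ for a small $\varepsilon$, since solvers cannot handle strict inequalities.

```python
import cvxpy as cp
import numpy as np

# Synthetic placeholder data: n observations and ell candidate kernel matrices.
n, ell = 50, 3
rng = np.random.default_rng(0)
y = rng.standard_normal(n)
Ks = [A @ A.T for A in rng.standard_normal((ell, n, n))]  # symmetric PSD K_i

s = cp.Variable(ell, nonneg=True)   # kernel weights s_1, ..., s_ell >= 0
lam = cp.Variable(nonneg=True)      # regularization weight lambda
K = sum(s[i] * Ks[i] for i in range(ell)) + lam * np.eye(n)  # affine in (s, lam)

eps = 1e-6  # lambda > 0 approximated by lambda >= eps
problem = cp.Problem(
    cp.Minimize(cp.matrix_frac(y, K)),   # y^T K^{-1} y
    [cp.sum(s) == 1, lam >= eps],
)
problem.solve()
print("weights:", s.value, "lambda:", lam.value, "objective:", problem.value)
```

Note that $\mathbf{K} \succ 0$ need not be imposed separately here: assuming the $\mathbf{K}_i$ are positive semidefinite (as kernel matrices are) and $\lambda \geq \varepsilon > 0$, we get $\mathbf{K} \succeq \varepsilon \mathbf{I} \succ 0$ automatically.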
If you want to understand how matrix_frac
works and how it relates to semidefinite programming, then here are the details.
First, write your problem in equivalent form
\begin{align*}
\min_{s_1, \dots, s_\ell, \lambda, \mathbf{K}, t} & \quad t\\
\text{subject to} & \quad \mathbf{y}^T \mathbf{K}^{-1} \mathbf{y} \leq t\\
& \quad \mathbf{K} = s_1 \mathbf{K}_1 + \dotsb + s_\ell \mathbf{K}_\ell + \lambda \mathbf{I}_n\\
& \quad s_1, \dots, s_\ell \geq 0\\
& \quad s_1 + \dotsb + s_\ell = 1\\
& \quad \lambda > 0\\
& \quad \mathbf{K} \succ 0\\
\end{align*}
We want to turn the constraints $\mathbf{y}^T \mathbf{K}^{-1} \mathbf{y} \leq t$ and $\mathbf{K} \succ 0$ into a condition involving some matrix being positive semidefinite. The key is the Schur complement: if you have a symmetric block matrix $X = \begin{pmatrix}
A & B^T\\
B & C
\end{pmatrix}$ and $C \succ 0$, then $X \succeq 0$ if and only if $A - B^T C^{-1} B \succeq 0$.
Applying this with $A = t$ (a $1 \times 1$ block), $B = \mathbf{y}$, and $C = \mathbf{K}$, the Schur complement condition reads $t - \mathbf{y}^T \mathbf{K}^{-1} \mathbf{y} \geq 0$, so the constraints are equivalent to $\begin{pmatrix}
t & \mathbf{y}^T\\
\mathbf{y} & \mathbf{K}
\end{pmatrix} \succeq 0$, $\mathbf{K} \succ 0$.
Now your problem can be expressed as
\begin{align*}
\min_{s_1, \dots, s_\ell, \lambda, \mathbf{K}, t} & \quad t\\
\text{subject to} & \quad \begin{pmatrix}
t & \mathbf{y}^T\\
\mathbf{y} & \mathbf{K}
\end{pmatrix} \succeq 0\\
& \quad \mathbf{K} = s_1 \mathbf{K}_1 + \dotsb + s_\ell \mathbf{K}_\ell + \lambda \mathbf{I}_n\\
& \quad s_1, \dots, s_\ell \geq 0\\
& \quad s_1 + \dotsb + s_\ell = 1\\
& \quad \lambda > 0\\
& \quad \mathbf{K} \succ 0\\
\end{align*}
This is a bona fide semidefinite program: you have a linear objective, linear equality and inequality constraints, and a positive semidefinite constraint on a matrix that is affine in the variables.
Note that for implementation purposes, you don't need to express your problem in this last form manually: matrix_frac performs this reduction for you behind the scenes.
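That said, if you want to see what the explicit SDP form looks like in code, here is one way to write it in CVXPY: declare a PSD matrix variable playing the role of the block matrix $\begin{pmatrix} t & \mathbf{y}^T\\ \mathbf{y} & \mathbf{K} \end{pmatrix}$ and pin its blocks with equality constraints. The data are the same synthetic placeholders as in the earlier sketch, and again $\lambda > 0$ is approximated by $\lambda \geq \varepsilon$.

```python
import cvxpy as cp
import numpy as np

# Same synthetic placeholder data as in the earlier sketch.
n, ell = 50, 3
rng = np.random.default_rng(0)
y = rng.standard_normal(n)
Ks = [A @ A.T for A in rng.standard_normal((ell, n, n))]

s = cp.Variable(ell, nonneg=True)
lam = cp.Variable(nonneg=True)
K = sum(s[i] * Ks[i] for i in range(ell)) + lam * np.eye(n)

# M plays the role of [[t, y^T], [y, K]]; declaring it PSD is the semidefinite
# constraint, and M[0, 0] is the epigraph variable t.
M = cp.Variable((n + 1, n + 1), PSD=True)
eps = 1e-6  # lambda > 0 approximated by lambda >= eps

problem = cp.Problem(
    cp.Minimize(M[0, 0]),
    [
        M[0, 1:] == y,    # top-right block equals y^T (symmetry fixes the left block)
        M[1:, 1:] == K,   # lower-right block equals the kernel combination
        cp.sum(s) == 1,
        lam >= eps,
    ],
)
problem.solve()
```

By the Schur complement argument above, this problem is equivalent to the matrix_frac formulation, so the two sketches should return the same optimal value up to solver tolerance.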