I assume you want to minimize a convex function. Let $\Bbb S_n^+ (\Bbb R)$ denote the set of $n \times n$ symmetric positive definite matrices. Rephrasing, we have the function $f : \Bbb S_n^+ (\Bbb R) \times \Bbb R^n \to \Bbb R_0^+$ defined by
$$f \left( {\rm X}, {\rm y} \right) := {\rm y}^\top {\rm X}^{-1} {\rm y}$$
Introducing a new optimization variable $z \in \Bbb R$ and passing to epigraph form, minimizing $f$ is equivalent to minimizing $z$ subject to ${\rm y}^\top {\rm X}^{-1} {\rm y} \leq z$. Since ${\rm X} \succ {\rm O}_n$, the Schur complement lets us rewrite this inequality constraint as the following linear matrix inequality (LMI)
$$\begin{bmatrix} {\rm X} & {\rm y}\\ {\rm y}^\top & z\end{bmatrix} \succeq {\rm O}_{n+1}$$
which defines a spectrahedron. Thus, your (convex) optimization problem can be rewritten as a (convex) semidefinite program (SDP) that should be easy to solve in CVX.
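To see the Schur-complement equivalence numerically, here is a small NumPy sketch (variable names are my own, not from the question): for a random positive definite ${\rm X}$ and vector ${\rm y}$, the block matrix is PSD exactly when $z \geq {\rm y}^\top {\rm X}^{-1} {\rm y}$, so its smallest eigenvalue hits zero at $z = {\rm y}^\top {\rm X}^{-1} {\rm y}$ and goes negative just below it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Random symmetric positive definite X and a vector y.
A = rng.standard_normal((n, n))
X = A @ A.T + n * np.eye(n)  # SPD by construction
y = rng.standard_normal(n)

f_val = y @ np.linalg.solve(X, y)  # y^T X^{-1} y


def block(z):
    """Assemble the (n+1) x (n+1) block matrix [[X, y], [y^T, z]]."""
    return np.block([[X, y[:, None]], [y[None, :], np.array([[z]])]])


# At z = y^T X^{-1} y the block matrix is PSD with a zero eigenvalue ...
lam_min_at_f = np.linalg.eigvalsh(block(f_val))[0]
# ... while any z below that value violates the LMI.
lam_min_below = np.linalg.eigvalsh(block(f_val - 1e-3))[0]

print(lam_min_at_f)   # ~ 0 up to numerical tolerance
print(lam_min_below)  # strictly negative
```

An off-the-shelf SDP solver (e.g. via CVX's semidefinite mode) exploits exactly this reformulation: it minimizes $z$ over the spectrahedron above rather than working with ${\rm X}^{-1}$ directly.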