Your problem is not convex, so you will need a relaxation or an iterative scheme to solve it (locally). Another issue is that $X$ is not symmetric, so many existing schemes will not apply. The first step is therefore to remove the troublesome terms from the cost. To do so, we define
$$M=X^{-T}SX^{-1},$$
which can be relaxed without conservatism into $M\succeq X^{-T}SX^{-1}$, since we are minimizing and the constraint is tight at the optimum. By the Schur complement (using $S\succ0$), this is equivalent to
$$\begin{bmatrix}M & X^{-T}\\X^{-1} & S^{-1}\end{bmatrix}\succeq0.$$
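As a sanity check, this Schur-complement equivalence is easy to verify numerically; below is a minimal sketch (the random matrices, the $0.1$ margin, and the tolerance are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
S = A @ A.T + n * np.eye(n)                      # S > 0 (positive definite)
X = rng.standard_normal((n, n)) + n * np.eye(n)  # invertible for this seed
Xinv = np.linalg.inv(X)

def is_psd(B, tol=1e-9):
    """Check positive semidefiniteness of the symmetric part of B."""
    return np.min(np.linalg.eigvalsh((B + B.T) / 2)) >= -tol

# M strictly above X^{-T} S X^{-1}: the block matrix should be PSD
M = Xinv.T @ S @ Xinv + 0.1 * np.eye(n)
lmi = np.block([[M, Xinv.T], [Xinv, np.linalg.inv(S)]])
print(is_psd(lmi))      # True

# M slightly below it: the block matrix should fail the test
M_bad = Xinv.T @ S @ Xinv - 0.1 * np.eye(n)
lmi_bad = np.block([[M_bad, Xinv.T], [Xinv, np.linalg.inv(S)]])
print(is_psd(lmi_bad))  # False
```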
Now let $Y=X^{-1}$ to yield the optimization problem
$$\begin{cases}\underset{(M,X,Y)}{\operatorname{min}} &\operatorname{tr}\left(M\right) + \lambda \|X\|_1\\
\text{s.t.}&XY=I,\\& \begin{bmatrix}M & Y^{T}\\Y & S^{-1}\end{bmatrix}\succeq0
\end{cases}.$$
Now the difficulty is how to deal with the nonlinear constraint $XY=I$. One way is an iterative algorithm that updates the values of $X$ and $Y$ through perturbations $\delta_X$ and $\delta_Y$:

1. Pick $X_0,Y_0$ such that $X_0Y_0=I$ and let $i=0$.
2. Solve the optimization problem
$$\begin{array}{rcrl}
(\delta_{X,i},\delta_{Y,i})&=&\operatorname{argmin}_{(M,\delta_X,\delta_Y)} & \operatorname{tr}\left(M\right) + \lambda \|X_i+\delta_X\|_1\\
&&\text{s.t.} & X_i\delta_Y+\delta_X Y_i=0,\\
&&& \begin{bmatrix}M & Y_i^{T}+\delta_Y^T\\Y_i+\delta_Y & S^{-1}\end{bmatrix}\succeq0
\end{array}$$
where the first constraint is the linearization of $(X_i+\delta_X)(Y_i+\delta_Y)=I$ around $(X_i,Y_i)$, obtained by dropping the second-order term $\delta_X\delta_Y$.
3. Let $X_{i+1}=X_i+\delta_{X,i}$ and $Y_{i+1}=Y_i+\delta_{Y,i}$.
4. Evaluate $X_{i+1}Y_{i+1}$ and correct the values if necessary (e.g. set $Y_{i+1}=X_{i+1}^{-1}$).
5. Let $i=i+1$ and go back to step 2.
A stopping criterion can be implemented by terminating when the cost no longer decreases. Additional constraints may also be imposed on the norms of $\delta_X$ and $\delta_Y$ to limit the step size and avoid deviating too much from the manifold $XY=I$.
The 1-norm term can be removed from the cost using lifting variables as done in José C Ferreira's answer.
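For completeness, assuming $\|X\|_1$ denotes the entrywise 1-norm, that lifting introduces a matrix variable $T$ of the same size as $X$ and uses
$$\|X\|_1=\min_{T}\ \sum_{i,j}T_{ij}\quad\text{s.t.}\quad -T_{ij}\le X_{ij}\le T_{ij}\ \text{for all }i,j,$$
which makes the overall objective linear in $(M,T)$ at the price of $n^2$ extra variables and linear constraints.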
Convergence can only be ensured locally, so you may need to restart the algorithm from several initial points.