
I am not very familiar with numerical optimization, so I am studying it. I would like to minimize a certain function subject to the following constraints using a gradient descent algorithm.

\begin{align} & \min\limits_{x}f(x) \\ & \text{subject to }\sum_{i} x_{i}=1 \quad \text{and} \quad x_{i} \geq 0 \end{align}

where $f$ is a non-convex function. Is there a method to do this optimization efficiently?

JH Back
  • Welcome to MSE! Please show your attempts. – Culver Kwan Aug 17 '19 at 07:18
    You can use the projected gradient method. At each iteration, you will have to project onto the probability simplex, but it is possible to do that efficiently. You can also use an accelerated version of the projected gradient method, such as FISTA. – littleO Aug 17 '19 at 07:20
  • What is your $f$? At a point it is easy to parametrize the directions preserving the constraints. – reuns Aug 17 '19 at 08:46
  • Please add more context. Why the SD method? Is the dimension of $x$ so high that using quasi-Newton is expensive? Do you know anything more about $f$ except "non-convex"? – A.Γ. Aug 17 '19 at 08:52
  • @littleO Thanks for the good directions. – JH Back Aug 19 '19 at 02:42
  • @reuns The function $f$ is currently unknown except that it is non-convex. – JH Back Aug 19 '19 at 02:46
  • @A.Γ. The only things I know are that $f$ is non-convex and that the dimension of $x$ is quite high. So I am considering solutions in general terms; the SD method is not necessary. – JH Back Aug 19 '19 at 02:46
  • This post addresses your question – greg Oct 21 '23 at 18:26
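The projected gradient method suggested in the comments can be sketched as follows. The projection onto the probability simplex uses the standard sort-based algorithm; the quadratic $f$ at the bottom is only a toy stand-in for illustration, since the question's $f$ is unknown:

```python
import numpy as np

def project_to_simplex(v):
    """Euclidean projection of v onto {x : sum(x) = 1, x >= 0}
    via the sort-based algorithm (O(n log n))."""
    u = np.sort(v)[::-1]                      # sort in descending order
    css = np.cumsum(u)
    # largest index rho with u[rho] > (css[rho] - 1) / (rho + 1)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def projected_gradient(grad_f, x0, step=0.1, iters=500):
    """Projected gradient descent: take a gradient step, then
    project the iterate back onto the simplex."""
    x = x0
    for _ in range(iters):
        x = project_to_simplex(x - step * grad_f(x))
    return x

# Toy example: f(x) = ||x - c||^2 (convex here, but the same loop
# applies to a non-convex f, without a global-optimality guarantee).
c = np.array([0.5, 0.3, -0.2])
x_star = projected_gradient(lambda x: 2.0 * (x - c), np.ones(3) / 3)
```

For a non-convex $f$ this only finds a stationary point, and the result can depend on the starting point `x0` and the step size.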

1 Answer


You can incorporate the constraints into a Lagrangian function and minimize that function with your gradient-based algorithm. Alternatively, compute the first and/or second derivatives and use bisection or Newton's method.
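A minimal sketch of the Lagrangian idea, implemented here as a quadratic penalty for the equality constraint with clipping for nonnegativity (a common practical variant). Only access to the gradient of $f$ is assumed; the quadratic $f$ in the example is a hypothetical stand-in:

```python
import numpy as np

def penalty_minimize(grad_f, x0, mus=(1.0, 10.0, 100.0, 1000.0),
                     step0=0.1, iters=2000):
    """Minimize f(x) + (mu/2) * (sum(x) - 1)^2 by gradient descent
    for an increasing sequence of penalty weights mu, keeping
    x >= 0 by clipping (projection onto the nonnegative orthant)."""
    x = x0
    for mu in mus:
        step = step0 / mu          # shrink the step as the penalty stiffens
        for _ in range(iters):
            # gradient of the penalized objective
            g = grad_f(x) + mu * (x.sum() - 1.0)
            x = np.maximum(x - step * g, 0.0)
    return x

# Toy check: f(x) = ||x - c||^2; the constrained minimizer is the
# projection of c onto the simplex, which is [0.6, 0.4, 0.0] here.
c = np.array([0.5, 0.3, -0.2])
x_hat = penalty_minimize(lambda x: 2.0 * (x - c), np.ones(3) / 3)
```

The constraint $\sum_i x_i = 1$ is only satisfied up to an error of order $1/\mu$, so the penalty weight must grow for an accurate solution; an augmented Lagrangian (multiplier update) avoids driving $\mu \to \infty$.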