I am trying to solve the following non-convex problem: \begin{equation} \begin{aligned} \mathbf{P1}: &\max \limits_{x, y} U = -xy\log_2(y)-x^3y^6- xy^2 e^{xy^2},\\ {\rm{s}}.{\rm{t}}.~~& (a) 0 < x \leq X^{max}, \\ & (b) 0 < y \leq Y^{max}, \\ & (c) 0 < xy^2 \leq W^{max}, \\ \end{aligned} \end{equation} where $x,y \in \mathbb{R}$, and $X^{max}$, $Y^{max}$, and $W^{max}$ are the upper bounds on $x$, $y$, and $xy^2$, respectively.
The following conclusion may be true. When $y$ is fixed, $\mathbf{P1}$ can be converted into a classical concave optimization problem in terms of $x$, which is denoted as $\mathbf{P2}$. Similarly, when $x$ is fixed, $\mathbf{P1}$ can also be converted into a classical concave optimization problem in terms of $y$, which is denoted as $\mathbf{P3}$.
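For the $x$-direction this conclusion can be checked directly (a sanity check, not a full analysis of $\mathbf{P3}$): differentiating $U$ twice with respect to $x$ gives
\begin{equation}
\frac{\partial^2 U}{\partial x^2} = -6xy^6 - \left(2y^4 + xy^6\right)e^{xy^2} < 0 \quad \text{for } x, y > 0,
\end{equation}
so $U$ is strictly concave in $x$ on the feasible set, and $\mathbf{P2}$ is indeed a concave maximization over a box constraint. The $y$-direction requires a similar, but lengthier, check of $\frac{\partial^2 U}{\partial y^2}$.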
Based on the conclusion, there are two methods.
(1) The first method:
A coordinate descent method is adopted to solve $\mathbf{P2}$ and $\mathbf{P3}$ alternately until the algorithm converges.
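The alternating scheme can be sketched numerically as follows. This is a minimal illustration, not the author's implementation: the bounds `X_MAX`, `Y_MAX`, `W_MAX` are made-up values, and a simple ternary search stands in for a generic 1-D concave solver (valid on each slice only if the concavity conclusion above holds, so that each slice is unimodal).

```python
import math

# Hypothetical problem data (illustrative values only)
X_MAX, Y_MAX, W_MAX = 2.0, 2.0, 1.0
EPS = 1e-9  # keep the strictly positive variables away from 0

def U(x, y):
    """Objective of P1."""
    return -x * y * math.log2(y) - x**3 * y**6 - x * y**2 * math.exp(x * y**2)

def argmax_1d(f, lo, hi, iters=100):
    """Ternary search for the maximizer of a unimodal f on [lo, hi]."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(m1) < f(m2):
            lo = m1
        else:
            hi = m2
    return 0.5 * (lo + hi)

def coordinate_descent(x0=1.0, y0=1.0, tol=1e-8, max_iter=100):
    """Alternately solve P2 (over x) and P3 (over y) until U stops improving."""
    x, y = x0, y0
    for _ in range(max_iter):
        u_prev = U(x, y)
        # P2: maximize over x with y fixed; constraint (c) caps x at W_MAX / y^2
        x = argmax_1d(lambda t: U(t, y), EPS, min(X_MAX, W_MAX / y**2))
        # P3: maximize over y with x fixed; constraint (c) caps y at sqrt(W_MAX / x)
        y = argmax_1d(lambda t: U(x, t), EPS, min(Y_MAX, math.sqrt(W_MAX / x)))
        if abs(U(x, y) - u_prev) < tol:
            break
    return x, y, U(x, y)
```

Since each inner step can only increase $U$ (up to the solver tolerance) and $U$ is bounded above on the compact feasible set, the sequence of objective values is monotone and convergent, which is the usual convergence argument for such alternating schemes.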
(2) The second method:
When $y$ is fixed, we can set $\frac{\partial{U}}{\partial{ x }}=0$ and obtain an implicit expression, i.e., $- y\log_2(y) - 3x^2 y^6 - y^2 e^{xy^2} - xy^4 e^{xy^2} = 0$. Based on the implicit expression, I can obtain $\hat{x}^{\star} = \hat{x}^{\star}(y)$. Combining the constraints (a) and (c) in $\mathbf{P1}$ with $\hat{x}^{\star}(y)$, we can obtain $x^{\star}(y)=\min \left\{\hat{x}^{\star}(y), X^{max}, \frac{W^{max}}{y^2} \right\}$. Substituting $x^{\star}(y)$ into $\mathbf{P1}$ eliminates $x$ from the problem, which becomes \begin{equation} \begin{aligned} \mathbf{P4}: &\max \limits_{y} U = -x^{\star}(y) y\log_2(y)-x^{\star}(y)^3y^6- x^{\star}(y)y^2 e^{x^{\star}(y)y^2},\\ {\rm{s}}.{\rm{t}}.~~& (a) 0 < y \leq Y^{max}, \\ & (b) 0 < x^{\star}(y) y^2 \leq W^{max}. \\ \end{aligned} \end{equation} I can use a greedy algorithm to solve $\mathbf{P4}$ and obtain $y^{\star}$. Via $x^{\star}(y)=\min \left\{\hat{x}^{\star}(y), X^{max}, \frac{W^{max}}{y^2} \right\}$, we can then obtain the corresponding $x^{\star}(y^{\star})$.
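The second method can also be sketched numerically. Again the bounds `X_MAX`, `Y_MAX`, `W_MAX` are hypothetical values, bisection recovers $\hat{x}^{\star}(y)$ from the implicit expression (valid because $\frac{\partial U}{\partial x}$ is decreasing in $x$ when $U$ is concave in $x$), and a plain grid search over $y$ stands in for the unspecified greedy algorithm:

```python
import math

X_MAX, Y_MAX, W_MAX = 2.0, 2.0, 1.0   # hypothetical bounds
EPS = 1e-9

def U(x, y):
    """Objective of P1."""
    return -x * y * math.log2(y) - x**3 * y**6 - x * y**2 * math.exp(x * y**2)

def dU_dx(x, y):
    """Left-hand side of the implicit stationarity expression dU/dx = 0."""
    e = math.exp(x * y**2)
    return -y * math.log2(y) - 3 * x**2 * y**6 - y**2 * e - x * y**4 * e

def x_star(y):
    """x*(y) = min{ xhat*(y), X_MAX, W_MAX / y^2 }.

    dU/dx is decreasing in x (U is concave in x), so a sign change on
    [EPS, cap] brackets the unique interior root xhat*(y)."""
    hi = min(X_MAX, W_MAX / y**2)
    if dU_dx(EPS, y) <= 0:      # slice decreasing: optimum at the lower end
        return EPS
    if dU_dx(hi, y) >= 0:       # still increasing at the cap: boundary optimum
        return hi
    lo = EPS
    for _ in range(100):        # bisection on the bracketed root
        mid = 0.5 * (lo + hi)
        if dU_dx(mid, y) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# P4: one-dimensional search over y (a simple grid stands in for the greedy
# search; constraint (b) of P4 is enforced inside x_star by the W_MAX cap)
best_y = max((EPS + k * (Y_MAX - EPS) / 10000 for k in range(10001)),
             key=lambda t: U(x_star(t), t))
best_x = x_star(best_y)
```

Note that because $x^{\star}(y) \leq \frac{W^{max}}{y^2}$ by construction, constraint (b) of $\mathbf{P4}$ is automatically satisfied, so $\mathbf{P4}$ really is a one-dimensional search over $0 < y \leq Y^{max}$.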
For the above two methods, I have two questions. First, I am not sure whether these two methods are correct. Second, I am not sure whether the first method will converge.
Any help will be appreciated.
Why do you need a greedy algorithm?
With $\frac{\partial{U}}{\partial{ x }}=0$, I can only obtain an implicit expression relating $x$ and $y$. The implicit expression makes it impossible to obtain a closed-form solution for $x$. Thus, the concavity of the objective function and the convexity of the constraints in $\mathbf{P4}$ cannot be analyzed.
– yenfy Mar 04 '20 at 03:26