A non-convex optimization problem is one where either the objective function is non-convex in a minimization problem (or non-concave in a maximization problem) or where the feasible region is not convex.
Questions tagged [non-convex-optimization]
656 questions
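The definition above can be illustrated with a minimal sketch (a hypothetical toy, not tied to any question below): on a non-convex objective, the same gradient descent routine converges to different minimizers from different starting points.

```python
# Toy non-convex function f(x) = x^4 - x^2: it has two global minima at
# x = +/- 1/sqrt(2) and a local maximum at x = 0, so it is not convex.

def grad_f(x):
    return 4.0 * x**3 - 2.0 * x  # f'(x)

def gradient_descent(x0, lr=0.05, steps=2000):
    x = x0
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

# Different starting points reach different minimizers:
left = gradient_descent(-1.0)   # ~ -0.7071
right = gradient_descent(+1.0)  # ~ +0.7071
```

This start-dependence is exactly what convex problems rule out, and it is the common thread in the questions collected under this tag.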
1
vote
1 answer
Nonconvex programming through DCA or CCCP
I have an optimization problem in the form of:
$\min_x \quad u^Tx \\ \text{s.t.} \quad (x-\textbf{1/2})^2 - \vert x-\textbf{1/2}\vert \leq \textbf{-1/4}$.
Since the constraint of this problem is not convex, standard convex programming methods cannot…
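The DCA/CCCP mechanics can be sketched on a simpler unconstrained DC program (a hypothetical toy, not the problem above): split $f = g - h$ with $g, h$ convex, linearize the concave part $-h$ at the current iterate, and solve the resulting convex surrogate.

```python
# CCCP/DCA sketch on a toy DC objective (illustrative, not the asker's problem):
# f(x) = x^4 - 2x^2 = g(x) - h(x), with g(x) = x^4 and h(x) = 2x^2 both convex.
# Each iteration replaces h by its linearization h(x_k) + h'(x_k)(x - x_k) and
# minimizes the convex surrogate g(x) - h'(x_k) * x (constants dropped).

def dca_step(x_k):
    # Surrogate: minimize x^4 - 4*x_k*x  ->  4x^3 = 4x_k  ->  x = x_k^(1/3)
    return x_k ** (1.0 / 3.0) if x_k >= 0 else -((-x_k) ** (1.0 / 3.0))

x = 2.0
for _ in range(30):
    x = dca_step(x)
# x converges to 1.0, one of the two global minimizers of f (the other is -1.0)
```

Each surrogate upper-bounds $f$ and is tight at $x_k$, so the objective value is monotonically non-increasing; in general DCA only guarantees convergence to a critical point, not a global minimizer.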
1
vote
0 answers
I'm not sure if the coordinate descent method for solving this non-convex problem will converge
I am trying to solve the following non-convex problem.
\begin{equation}
\begin{aligned}
\mathbf{P1}: &\max \limits_{x, y} U = -xy\log_2(y)-x^3y^6- xy^2 e^{xy^2},\\
\text{s.t.}~~& (a)\ 0 < x \leq X^{\max}, \\
& (b)\ 0 < y \leq Y^{\max}, \\
& (c) 0 < xy^2…
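The generic shape of the method being asked about can be sketched on a hypothetical smooth toy objective (not $\mathbf{P1}$): hold one variable fixed, minimize exactly over the other, and alternate. On non-convex problems this at best reaches a coordinate-wise minimum and need not find a global one.

```python
# Coordinate descent sketch on a hypothetical smooth toy objective (not P1):
# alternately minimize over x with y fixed, then over y with x fixed.
# Each 1-D subproblem is solved by ternary search, which assumes the slice
# is unimodal on the chosen box -- true here, but not guaranteed in general.

def ternary_min(f, lo, hi, iters=200):
    """Approximate minimizer of a unimodal f on [lo, hi]."""
    for _ in range(iters):
        a = lo + (hi - lo) / 3.0
        b = hi - (hi - lo) / 3.0
        if f(a) < f(b):
            hi = b
        else:
            lo = a
    return 0.5 * (lo + hi)

def f(x, y):  # toy non-convex objective with a coupling term
    return (x * x - 1.0) ** 2 + (y * y - 1.0) ** 2 + 0.5 * x * y

x, y = 0.9, 0.8
for _ in range(50):
    x = ternary_min(lambda t: f(t, y), 0.2, 2.0)
    y = ternary_min(lambda t: f(x, t), 0.2, 2.0)
# (x, y) settles at a coordinate-wise minimum near (0.935, 0.935)
```

The iterate sequence here is monotone in $f$, which is the usual starting point for a convergence argument; whether the limit is stationary for the joint problem depends on smoothness and on the structure of the constraints.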

yenfy
- 11
0
votes
1 answer
Question about expectation of v_t and the true second moment g_t^2 in the Adam algorithm
The paper is ADAM: A METHOD FOR STOCHASTIC OPTIMIZATION
In the proof, how can we justify the jump from (2) to (3)? There seems to be a large gap, since we should consider $g_{t-1}$, $g_{t-2}$, etc., and putting $E[g_{t}^2]$ in (4) is very…
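The gap the question points at is usually closed as in the paper's own derivation (a sketch; $\zeta$ is the error term that absorbs the non-stationarity of the second moment):

```latex
% Unroll v_t = \beta_2 v_{t-1} + (1-\beta_2) g_t^2 with v_0 = 0, then take
% expectations; \zeta collects the error from E[g_i^2] \ne E[g_t^2] for i < t.
\begin{aligned}
v_t &= (1-\beta_2) \sum_{i=1}^{t} \beta_2^{\,t-i} g_i^2, \\
\mathbb{E}[v_t] &= (1-\beta_2) \sum_{i=1}^{t} \beta_2^{\,t-i}\, \mathbb{E}[g_i^2]
  = \mathbb{E}[g_t^2]\,(1-\beta_2) \sum_{i=1}^{t} \beta_2^{\,t-i} + \zeta
  = \mathbb{E}[g_t^2]\,\bigl(1 - \beta_2^{\,t}\bigr) + \zeta .
\end{aligned}
```

If the second moment is stationary ($\mathbb{E}[g_i^2] = \mathbb{E}[g_t^2]$ for all $i \le t$), then $\zeta = 0$, which is exactly why dividing $v_t$ by $1-\beta_2^t$ yields the bias-corrected estimate $\hat{v}_t$.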

skytree
- 101
0
votes
0 answers
How to ensure the critical point of a nonconvex function is isolated?
Let $f: \mathbb{R}^n \to \mathbb{R}$ be a differentiable, nonconvex function. The gradient descent iteration is given by
$$x^{k+1} = x^{k} - t_k \nabla f(x^k)$$
To ensure $\lim_{k \to \infty} x^k$ exists, one condition for $f$ is that the critical…
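A standard example of why such a condition is needed (an illustration only, not an answer to the question): the critical set of a smooth nonconvex $f$ can be a continuum.

```latex
% f has the whole unit circle as its set of global minimizers:
f(x) = \bigl(\|x\|_2^2 - 1\bigr)^2, \quad x \in \mathbb{R}^2,
\qquad \nabla f(x) = 4\bigl(\|x\|_2^2 - 1\bigr)\,x .
```

Every point with $\|x\|_2 = 1$ is a critical point (in fact a global minimizer), so no minimizer is isolated. Assumptions such as isolated critical points, or alternatively a Łojasiewicz-type inequality, are commonly invoked to guarantee that the whole sequence $x^k$ converges to a single point rather than merely accumulating on the critical set.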

Ronglong Fang
- 1
0
votes
1 answer
Simple non-convex optimization
I am trying to solve the following optimization problem. I'd appreciate any tips or directions.
$ \text{minimize } |x|^2 + |y|^2$
$ \text{subject to } |x-y|^2 \geq 1$
where $|\cdot|$ is the absolute value, and $x$ and $y$ are two complex…
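For what it's worth, this particular instance has a closed-form answer via the parallelogram identity (a sketch, treating $x, y$ as complex scalars):

```latex
% Parallelogram identity, then the constraint |x - y|^2 >= 1:
|x|^2 + |y|^2 \;=\; \tfrac{1}{2}\,|x-y|^2 + \tfrac{1}{2}\,|x+y|^2
\;\ge\; \tfrac{1}{2}\,|x-y|^2 \;\ge\; \tfrac{1}{2},
```

with equality iff $y = -x$ and $|x - y| = 1$, e.g. $x = \tfrac12$, $y = -\tfrac12$ (unique up to a common phase), so the optimal value is $\tfrac12$.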

dsp_guy2020
- 73
0
votes
0 answers
Non-convex optimization problem
I am trying to solve the following non-convex optimization problem. I'd appreciate any tips or directions.
$ \text{minimize } \|C_1\|_2^2 + \|C_2\|_2^2$
$ \text{subject to } \|H(C_1 - C_2)\|_2^2 \geq 1$
where $\|\cdot\|_2$ is the $\ell_2$ norm, $C_1$ and…

dsp_guy2020
- 73
0
votes
1 answer
Non-convex optimization update: analytic proof
I need help with the proof below:
$$A = \underset{A}{\text{argmin}}(\frac{1}{2} ||X_{(1)} - A(C \odot B)^T||^2_F + ||\Lambda \boxdot (A - \tilde{A})||_F^2 + \frac{\rho}{2} ||A - \tilde{A}||_F^2) \\ = (X_{(1)}(C \odot B) + \rho \tilde{A} - \Lambda )…

Mour_Ka
- 316