Questions tagged [nonlinear-optimization]

A non-linear optimization problem includes an objective function (to be minimized or maximized) and some number of equality and/or inequality constraints, where the objective or some of the constraints are non-linear. Use this tag for questions about the theory of solving such problems or about solving particular problems.

Usually, non-linear optimization problems are much harder to solve than linear ones.

2947 questions
8
votes
2 answers

Minimizing with Lagrange multipliers and Newton-Raphson

I am writing a program minimizing a real-valued non-linear function of around 90 real variables subject to around 30 non-linear constraints. I found a handy explanation in CERN's Data Analysis BriefBook. I've implemented it and it works, but I am not…
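The Lagrange-multiplier approach amounts to running Newton-Raphson on the stationarity conditions of the Lagrangian; a minimal sketch on a hypothetical equality-constrained toy problem (the BriefBook's recipe may differ in details):

```python
import numpy as np

# Toy problem (made up): minimize f(x, y) = x^2 + y^2  subject to  x + y = 1.
# Newton-Raphson on grad L = 0 for the Lagrangian L(x, y, lam) = f + lam * c.

def kkt_residual(z):
    x, y, lam = z
    return np.array([
        2 * x + lam,    # dL/dx
        2 * y + lam,    # dL/dy
        x + y - 1.0,    # constraint c(x, y) = 0
    ])

def kkt_jacobian(z):
    # Jacobian of the residual: block form [[H, A^T], [A, 0]]
    return np.array([
        [2.0, 0.0, 1.0],
        [0.0, 2.0, 1.0],
        [1.0, 1.0, 0.0],
    ])

z = np.zeros(3)                      # start at x = y = lam = 0
for _ in range(20):
    step = np.linalg.solve(kkt_jacobian(z), -kkt_residual(z))
    z = z + step
    if np.linalg.norm(step) < 1e-12:
        break

print(z)  # (0.5, 0.5, -1.0): the constrained minimizer and its multiplier
```

Because this toy problem is linear-quadratic, Newton converges in a single step; for genuinely non-linear $f$ and $c$ the same loop iterates, and the Jacobian would be re-evaluated at each $z$.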
8
votes
2 answers

Newton optimization algorithm with non-positive definite Hessian

In the Newton optimization algorithm for finding the local minimum $x^*$ of a non-linear function $f(x)$ with iteration sequence $x_0 \rightarrow x_1 \rightarrow x_2 \rightarrow \cdots \rightarrow x^*$, all $\nabla^2 f(x_k)$ should be positive definite, otherwise the search…
ManiAm
  • 733
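A common remedy when $\nabla^2 f(x_k)$ is indefinite is to shift it by a multiple of the identity until a Cholesky factorization succeeds; a sketch of this Hessian-modification idea (all numbers are made up):

```python
import numpy as np

# Hessian modification: try Cholesky on H + tau * I, growing tau until it
# succeeds, then use the factor to compute the Newton-like direction.

def modified_newton_direction(grad, hess, tau0=1e-3, max_tries=60):
    tau = 0.0
    for _ in range(max_tries):
        try:
            L = np.linalg.cholesky(hess + tau * np.eye(len(grad)))
            # Solve (H + tau I) d = -grad via the Cholesky factor
            y = np.linalg.solve(L, -grad)
            return np.linalg.solve(L.T, y)
        except np.linalg.LinAlgError:
            tau = max(2.0 * tau, tau0)   # not PD yet: increase the shift
    raise RuntimeError("could not make the Hessian positive definite")

# Indefinite example: eigenvalues 1 and -3, so plain Newton could go uphill.
g = np.array([1.0, -2.0])
H = np.array([[1.0, 0.0], [0.0, -3.0]])
d = modified_newton_direction(g, H)
print(g @ d)  # negative: the modified direction is still a descent direction
```

Large `tau` pushes the step toward steepest descent, small `tau` toward the pure Newton step, which is the usual trade-off for this fix.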
7
votes
1 answer

Solving an overdetermined system of nonlinear equations

I'm wondering what the "best" way to approach solving a system of the following form would be: $A_1X + Be^{CY} = A_2$, $A_3X + Be^{CY} = A_4$, $A_5X + Be^{CY} = A_6$, etc. EDIT: The coefficients $A_i, B, C$ are all real numbers; $X$ and $Y$ are the unknowns. With…
MattyZ
  • 2,313
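One observation for systems of this exact shape: $Be^{CY}$ is the same unknown constant in every equation, so the system is linear in $X$ and $t = Be^{CY}$ and can be attacked with linear least squares. A sketch with invented coefficients:

```python
import numpy as np

# Substitute t = B * e^(C * Y): each equation becomes A_odd * X + t = A_even,
# which is linear in (X, t). Made-up data below is consistent with X=2, t=1.

B, C = 2.0, 0.5
A = np.array([[1.0, 3.0], [2.0, 5.0], [4.0, 9.0]])  # rows: (A_odd, A_even)

# Least-squares solve of  A_odd * X + t = A_even  for [X, t]
M = np.column_stack([A[:, 0], np.ones(len(A))])
(X, t), *_ = np.linalg.lstsq(M, A[:, 1], rcond=None)

Y = np.log(t / B) / C   # recover Y; only valid when t / B > 0
print(X, Y)
```

If the fitted $t$ has the wrong sign relative to $B$, no real $Y$ exists, which is worth checking before taking the logarithm.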
5
votes
2 answers

How to introduce Levenberg-Marquardt?

How would you introduce the Levenberg-Marquardt algorithm to someone who understands the concepts of minimisation and derivatives, using intuition rather than equations if possible? For instance, a way to explain Newton, Gauss-Newton or…
Vadi
  • 155
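One intuition: Levenberg-Marquardt blends Gauss-Newton (fast near the solution) with gradient descent (safe far from it) through a damping parameter that grows after bad steps and shrinks after good ones. A minimal sketch fitting a hypothetical exponential model to noiseless synthetic data:

```python
import numpy as np

# Fit y = a * exp(b * x) by Levenberg-Marquardt. The damping lam interpolates
# between Gauss-Newton (lam -> 0) and scaled gradient descent (lam large).

def residuals(p, x, y):
    a, b = p
    return a * np.exp(b * x) - y

def jacobian(p, x):
    a, b = p
    e = np.exp(b * x)
    return np.column_stack([e, a * x * e])   # d r / d a, d r / d b

x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * x)                   # data generated with (2, -1.5)

p = np.array([1.0, 0.0])                     # deliberately poor start
lam = 1e-3
for _ in range(100):
    r = residuals(p, x, y)
    J = jacobian(p, x)
    # Damped normal equations: (J^T J + lam I) step = -J^T r
    step = np.linalg.solve(J.T @ J + lam * np.eye(2), -J.T @ r)
    if np.linalg.norm(residuals(p + step, x, y)) < np.linalg.norm(r):
        p, lam = p + step, lam * 0.5         # good step: trust Gauss-Newton
    else:
        lam *= 10.0                          # bad step: back toward descent
print(p)  # close to (2.0, -1.5)
```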
4
votes
1 answer

How to maximize $\sum\limits_{i=1}^n u_i\ln(x_i)$?

How to maximize this? $$ \sum\limits_{i=1}^n u_i\ln(x_i), $$ where $u_i,x_i$ are real numbers, $n$ is a positive integer, $0 \leq u_i \leq 1, 0 < x_i < 1, \sum\limits_{i=1}^n u_i = 1, \sum\limits_{i=1}^n x_i = 1$, and only $x_i$ are variables, others…
Daniel
  • 569
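Assuming all $u_i > 0$ (otherwise the supremum need not be attained on the open feasible set), one standard route is a single Lagrange multiplier for the equality constraint:

```latex
L(x,\lambda) \;=\; \sum_{i=1}^n u_i \ln(x_i) \;-\; \lambda\Big(\sum_{i=1}^n x_i - 1\Big),
\qquad
\frac{\partial L}{\partial x_i} \;=\; \frac{u_i}{x_i} - \lambda \;=\; 0
\;\Longrightarrow\; x_i = \frac{u_i}{\lambda}.
```

Summing the constraint gives $1 = \sum_i x_i = \tfrac{1}{\lambda}\sum_i u_i = \tfrac{1}{\lambda}$, so $\lambda = 1$ and the maximizer is $x_i = u_i$; concavity of $\ln$ makes this stationary point the global maximum (this is Gibbs' inequality in disguise).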
3
votes
1 answer

Why is a tangent cone always closed?

I am taking a course on nonlinear optimization and came across this definition of a tangent cone in my lectures: Let $\emptyset \neq M \subseteq \mathbb{R}^n$ and $x \in M$. Then the tangential cone of $M$ in $x$ is defined as: $T(M,x) = \{d \in…
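Assuming the truncated definition is the usual sequential one, $T(M,x) = \{d : \exists\, t_k \downarrow 0,\ d^k \to d,\ x + t_k d^k \in M\}$, closedness follows from a diagonal argument; a sketch:

```latex
Take $d^j \in T(M,x)$ with $d^j \to d$. By definition of $T(M,x)$, for each $j$
there exist $t_j \in (0, \tfrac{1}{j})$ and $\tilde d^{\,j}$ with
$\|\tilde d^{\,j} - d^j\| < \tfrac{1}{j}$ and $x + t_j \tilde d^{\,j} \in M$.
Then $t_j \downarrow 0$ and
$\|\tilde d^{\,j} - d\| \le \|\tilde d^{\,j} - d^j\| + \|d^j - d\| \to 0$,
so the sequences $(t_j)$ and $(\tilde d^{\,j})$ witness $d \in T(M,x)$.
Hence $T(M,x)$ contains all its limit points and is closed.
```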
3
votes
2 answers

Upper bound for Maximization problem

I have an optimization problem of the form Max $x_1+x_2+x_3+\cdots+x_n$ subject to $x_0^2+x_1^2+x_2^2+\cdots+x_n^2+x_{12}^2+x_{13}^2+x_{14}^2+ \cdots+x_{1n}^2+x_{23}^2 + \cdots +x_{2n}^2+ \cdots +x_{n…
Kumar
  • 2,259
3
votes
0 answers

Optimal control. A one-dimensional dynamic process is governed by a difference equation

Optimal control. A one-dimensional dynamic process is governed by a difference equation $x(k + 1) = f(x(k), u(k), k)$ with initial condition $x(0) = x_0$. In this equation the value $x(k)$ is called the state at step $k$ and $u(k)$ is the control at…
Pol
  • 369
  • 3
  • 11
3
votes
1 answer

Comparison of nonlinear system solvers?

I am dealing with nonlinear systems of equations that I am trying to solve numerically. These sets of equations derive from structural mechanics involving strong nonlinearities, like contact. The size of these problems is on the order of ~10 to ~100…
Markus
  • 175
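For systems of that size, one practical baseline to compare against (a sketch on a made-up smooth system, not a recommendation for contact problems specifically) is SciPy's `root`, which wraps both a dense Powell-hybrid Newton method and a Jacobian-free Newton-Krylov method:

```python
import numpy as np
from scipy.optimize import root

def F(x):
    # Hypothetical mildly coupled smooth system: x_i + 0.1 * x_{i-1}^2 = 1
    return x + 0.1 * np.roll(x, 1) ** 2 - 1.0

x0 = np.zeros(20)
results = {}
for method in ("hybr", "krylov"):        # dense Newton vs. Jacobian-free
    sol = root(F, x0, method=method)
    results[method] = np.linalg.norm(F(sol.x))
    print(method, sol.success, results[method])
```

The dense `hybr` solver forms and factorizes a Jacobian, which is fine up to a few hundred unknowns; `krylov` avoids that and scales further, at the cost of weaker robustness on non-smooth problems such as contact.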
3
votes
2 answers

Algorithm for GRG2 method of solving non-linear least square

I have been looking for quite a while for an algorithm for the GRG2 method, either in a .NET assembly or as an algorithm I could program myself, but I can't find a decent representation of the algorithm to work with. Does anyone here have any resources…
user26976
3
votes
0 answers

Solving an inverse squared sum

How would I go about solving this equation for $x$? $$\sum_i\frac{a_i}{(x+b_i)^2}=C$$ where $\mathbf{a}$ and $\mathbf{b}$ are vectors, $C$ is a constant, and $x$ is a single number. It's for an optimisation routine, so if there is no solution getting…
Timmmm
  • 244
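If all $a_i > 0$, the left-hand side is strictly decreasing in $x$ on $x > -\min_i b_i$, falling from $+\infty$ to $0$, so for any $C > 0$ there is exactly one root in that interval and a bracketing method suffices; a sketch with made-up numbers:

```python
import numpy as np

# Solve sum_i a_i / (x + b_i)^2 = C on x > -min(b), where f is strictly
# decreasing from +inf to 0, by bracketing and bisection.

a = np.array([1.0, 2.0, 0.5])    # made-up positive weights
b = np.array([1.0, 2.0, 3.0])
C = 0.4

def f(x):
    return np.sum(a / (x + b) ** 2) - C

lo = -b.min() + 1e-9             # just right of the leftmost pole: f(lo) >> 0
hi = 1.0
while f(hi) > 0:                 # expand until f turns negative
    hi *= 2.0
for _ in range(200):             # bisection on the bracket [lo, hi]
    mid = 0.5 * (lo + hi)
    if f(mid) > 0:
        lo = mid
    else:
        hi = mid
x = 0.5 * (lo + hi)
print(x, f(x))
```

With mixed-sign $a_i$ the monotonicity argument breaks down and there may be zero or several roots per interval between poles, so the bracketing would then have to be done per interval.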
2
votes
1 answer

Do we need steepest descent methods, when minimizing quadratic functions?

I'm studying nonlinear programming and steepest descent methods for quadratic multivariable functions. I have a question, highlighted in the following picture: My question is: if we can explicitly solve for the minimizing point $\textbf{x}^*$ of…
jjepsuomi
  • 8,619
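For a strictly convex quadratic $f(x) = \tfrac12 x^TQx - b^Tx$ the minimizer is indeed just the solution of $Qx = b$, so a direct solve wins whenever $Q$ can be formed and factorized; iterative descent methods are for when it cannot. A sketch comparing the two on a tiny made-up problem:

```python
import numpy as np

# Minimizing f(x) = 0.5 x^T Q x - b^T x: direct solve vs. steepest descent
# with exact line search (for quadratics, alpha = r^T r / (r^T Q r)).

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
Q = A @ A.T + 5.0 * np.eye(5)    # symmetric positive definite
b = rng.standard_normal(5)

x_direct = np.linalg.solve(Q, b) # one factorization, done

x = np.zeros(5)
for _ in range(500):
    r = b - Q @ x                # residual = negative gradient of f
    if r @ r < 1e-30:
        break
    alpha = (r @ r) / (r @ Q @ r)
    x = x + alpha * r
print(np.linalg.norm(x - x_direct))  # tiny, but needed many iterations
```

The iteration count grows with the condition number of $Q$, which is exactly why steepest descent (or better, conjugate gradients) is reserved for problems too large to solve directly.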
2
votes
0 answers

Proof of KKT conditions in Nocedal and Wright

I'm reading the proof of the standard result on KKT conditions in Nocedal and Wright's textbook (2nd edition). In definition 12.2, they define the tangent cone of the feasible region $\Omega$ at $x$, denoted by $T_\Omega(x)$. Generically, this cone…
Thomas
  • 199
2
votes
2 answers

How to draw a fixed length curve?

Is it possible to draw a curve with some specified length between two points? I'm considering damped sines (as in WolframAlpha) or Bézier curves.
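One numerical approach (a sketch, assuming a one-parameter family such as $y = A\sin(\pi x)$ between $(0,0)$ and $(1,0)$ is acceptable): since arc length grows monotonically with the amplitude, bisect on $A$ until the length matches the target:

```python
import numpy as np

# Pick the amplitude A of y = A*sin(pi*x) on [0, 1] so that the curve's
# arc length equals a target L >= 1 (L = 1 is the straight segment).

def arc_length(A, n=2000):
    x = np.linspace(0.0, 1.0, n)
    y = A * np.sin(np.pi * x)
    return np.sum(np.hypot(np.diff(x), np.diff(y)))   # polyline length

target = 1.5
lo, hi = 0.0, 1.0
while arc_length(hi) < target:   # expand bracket; length grows with |A|
    hi *= 2.0
for _ in range(60):              # bisection on the amplitude
    mid = 0.5 * (lo + hi)
    if arc_length(mid) < target:
        lo = mid
    else:
        hi = mid
A = 0.5 * (lo + hi)
print(A, arc_length(A))
```

The same bisection works for any monotone one-parameter family, e.g. the control-point offset of a symmetric quadratic Bézier curve.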
2
votes
1 answer

A norm in a non-linear optimization problem

Let $A \in \mathbb{R}^{m \times n}$ with $m \ge n$ and $b \in \mathbb{R}^m$. Then $x^* \in \mathbb{R}^n$ solves the problem $$\min_{x \in \mathbb{R}^n} \|Ax-b\|_2$$ iff $x^*$ is a solution of the linear equation system $$A^TAx =…
3nondatur
  • 4,178
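A quick numerical check of the stated equivalence, comparing a least-squares solve of $\min_x \|Ax-b\|_2$ against a direct solve of the normal equations $A^TAx = A^Tb$ on random data:

```python
import numpy as np

# For full-column-rank A, the least-squares minimizer and the solution of
# the normal equations A^T A x = A^T b coincide.

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))   # m >= n; full column rank almost surely
b = rng.standard_normal(8)

x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)     # minimizes ||Ax - b||_2
x_normal = np.linalg.solve(A.T @ A, A.T @ b)        # normal equations
print(np.linalg.norm(x_lstsq - x_normal))           # agreement to rounding
```

In practice `lstsq` (QR/SVD based) is preferred numerically, since forming $A^TA$ squares the condition number.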