
Let $A\in \mathbb{R}^{n\times n}$ be a symmetric matrix, and let $x\in \mathbb{R}^{n\times 1}$ be an unknown vector.

The problem is $$\min \limits_x x^TAx.$$

Since $A$ is an input, I am not sure whether

1. it is positive semidefinite (the objective is convex);

2. it is negative semidefinite (the objective is concave);

3. or it is indefinite (the objective is neither convex nor concave).

Case 1 is simple. In Case 2 the objective is unbounded below, so people may say the problem is not well defined. Could anyone tell me how I should handle Case 3? Is its infimum also $-\infty$?

Suppose I change the problem to

$$\min \limits_x x^TAx$$

where $x\in C$ and $C$ is a convex set, e.g. $\{x \mid \sum_{i=1}^n x_i=1\}$ or $\{x \mid |x_i|\le 1 \text{ for all } i\}$. Is there any way to find $\arg\min\limits_x x^TAx$?
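As a concrete illustration of the three cases, here is a minimal numerical sketch (numpy; the toy matrices are my own examples, not from the question):

```python
import numpy as np

def q(A, x):
    """Evaluate the quadratic form x^T A x."""
    return x @ A @ x

A_psd = np.array([[2.0, 0.0], [0.0, 1.0]])   # eigenvalues 2, 1: convex objective
A_nsd = -A_psd                               # eigenvalues -2, -1: concave objective
A_ind = np.array([[1.0, 0.0], [0.0, -1.0]])  # eigenvalues 1, -1: indefinite

x = np.array([3.0, 4.0])
print(q(A_psd, x), q(A_nsd, x), q(A_ind, x))  # 34.0 -34.0 -7.0
```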

user18481
  • 151
  • Yes, you can use a Lagrange approach and try to compute the derivative of the Lagrange function to obtain possible minima... – Alex Jul 04 '14 at 12:16
  • @Alex, thank you for your comment. But if it is not convex, will the solution be global? – user18481 Jul 04 '14 at 12:50
  • You optimize over a bounded set, and thus one of the KKT points should give a minimum. Now you can check whether there are other possible minima... – Alex Jul 04 '14 at 19:07

2 Answers


Suppose that there is some $v$ such that $v^TAv < 0$, i.e., $A$ is not positive semidefinite. Then for every $\lambda > 0$ we have $$(\lambda v)^TA(\lambda v) = \lambda^2 (v^T Av) \ \overset{\lambda \to \infty}{\longrightarrow}\ -\infty,$$ so if $A$ is not positive semidefinite, the problem is unbounded from below. In particular, this holds for indefinite matrices.
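This argument is easy to see numerically; a short sketch (numpy, with an arbitrary indefinite matrix of my own choosing):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])           # symmetric; eigenvalues 3 and -1, so indefinite

# Take v to be the unit eigenvector of the negative eigenvalue, so v^T A v < 0.
eigvals, eigvecs = np.linalg.eigh(A)  # eigh returns eigenvalues in ascending order
v = eigvecs[:, 0]
print(eigvals[0], v @ A @ v)          # both -1.0

# Scaling v shows (lam v)^T A (lam v) = lam^2 (v^T A v) -> -infinity.
for lam in [1.0, 1e2, 1e4]:
    print(lam, (lam * v) @ A @ (lam * v))
```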

Surb
  • 55,662
  • Thank you for your answer. But if I add a constraint such that $|v_i|\le 1$, how could I find the $v^*$? – user18481 Jul 04 '14 at 12:05
  • @user18481 If $|v_i| \leq 1$, then you can easily observe that $$\left| v^TAv \right|\leq \sum_{i,j=1}^n |A_{i,j}v_iv_j|\leq \sum_{i,j=1}^n |A_{i,j}|,$$ so the constrained problem is always bounded, no matter what your matrix is. – Surb Jul 04 '14 at 12:39
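The bound in the last comment is easy to check numerically; a minimal sketch (numpy, with a random symmetric matrix of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                # symmetrize

bound = np.abs(A).sum()          # sum_{i,j} |A_{i,j}|

# Sample vectors with |v_i| <= 1; the quadratic form never exceeds the bound.
for _ in range(1000):
    v = rng.uniform(-1.0, 1.0, size=n)
    assert abs(v @ A @ v) <= bound
print("|v^T A v| <=", bound, "held on all samples")
```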

Let's deal with the case of $\min \limits_{\boldsymbol{x} \in C} \{ ~\boldsymbol{x}^T A \boldsymbol{x}~ \}$ for $C = \{~ \boldsymbol{x} \mid \boldsymbol{x}^T \boldsymbol{x} = \sum_{j=1}^n x_j^2 = 1~\}$.

Since $A$ is symmetric, it can be diagonalized with orthonormal eigenvectors $\boldsymbol{a}_j$ (thus $\boldsymbol{a}_i \cdot \boldsymbol{a}_j = \delta_{ij}$) having corresponding eigenvalues $\lambda_j$. Let those be ordered $$ \lambda_1 \leq \lambda_2 \leq \dots \leq \lambda_n $$ and expand $\boldsymbol{x}$ into a sum of eigenvectors $$ \boldsymbol{x} = \sum_{j=1}^n x_j \boldsymbol{a}_j. $$ Choose $c \in \mathbb{R}$ such that $\lambda_1 + c > 0$. Then, using $\boldsymbol{x}^T \boldsymbol{x} = 1$ on $C$, \begin{align} \boldsymbol{x}^T A \boldsymbol{x} + c &= \boldsymbol{x}^T A \boldsymbol{x} + c ~\boldsymbol{x}^T \boldsymbol{x} = \sum_{j=1}^n x_j^2 (\lambda_j + c ) \\&= (\lambda_1 + c) \sum_{j=1}^n x_j^2 ~\underbrace{\frac{\lambda_j + c}{\lambda_1 + c } }_{\geq~ 1} \geq (\lambda_1 + c) \underbrace{ \sum_{j=1}^n x_j^2 }_{=1} \\&= \lambda_1 + c. \end{align} Subtracting $c$ on both sides implies $$ \min \limits_{\boldsymbol{x} \in C} \{ ~\boldsymbol{x}^T A \boldsymbol{x} ~\} \geq \lambda_1. $$ It is easy to check that $\boldsymbol{a}_1^T A \boldsymbol{a}_1 = \lambda_1$, so we actually have $$ \min \limits_{\boldsymbol{x} \in C} \{ ~\boldsymbol{x}^T A \boldsymbol{x} ~\} = \lambda_1. $$
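A quick numerical check of this conclusion (numpy sketch with a random symmetric matrix of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                      # random symmetric matrix

eigvals, eigvecs = np.linalg.eigh(A)   # eigenvalues in ascending order
lam1, a1 = eigvals[0], eigvecs[:, 0]
print(lam1, a1 @ A @ a1)               # these agree: a1 attains the minimum

# Random points on the unit sphere never do better than lambda_1.
for _ in range(1000):
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)             # now x^T x = 1, i.e. x is in C
    assert x @ A @ x >= lam1 - 1e-9
print("no sampled x on the sphere beats lambda_1")
```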

Léreau
  • 3,015