
I am trying to estimate $x$ such that $\|Ax-b\|^2$ is minimized, subject to the eigenvalue constraint $\max(|\lambda(x)|) \leq 1$. I was wondering the following:

  1. What would be a sufficient condition for this to be true?
  2. What would be a necessary condition?
  3. Is there a way to implement this using convex optimization?

I am talking about the magnitude of the eigenvalues. This is equivalent to saying that all eigenvalues lie within the unit disc (in the complex plane).

Description of the Variables:

  • $A$ is an $N \times N$ matrix
  • $x$ is an $N \times 1$ vector
  • $b$ is an $N \times 1$ vector

This is because my problem is vectorized, so I solve it in this form. My final data estimate is $x$ reshaped into an $m \times m$ matrix, where $m^2 = N$. I want this reshaped matrix to have all eigenvalue magnitudes at most 1.
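For concreteness, here is a minimal NumPy sketch of this setup; the sizes and data below are placeholders, and the row-major reshape is an assumption about how the vectorization was done.

```python
import numpy as np

# Minimal illustration of the setup; m, A, b are placeholders, not the actual data.
m = 4
N = m * m                       # A is N x N, x and b are N x 1
rng = np.random.default_rng(0)
A = rng.standard_normal((N, N))
b = rng.standard_normal(N)

# Unconstrained least-squares estimate of the vectorized unknown.
x, *_ = np.linalg.lstsq(A, b, rcond=None)

# Reshape back to an m x m matrix (row-major order assumed here) and
# check the eigenvalue-magnitude constraint.
X = x.reshape(m, m)
spectral_radius = np.max(np.abs(np.linalg.eigvals(X)))
print("max |eigenvalue|:", spectral_radius, "constraint satisfied:", spectral_radius <= 1)
```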

ajl123
  • I guess $X$ is a symmetric matrix? Ever heard about semidefinite programming (SDP)? – user251257 Oct 12 '16 at 21:00
  • Well everything is going to be vectorized first, and then reshaped into a matrix, but not necessarily symmetric. – ajl123 Oct 12 '16 at 21:13
  • $x$ need not have a real eigenvalue at all – user251257 Oct 12 '16 at 21:15
  • I edited the problem description to reflect more information on the eigenvalues. I am interested in the maximum of the eigenvalue magnitudes being $\leq 1$, so that they are constrained within the unit disc. – ajl123 Oct 12 '16 at 21:20
  • What is an eigenvalue of a vector? – Ahmad Bazzi Oct 12 '16 at 21:56
  • I am puzzled, as is @El Bazzi: are $x$ and $b$ $n \times n$ matrices? If so, what definition of the matrix norm do you take (the spectral norm?)? – Jean Marie Oct 12 '16 at 22:44
  • @user251257 Even if $\mathrm X$ were symmetric, we would still have a quadratic objective function. How could one use SDP, then? – Rodrigo de Azevedo Oct 13 '16 at 17:11
  • @RodrigodeAzevedo It isn't an SDP itself. I just want to know the context. You can, for example, linearize it ... – user251257 Oct 13 '16 at 17:17
  • @ElBazzi I updated the description to explain what I am asking. It is not the eigenvalue of a vector, but the eigenvalues of the matrix formed by reshaping that vector. – ajl123 Oct 13 '16 at 17:18
  • @ajl123 If the matrix were symmetric, the problem would be much easier. – Rodrigo de Azevedo Oct 13 '16 at 18:03
  • @RodrigodeAzevedo can you outline a solution if the matrix were symmetric? I could solve that first for my problem as sort of a first pass analysis. – ajl123 Oct 13 '16 at 18:05
  • @ajl123 If the matrix is symmetric, then the eigenvalues are real and one can use the spectral norm instead of the spectral radius. Take a look at this question. – Rodrigo de Azevedo Oct 13 '16 at 18:07
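To make the symmetric special case from the last comment concrete: for symmetric $X$, the bound $|\lambda_i(X)| \leq 1$ is exactly the pair of linear matrix inequalities $-I \preceq X \preceq I$, so the constrained least-squares problem is convex. Below is a minimal CVXPY sketch, with placeholder data and assuming the problem was vectorized column-major (which is what `cp.vec` uses).

```python
import cvxpy as cp
import numpy as np

# Placeholder data; substitute the actual A and b.
m = 4
N = m * m
rng = np.random.default_rng(0)
A = rng.standard_normal((N, N))
b = rng.standard_normal(N)

# Symmetric matrix variable; cp.vec stacks columns (column-major), which is
# assumed to match how the problem was vectorized.
X = cp.Variable((m, m), symmetric=True)
objective = cp.Minimize(cp.sum_squares(A @ cp.vec(X) - b))

# For symmetric X, -I <= X <= I in the semidefinite order is equivalent to
# every eigenvalue lying in [-1, 1].
I = np.eye(m)
constraints = [X + I >> 0, I - X >> 0]

prob = cp.Problem(objective, constraints)
prob.solve()                    # needs an SDP-capable solver, e.g. SCS

print("optimal value:", prob.value)
print("eigenvalues:", np.linalg.eigvalsh(X.value))
```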

1 Answer


If I understand the question correctly, we have the constrained least-squares problem

$$\begin{array}{ll} \text{minimize} & \|\mathrm A \mathrm X - \mathrm B\|_F^2\\ \text{subject to} & \rho(\mathrm X) \leq 1\end{array}$$

where $\mathrm A, \mathrm B \in \mathbb{R}^{m \times n}$ are given and $\rho (\cdot)$ denotes the spectral radius. Using a strict inequality instead, we obtain

$$\begin{array}{ll} \text{minimize} & \|\mathrm A \mathrm X - \mathrm B\|_F^2\\ \text{subject to} & \rho(\mathrm X) < 1\end{array}$$

If $\rho(\mathrm X) < 1$, then the origin of the discrete-time linear dynamical system $\eta_{k+1} = \mathrm X \eta_{k}$ is globally asymptotically stable (GAS). Let $V (\eta) := \eta^T \mathrm P \eta$ be a Lyapunov function, where $\mathrm P \succ \mathrm O_n$ is to be determined. Hence,

$$(\forall \eta \neq 0_n) (V (\mathrm X \eta) - V (\eta) < 0) \Longleftrightarrow (\forall \eta \neq 0_n) (\eta^T (\mathrm X^T \mathrm P \mathrm X - \mathrm P) \, \eta < 0) \Longleftrightarrow \mathrm X^T \mathrm P \mathrm X - \mathrm P \prec \mathrm O_n$$

where the matrix inequality $\mathrm X^T \mathrm P \mathrm X - \mathrm P \prec \mathrm O_n$ can be rewritten as $\mathrm P - \mathrm X^T \mathrm P \mathrm X \succ \mathrm O_n$. Thus, we have the following optimization problem in $\mathrm X$ and $\mathrm P$

$$\begin{array}{ll} \text{minimize} & \|\mathrm A \mathrm X - \mathrm B\|_F^2\\ \text{subject to} & \mathrm P \succ \mathrm O_n\\ & \mathrm P - \mathrm X^T \mathrm P \mathrm X \succ \mathrm O_n\end{array}$$

Note that $\mathrm P - \mathrm X^T \mathrm P \mathrm X \succ \mathrm O_n$ is not a linear matrix inequality (LMI). How can one solve this?
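The constraint is bilinear in $(\mathrm X, \mathrm P)$, so the problem above is not convex as stated. One simple, conservative restriction (an assumption on my part, not something the answer prescribes) is to fix $\mathrm P = \mathrm I$: the Lyapunov inequality then becomes $\mathrm I - \mathrm X^T \mathrm X \succ \mathrm O_n$, i.e. the spectral-norm bound $\|\mathrm X\|_2 < 1$, which is sufficient but not necessary for $\rho(\mathrm X) < 1$ and is convex. A minimal CVXPY sketch of that restriction, written in the asker's vectorized form with placeholder data:

```python
import cvxpy as cp
import numpy as np

# Placeholder data; substitute the actual A and b.
m = 4
N = m * m
rng = np.random.default_rng(1)
A = rng.standard_normal((N, N))
b = rng.standard_normal(N)

# General (not necessarily symmetric) matrix variable.
X = cp.Variable((m, m))

# Convex restriction obtained by fixing P = I in the Lyapunov inequality:
# sigma_max(X) <= 1 implies rho(X) <= 1, but not conversely.
constraints = [cp.sigma_max(X) <= 1]

# cp.vec stacks columns (column-major); adjust if the vectorization is row-major.
objective = cp.Minimize(cp.sum_squares(A @ cp.vec(X) - b))
prob = cp.Problem(objective, constraints)
prob.solve()

print("optimal value:", prob.value)
print("spectral radius of solution:", np.max(np.abs(np.linalg.eigvals(X.value))))
```

Any $\mathrm X$ feasible for this restricted problem is feasible for the original one, but the restriction can cost optimality whenever the best $\mathrm X$ satisfies $\rho(\mathrm X) \leq 1 < \|\mathrm X\|_2$.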

  • I think it is $A\operatorname{vec}(X)-B$ or rather $\mathcal AX -B$ where $\mathcal A$ is a linear map on the matrix space. – user251257 Oct 13 '16 at 17:37
  • Since the matrix is square, is there a relation between the eigenvalues and the singular values of the matrix? If so, could we exploit that? – ajl123 Oct 13 '16 at 21:47
  • @ajl123 The eigenvalues can be complex. The singular values are always real and nonnegative. If the matrix is symmetric and positive semidefinite (or definite), then the eigenvalues and the singular values are the same. – Rodrigo de Azevedo Oct 13 '16 at 21:54
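As a quick numerical illustration of that last comment (random data, purely illustrative): the spectral radius never exceeds the largest singular value, and the two coincide for a symmetric positive semidefinite matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 5))

rho = np.max(np.abs(np.linalg.eigvals(X)))   # spectral radius (eigenvalues may be complex)
sigma = np.linalg.norm(X, 2)                 # largest singular value (spectral norm)
print(rho <= sigma + 1e-12)                  # always True: rho(X) <= sigma_max(X)

# For a symmetric positive semidefinite matrix the two coincide.
S = X @ X.T
print(np.isclose(np.max(np.linalg.eigvalsh(S)), np.linalg.norm(S, 2)))
```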