4

I am studying control systems, and I want to solve the following problem.

Given a full rank state matrix $A$ (with all eigenvalues unstable), design an input matrix $B$ such that the cost function $J = \operatorname{trace}(B'XB)$ is minimized, where $X$ is the solution of the discrete-time algebraic Riccati equation (DARE). I have the constraint that $(A,B)$ is stabilizable, i.e.

For a given full rank $A \in \mathbb{R}^{n \times n}$ with $|\lambda_i(A)|>1$ for all $i$, solve the following

\begin{array}{ll} \underset{X\in \mathbb{R}^{n\times n},B \in \mathbb{R}^{n\times m}}{\text{minimize}} & \mathrm{tr} \left( B' X B \right)\\ \text{subject to} & X=A'X(I+BB'X)^{-1}A\\&(A,B)\text{ is stabilizable}\end{array}

From my understanding, since all eigenvalues of $A$ are outside the unit circle (discrete-time system), we can replace the condition $(A,B)\text{ is stabilizable}$ with $(A,B)\text{ is controllable}$, which is equivalent to $\mathrm{rank}([B\quad AB\quad A^2B\quad \ldots\quad A^{n-1}B])=n$.
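For concreteness, this rank test is easy to check numerically. Here is a minimal numpy sketch (the matrices below are placeholders, not part of the problem data):

```python
import numpy as np

def is_controllable(A, B, tol=1e-9):
    """Kalman rank test: (A, B) is controllable iff
    [B, AB, A^2 B, ..., A^(n-1) B] has full row rank n."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    ctrb = np.hstack(blocks)
    return np.linalg.matrix_rank(ctrb, tol=tol) == n

# Placeholder example: a 3x3 unstable A and a single-input B.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
B = np.array([[0.0], [1.0], [1.0]])
print(is_controllable(A, B))   # True: one input suffices for this A
```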

The problem is certainly feasible, since for any full rank $A$ there is a $B$ such that the rank condition is satisfied and the DARE can be solved.
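As a sanity check of feasibility, here is a sketch with a random $B$. It relies on the fact that the constraint above is the standard DARE with $Q=0$ and $R=I$, so `scipy.linalg.solve_discrete_are` can be used; the $A$ below is a placeholder, and I am assuming the solver accepts an exactly zero $Q$ (otherwise a tiny $Q=\varepsilon I$ can be substituted):

```python
import numpy as np
from scipy.linalg import solve_discrete_are

rng = np.random.default_rng(0)

A = np.diag([1.5, 2.0, 2.5])       # placeholder A, all eigenvalues outside the unit circle
n, m = 3, 2
B = rng.standard_normal((n, m))    # a random B almost surely gives a controllable (A, B)

# X = A' X (I + B B' X)^{-1} A  is the standard DARE with Q = 0, R = I.
X = solve_discrete_are(A, B, np.zeros((n, n)), np.eye(m))

# Residual of the constraint and the cost J = tr(B' X B):
residual = X - A.T @ X @ np.linalg.solve(np.eye(n) + B @ B.T @ X, A)
print(np.linalg.norm(residual))    # ~ 0
print(np.trace(B.T @ X @ B))       # the cost to be minimized over B
```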

Novice
Lee
  • Regarding question 1, here is what that theorem looks like for the specific case that $B$ has only one column – Ben Grossmann Jun 29 '20 at 16:09
  • It is notable that, in some sense, "most" pairs $(A,B)$ are controllable. That is, if you change the entries of $A$ and/or $B$ slightly, you will "probably" end up with a controllable pair. – Ben Grossmann Jun 29 '20 at 16:10
  • You could look at the Hautus lemma, which essentially comes down to requiring that the columns of $B$ have a non-zero contribution from each of the eigenvectors of $A$. Also, is your expression for $X$ after "subject to" supposed to be the DARE? The expression you used doesn't seem to be completely correct. – Kwin van der Veen Jun 29 '20 at 23:53
  • @KwinvanderVeen Regarding the Hautus Lemma, let's say I have $A_1=\begin{bmatrix} 2 & 1 & 0\\ 0 & 2 & 0\\ 0&0&2 \end{bmatrix}$ and $A_2=\begin{bmatrix} 2 & 0& 0\\ 0 & 2 & 0\\ 0&0&2 \end{bmatrix}$. We have $\mathrm{rank}[2I-A_2,B]=3$, so $B\in\mathbb{R}^{3\times 3}$ (so using this we conclude that to control $A_2$ we need at least 3 inputs?), is it because $A_2$ has 3 linearly independent eigenvectors? Then from $\mathrm{rank}[2I-A_1,B]=3$, so $B\in\mathbb{R}^{3\times 2}$ (to control $A_1$ we need at least 2 inputs?), is it because $A_1$ has 2 linearly independent eigenvectors? – Lee Jun 30 '20 at 07:35
  • @KwinvanderVeen I have edited the DARE condition, thanks – Lee Jun 30 '20 at 07:36
  • @Omnomnomnom if I understand correctly, for special case that $B$ has only one column, we require $A$ to be cyclic (monic polynomial equal to characteristic polynomial), which is single-input control. – Lee Jun 30 '20 at 07:38
  • @Lee I assume you mean "minimal polynomial" rather than monic. If so, then that's correct. More generally, the minimum required number of inputs for a given $A$ is the number of blocks in the Frobenius normal form of $A$. Keep in mind, however, that "most" matrices $A$ have no repeating eigenvalues and are therefore cyclic. – Ben Grossmann Jun 30 '20 at 07:42
  • @Omnomnomnom yes, minimal polynomial – Lee Jun 30 '20 at 09:21
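To make the Hautus (PBH) discussion in the comments concrete, here is a small numeric check using the two example matrices $A_1$ and $A_2$ from above; the $B$ matrices are hypothetical choices:

```python
import numpy as np

def hautus_controllable(A, B, tol=1e-9):
    """PBH test: (A, B) is controllable iff rank([lam*I - A, B]) = n
    for every eigenvalue lam of A."""
    n = A.shape[0]
    return all(np.linalg.matrix_rank(np.hstack([lam * np.eye(n) - A, B]), tol=tol) == n
               for lam in np.linalg.eigvals(A))

A1 = np.array([[2., 1., 0.], [0., 2., 0.], [0., 0., 2.]])   # two Jordan blocks at 2
A2 = 2.0 * np.eye(3)                                        # three Jordan blocks at 2

B2 = np.array([[0., 0.], [1., 0.], [0., 1.]])   # two inputs
print(hautus_controllable(A1, B2))          # True: two inputs suffice for A1
print(hautus_controllable(A2, B2))          # False: A2 needs three independent inputs
print(hautus_controllable(A2, np.eye(3)))   # True
```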

1 Answer

1

I tried to use the dual problem; maybe someone can help me finish it. By creating a random $B$, we usually get a stabilizable pair $(A,B)$, so let's ignore the second constraint for now.

Using the DARE rewritten below as $BB'=AX^{-1}A'-X^{-1}$, we have $\operatorname{trace}(B'XB)=\operatorname{trace}(BB'X)=\operatorname{trace}(AX^{-1}A'X)-\operatorname{trace}(I)$, so we can minimize $\operatorname{trace}(AX^{-1}A'X)$ instead of $\operatorname{trace}(B'XB)$.
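A quick numeric check of this identity, with the stabilizing DARE solution from `scipy` and placeholder $A$, $B$:

```python
import numpy as np
from scipy.linalg import solve_discrete_are

A = np.diag([1.5, 2.0, 2.5])                    # placeholder anti-stable A
B = np.array([[1., 0.], [0., 1.], [1., 1.]])    # placeholder B with (A, B) controllable
n = A.shape[0]

X = solve_discrete_are(A, B, np.zeros((n, n)), np.eye(B.shape[1]))
Xinv = np.linalg.inv(X)

lhs = np.trace(B.T @ X @ B)
rhs = np.trace(A @ Xinv @ A.T @ X) - n          # trace(I) = n
print(abs(lhs - rhs))                           # ~ 0
```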

Rewrite $X=A'X(I+BB'X)^{-1}A$ as $BB'-AX^{-1}A'+X^{-1}=0$ (using that $A$ is invertible and that the stabilizing solution $X$ is invertible); the Lagrangian is then:

\begin{align} &\Lambda(B,X,V)=\operatorname{trace}(AX^{-1}A'X)+\operatorname{trace}(V'BB')-\operatorname{trace}(V'AX^{-1}A')+\operatorname{trace}(V'X^{-1}),\\ &\frac{\partial \Lambda(B,X,V)}{\partial B}=(V'+V)B=0,\\ &\frac{\partial \Lambda(B,X,V)}{\partial X}=(AX^{-1}A'-X^{-1}A'XAX^{-1})+(X^{-1}A'VAX^{-1})-(X^{-1}VX^{-1})=0. \end{align}
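These gradient expressions can be sanity-checked by finite differences at a symmetric positive definite $X$ (placeholder data; the convention here is element-wise, $[\nabla_X\Lambda]_{ij}=\partial\Lambda/\partial X_{ij}$):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2
A = np.diag([1.5, 2.0, 2.5])            # placeholder data
B = rng.standard_normal((n, m))
V = rng.standard_normal((n, n))
M = rng.standard_normal((n, n))
X = M @ M.T + n * np.eye(n)             # symmetric positive definite X

def Lag(B, X, V):
    """trace(A X^-1 A' X) + trace(V' B B') - trace(V' A X^-1 A') + trace(V' X^-1)."""
    Xi = np.linalg.inv(X)
    return (np.trace(A @ Xi @ A.T @ X) + np.trace(V.T @ B @ B.T)
            - np.trace(V.T @ A @ Xi @ A.T) + np.trace(V.T @ Xi))

def num_grad(f, Z, eps=1e-6):
    """Central finite-difference gradient, entry by entry."""
    G = np.zeros_like(Z)
    for i in range(Z.shape[0]):
        for j in range(Z.shape[1]):
            E = np.zeros_like(Z)
            E[i, j] = eps
            G[i, j] = (f(Z + E) - f(Z - E)) / (2 * eps)
    return G

Xi = np.linalg.inv(X)
grad_B = (V.T + V) @ B
grad_X = (A @ Xi @ A.T - Xi @ A.T @ X @ A @ Xi
          + Xi @ A.T @ V @ A @ Xi - Xi @ V @ Xi)

print(np.max(np.abs(num_grad(lambda Bp: Lag(Bp, X, V), B) - grad_B)))   # small
print(np.max(np.abs(num_grad(lambda Xp: Lag(B, Xp, V), X) - grad_X)))   # small
```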

Then we define the dual function $g(V)=\inf_{B,X} \Lambda(B,X,V)$, and the dual problem becomes $\max_V g(V)$.

First we need to find which $B$ and $X$ minimize $\Lambda(B,X,V)$. Assuming that $V'+V$ and $B$ are both nonzero, $(V'+V)B=0$ forces $V'+V$ to be singular (every column of $B$ must lie in its null space).
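As a side note, the rewritten constraint $BB'-AX^{-1}A'+X^{-1}=0$ with $W=X^{-1}$ is a discrete Lyapunov equation in $W$ driven by the stable matrix $A^{-1}$, which gives another way to compute $X$ for a given $B$. A sketch with placeholder $A$, $B$:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

A = np.diag([1.5, 2.0, 2.5])                    # placeholder anti-stable A
B = np.array([[1., 0.], [0., 1.], [1., 1.]])    # placeholder B
n = A.shape[0]

# BB' - A W A' + W = 0  <=>  W = A^{-1} W A^{-T} + A^{-1} B B' A^{-T},
# a discrete Lyapunov equation with stable coefficient A^{-1}.
Ainv = np.linalg.inv(A)
W = solve_discrete_lyapunov(Ainv, Ainv @ B @ B.T @ Ainv.T)
X = np.linalg.inv(W)

# X should also satisfy the original DARE X = A' X (I + B B' X)^{-1} A:
residual = X - A.T @ X @ np.linalg.solve(np.eye(n) + B @ B.T @ X, A)
print(np.linalg.norm(residual))                 # ~ 0
```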

Lee
  • Are you assuming $X$ is fixed, or are you considering it as a variable in the original optimization problem? – brenderson Aug 06 '20 at 20:36
  • @brenderson It is a variable. At first I thought that once $B$ is found, $X$ is unique, but now I understand that both $B$ and $X$ should be considered as variables. I will edit the question, thanks – Lee Aug 07 '20 at 03:56