
I've been working on solving some linear equations arising from different optimization problems, but I keep getting stuck. Right now I have the problem below:

I am trying to solve the system of equations for $x$:

$$ Ax-\alpha \frac{Bx}{ x^tBx}=c$$ $$e^tx=1$$ where $e=(1,...,1)^t$.

Here $x,c\in \mathbb{R}^n$, both $A,B \in \mathbb{R}^{n\times n}$ are positive definite, and indeed even $$A-\frac{\alpha B }{x^t B x}$$ is positive definite, so we have nice invertibility properties.

Any help, references, or much better - a solution - is very much appreciated!

EDIT: Some further work below.

If we set $x=\sqrt{B}^{-1}z$ we get $$A\sqrt{B}^{-1}z-\alpha\frac{\sqrt{B}z}{z^tz}-c=0$$ or, for $D=\sqrt{B}^{-1}A\sqrt{B}^{-1}/\alpha$ and $p=\sqrt{B}^{-1}c/\alpha$, $$0=Dz-\frac{z}{z^tz}-p\Leftrightarrow \left(D-\frac{pz^t}{z^tz}\right)z=\frac{z}{z^tz}.$$ So it appears that $z$ is a multiple of an eigenvector of a matrix that in turn depends on $z$. Is there any way I can extract analytical solutions?
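For what it's worth, the system is easy to attack numerically. Below is a minimal sketch using SciPy's `fsolve` on a manufactured instance where a solution is known by construction; $\alpha$ is treated as an additional unknown so that the $n+1$ equations match $n+1$ unknowns (an assumption about the intended problem, and everything in the snippet other than the two equations themselves is made up for illustration):

```python
import numpy as np
from scipy.optimize import fsolve

rng = np.random.default_rng(0)
n = 4
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # random symmetric positive definite A
M = rng.standard_normal((n, n))
B = M @ M.T + n * np.eye(n)        # random symmetric positive definite B
e = np.ones(n)

# Manufacture an instance with a known solution: pick x* on the
# hyperplane e^T x = 1 and alpha* = 1, then define c accordingly.
x_true = e / n + np.array([0.1, -0.1, 0.05, -0.05])
c = A @ x_true - 1.0 * (B @ x_true) / (x_true @ B @ x_true)

def residual(u):
    # unknowns: x (first n entries) and alpha (last entry)
    x, alpha = u[:n], u[n]
    r1 = A @ x - alpha * (B @ x) / (x @ B @ x) - c
    return np.append(r1, e @ x - 1.0)

u = fsolve(residual, np.append(e / n, 0.5))
x, alpha = u[:n], u[n]
```

If `fsolve` converges, both the vector equation and $e^tx=1$ hold at the returned point; as with any local root finder, convergence depends on the initial guess.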

Winston
  • So, I was having a think, in the case of $c=0$ (which isn't really that interesting but gotta start somewhere) we can transform the system into $Dz=\frac{z}{z^tz}$, so $z$ is a multiple of an eigenvector. Can we then somehow use linearity to extract a solution for $c\neq 0$? – Winston Jul 17 '15 at 11:00
  • If you multiply both sides by $x^T$ on the left, then you get: $$x^TAx - \alpha = x^T c \iff x^T(Ax - c) = \alpha.$$Maybe that could help. I can't say it's equivalent to the original equation, but it reduces the possible solutions. – Theo Bendit Jul 17 '15 at 12:16
  • You can decompose $x = e+\delta$ where $\delta\cdot e = 0$. This might open up some new avenues. –  Jul 17 '15 at 12:58
  • I spent some time trying those approaches. Thanks for the tip, but I am afraid I cannot see it leading anywhere.

    While browsing my library on linear algebra I encountered some sections on Hankel and Bezout forms. I only glanced quickly at it before having to leave for a lunch, but thought that it looked useful. Does anyone who has experience in the area have any suggestions as to its applicability?

    – Winston Jul 17 '15 at 13:54
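Regarding the $c=0$ case mentioned in the first comment: if $Dv=\lambda v$ with $\|v\|=1$ and $\lambda>0$, then $z=v/\sqrt{\lambda}$ gives $z^tz=1/\lambda$, hence $Dz=\lambda z=z/(z^tz)$, so each positive eigenpair yields a closed-form solution. A quick numerical check (a random symmetric positive definite $D$ stands in for $\sqrt{B}^{-1}A\sqrt{B}^{-1}/\alpha$; the instance is made up):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
D = M @ M.T + np.eye(n)            # random symmetric positive definite D

lam, V = np.linalg.eigh(D)         # eigenvalues ascending, orthonormal eigenvectors
v = V[:, 0]                        # any unit eigenvector of D
z = v / np.sqrt(lam[0])            # scaled so that z^T z = 1 / lambda

resid = D @ z - z / (z @ z)        # should vanish
```

Since $D$ is positive definite, every eigenpair works, giving the solutions $\pm v_i/\sqrt{\lambda_i}$ (for simple eigenvalues).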

1 Answer


If we decompose the vector $x$ into a component along $e$ and a component orthogonal to $e$ (i.e., $x=e+\Delta$ with $\Delta \cdot e = 0$) and then apply this to the simplified form provided by @TheoBendit, we get:

$$(e+\Delta)^t(A(e+\Delta)-c)=\alpha \implies$$ $$e^tAe+\Delta^tAe+e^tA\Delta+\Delta^tA\Delta-e^tc-\Delta^tc = \alpha \implies $$ $$ \langle\Delta,\mathbf{a}_{\cdot j}\rangle+ \langle\Delta,\mathbf{a}_{i\cdot}\rangle +\mathbf{Q}_{A}(\Delta)-\langle c,\Delta\rangle=\alpha-\sum a_{ij}+\sum c_i \equiv K$$

So, we've (reduced?!) this to a quadratic equation in $n$ variables. If we let $\delta_i$ be the $i$th component of $\Delta$ then:

$$ \left(2\sum a_{ij}-\sum c_i\right) \sum \delta_i+\sum\sum a_{ij}\delta_i\delta_j-K=0 $$

Subject to:

$$\sum \delta_i = 0$$

Let $\mathbf{v}:=(v_1,v_2,...,v_n)$ be a solution to this equation; then your $x$ is $x=v+e$. There will likely be several solutions, but since it's a quadratic form, you can apply any number of multivariate root-finding algorithms to solve it. Or you can search the trove of answers on MSE.
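To make the root-finding suggestion concrete, here is a sketch using SciPy's `least_squares` on the scalar reduction $x^t(Ax-c)=\alpha$, with $x=e/n+\Delta$ (the $e/n$ normalisation discussed in the comments) and $\Delta$ parameterised by an orthonormal basis of the hyperplane $e^t\Delta=0$. The instance is random and purely illustrative, and, as the comments note, a root of this scalar equation is only a necessary condition for the original vector equation.

```python
import numpy as np
from scipy.linalg import null_space
from scipy.optimize import least_squares

rng = np.random.default_rng(2)
n, alpha = 4, 10.0
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # random symmetric positive definite A
c = rng.standard_normal(n)
e = np.ones(n)

N = null_space(e[None, :])           # columns: orthonormal basis of {y : e^T y = 0}

def g(y):
    # scalar reduction x^T (A x - c) - alpha, with x = e/n + N y
    x = e / n + N @ y
    return x @ (A @ x - c) - alpha

sol = least_squares(g, np.zeros(n - 1))
x = e / n + N @ sol.x                # satisfies e^T x = 1 by construction
```

Any point on the hyperplane can serve as the starting guess; different starts will generally land on different roots of the underdetermined equation.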

Also, the general problem of solving underdetermined quadratic systems (like the one here) has been studied. See this paper and here.

  • Thank you very much for your time. I'm not 100% sure this question is settled yet, but I am more and more accepting the possibility of having to go numerical / use a computer. Anyway, I have one question in regard to solving the underdetermined system: how do we know that a solution to the underdetermined system is a solution to the original system? To me it seems a bit like throwing out $n-1$ constraints, closing your eyes and praying. I mean, I see that any solution to the original system satisfies the simplified one, but not necessarily vice versa? Or am I missing something? Thanks again! – Winston Jul 17 '15 at 14:29
  • 1
    @ZMI I defined the $\delta_i$ to be the components of a vector orthogonal to $e$. Therefore the constraint on the inner product with $e$ is handled automatically, and we have a new, unconstrained problem on a quadratic manifold within $\mathbb{R}^n$ –  Jul 17 '15 at 14:46
  • 1
    @ZMI so, lets say that $v$ is a solution to the simplified problem, then by adding $e$ to it, we ensure it not only satisfies your matrix equation, but also that its dot product with $e$ is 1. This will be true for any set of $\delta_i$ that are roots of the quadratic form. –  Jul 17 '15 at 14:48
  • 1
    @ZMI the simplified form $x^T(Ax-c) = \alpha$ does not care how we decompose $x$ as long as we do it consistently. This is due to the properties of vector spaces. –  Jul 17 '15 at 14:49
  • I see one problem, but I guess this is minor: are we using the same $e$? Because as far as I can see $e^tx=e^t(e+\Delta)=e^t e + e^t \Delta = e^t e =1+...+1=n \neq 1$. Maybe you wanted to decompose $x=\Delta +e/n$? I guess this should then give your result slightly altered. – Winston Jul 17 '15 at 15:00
  • 1
    @ZMI you are correct. For some reason, I started treating $e$ like a unit vector! –  Jul 17 '15 at 15:03
  • Hm, okay, I did not realize that an $x$ solving $x^t f(x)=0$ also solves $f(x)=0$. I'll have to brush up on linear algebra, I guess. (With $f(x)$ appropriate for the problem above.) – Winston Jul 17 '15 at 15:04
  • Yeah, I can imagine what reason that is. The notation is a bit weird but I started running out of good symbols! – Winston Jul 17 '15 at 15:06
  • 1
    @ZMI I think the analogy here is $y^tf(y)=0 \implies x^tf(x)=\alpha$. All I did was re-express the problem using vectors within the hyperplane defined by $e\cdot \Delta =0$ –  Jul 17 '15 at 15:35