
My problem is of a theoretical nature. Given an overdetermined system of $m$ equations in $n$ unknowns, $\bf A x = b$, where $m \gg n$ and

$$ {\bf A} = \begin{bmatrix} — {\bf a}_1 — \\ — {\bf a}_2 — \\ \vdots \\ — {\bf a}_m — \end{bmatrix} \in \mathbb{R}^{m \times n}, \qquad {\bf b} \in \mathbb{R}^{m \times 1}$$

where ${\bf a}_i \in \mathbb{R}^{1 \times n}$ are the rows of the matrix. I solve $\bf Ax=b$ via the pseudoinverse in Matlab, which computes the singular values, discards those below an automatically chosen tolerance, and inverts the rest, leading to

$$ {\bf A}^+ = {\bf V} {\bf \Sigma}^+ {\bf U}^\top. $$

Then I measure the relative solution error as

$$\frac{\| {\bf A} {\bf A}^+ {\bf b} - {\bf b} \|_2}{\| {\bf b} \|_2}$$
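For concreteness, here is a minimal Matlab sketch of this setup (the sizes and the random data below are only placeholders to fix notation, not my actual problem):

```matlab
% Minimal sketch of the setup (random data and sizes are placeholders,
% not my actual problem; they only serve to fix notation).
m = 1000; n = 20;                 % m >> n
A = randn(m, n);
b = randn(m, 1);

x   = pinv(A) * b;                % A^+ b via the truncated-SVD pseudoinverse
err = norm(A*x - b) / norm(b);    % || A A^+ b - b ||_2 / || b ||_2
```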

My problem is that I want to take linear combinations of the matrix rows (the system equations), namely cumulative sums:

$$ \bar{\bf b} = \begin{bmatrix} b_1 \\ b_1 + b_2 \\ \vdots \\ b_1 + b_2 + \dots + b_m \end{bmatrix}, \qquad \bar{\bf A} = \begin{bmatrix} — {\bf a}_1 — \\ — {\bf a}_1 + {\bf a}_2 — \\ \vdots \\ — {\bf a}_1 + {\bf a}_2 + \dots + {\bf a}_m — \end{bmatrix} \in \mathbb{R}^{m \times n}. $$

From elementary linear algebra we know that if ${\bf A}$ were square and invertible, then $\bar{\bf A}$ would also be invertible, so the solution would be (theoretically) exact. However, in my case $m \gg n$ due to the nature of the problem, so there is no invertibility.
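Concretely, this is how I build the new system in Matlab (continuing the variables from the snippet above):

```matlab
% Cumulative sums of the rows of A and of the entries of b
% (continuing the variables from the snippet above).
Abar = cumsum(A, 1);              % i-th row of Abar is a_1 + ... + a_i
bbar = cumsum(b);                 % i-th entry of bbar is b_1 + ... + b_i
```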

Therefore, I would like some link between $\frac{\|{\bf A} {\bf A}^+ {\bf b} - {\bf b}\|_2}{\|{\bf b}\|_2}$ and $\frac{\|\bar{\bf A} \bar{\bf A}^+ \bar{\bf b} - \bar{\bf b}\|_2}{\|\bar{\bf b}\|_2}$. Numerically, sometimes one is larger, sometimes the other.
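In code, the comparison looks like this (continuing the snippets above):

```matlab
% The two relative residuals I am comparing (continuing the snippets above);
% in my runs neither one is consistently larger than the other.
err_bar = norm(Abar * pinv(Abar) * bbar - bbar) / norm(bbar);
fprintf('original: %g   cumulative-sum system: %g\n', err, err_bar);
```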

I think the problem amounts to: how do the singular values change when I transform $\bf A$ linearly into $\bf \bar{A}$ (the cumulative sum of rows, as in this example)? Numerically, the singular values of $\bf \bar{A}$ start out much larger but drop towards zero much faster. Interestingly, the pinv routine in Matlab selects the same number of singular values above its predefined tolerance (tol = max(size(A))*eps(norm(A))) for both $\bf A$ and $\bf \bar{A}$.
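A quick check of this observation, again continuing the snippets above, with the tolerance written out explicitly:

```matlab
% Singular values of A and Abar, and how many of them exceed the default
% tolerance used by pinv (continuing the snippets above).
s    = svd(A);
sbar = svd(Abar);

tol    = max(size(A))    * eps(norm(A));      % pinv's default tolerance for A
tolbar = max(size(Abar)) * eps(norm(Abar));   % ... and for Abar

kept_A    = sum(s    > tol);      % number of singular values pinv keeps for A
kept_Abar = sum(sbar > tolbar);   % ... and for Abar (equal in my experiments)
```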

Does anyone know how to link the singular value representations of the two matrices, and ultimately the inversion errors? Much appreciated!

P.S.: my problem is not one of implementation, but rather a theoretical one.

Let $\bf \bar{A}=LA$, where $$ {\bf L}= \begin{bmatrix} 1 & 0 & 0 & \dots & 0 \\ 1 & 1 & 0 & \dots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & 1 & 1 & \dots & 1 \end{bmatrix}. $$ Then, with the SVD $\bf A=U\Sigma V^\top$, we have $\bf LA=LU\Sigma V^\top$. By QR factorization we get $\bf LU=Q_L R_L$, such that $$ \bf \bar{A}=Q_LR_L\Sigma V^\top. $$

Finally, $\bf R_L\Sigma$ can be expanded with an SVD, $\bf R_L\Sigma = U_1 \Sigma_1 V_1^\top$, so that $$ \bf \bar{A}=Q_L U_1 \Sigma_1 V_1^\top V^\top=(Q_L U_1) \Sigma_1 (V V_1)^\top. $$ Hence $\bf \bar{A}$ has the singular values of $\bf R_L\Sigma$, while $\bf Q_L U_1$ and $\bf V V_1$ are orthogonal matrices. So the relation between the two sets of singular values is by no means simple. Next I am looking for an analytic expression for $\bf R_L$ in $\bf LU=Q_L R_L$, since it is also triangular (upper triangular, whereas $\bf L$ is lower triangular).
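For what it's worth, a numerical sanity check of this construction (continuing the Matlab snippets above):

```matlab
% Numerical sanity check of the construction above
% (continuing the snippets above).
L          = tril(ones(m));       % cumulative-sum operator, so Abar = L*A
[U, S, V]  = svd(A);              % A = U*S*V'
[QL, RL]   = qr(L*U);             % L*U = QL*RL, with RL upper triangular
[U1,S1,V1] = svd(RL*S);           % RL*S = U1*S1*V1'

% Both quantities should be at the level of rounding error:
reconstruction_gap = norm(Abar - (QL*U1) * S1 * (V*V1)');
singular_value_gap = norm(svd(Abar) - svd(RL*S));
```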

  • $\bar A = L A$. Can you find matrix $L$? – Rodrigo de Azevedo Jun 06 '23 at 06:15
  • @RodrigodeAzevedo thank you for editing the question, it looks better. You are right about the pseudoinverse, that was a typo. And yes, it can be written this way, such that $\bf L$ is lower triangular with ones on and below the diagonal. Then I can write $\bf LA=LU\Sigma V^\top$. However, $\bf LU$ no longer has orthonormal columns, so how would it lead to an SVD expansion of $\bf \bar{A}$? Thanks! – Jason Burton Jun 06 '23 at 11:43
  • You might be complicating things. For example, is $\bf I - A A^+$ a projection matrix? – Rodrigo de Azevedo Jun 06 '23 at 13:17
  • $\bf I-AA^+$ maps any vector $\bf b$ to the difference between $\bf b$ and its projection onto the column space of $\bf A$. I am not sure whether this answers your question, and I think I'm missing a piece of the puzzle, so I would be grateful for more hints. – Jason Burton Jun 06 '23 at 13:40
  • @RodrigodeAzevedo Comparing $\bf I-AA^+$ and $\bf I-\bar{A}\bar{A}^+$ still requires expressing $\bf \bar{A}^+$ as a function of $\bf A^+$, right? – Jason Burton Jun 06 '23 at 16:06
  • Knowing $\bar A = LA$, you can do some work yourself, such as writing the projection matrices in terms of $A$ and $L$. The post will get longer, so consider breaking it into sections. 1 upvote but 44 views suggests that you are not inspiring anyone to bother with this question. Investing some time in the question may inspire answers. – Rodrigo de Azevedo Jun 07 '23 at 10:47
  • No answer may also mean nobody knows the answer. I am thinking about it all the time. I'll write down the solution once I have it. – Jason Burton Jun 07 '23 at 11:36
  • This might be useful (or not) – Rodrigo de Azevedo Jun 07 '23 at 11:39
  • Why did you create a new account? – Rodrigo de Azevedo Jun 07 '23 at 20:43
  • I was on a new pc where I couldn't log on with the same one. Is it an issue? – Jason Burton Jun 07 '23 at 21:05
  • Later, you can merge them. – Rodrigo de Azevedo Jun 07 '23 at 21:15

0 Answers