
When attempting to solve a problem of the form $AX=B$, where $A$ is the matrix of coefficients, $X$ contains the variables, and $B$ is the right-hand side, it turned out that $A$ was singular. As a result, I could not premultiply both sides by $A^{-1}$. In such a case, how would I solve for the variables without having to use augmented matrices?

Here is the question:

$$\begin{pmatrix} 1&2&1&1&1 \\ 2&1&2&1&1 \\ 1&2&3&1&1 \\ 2&2&1&1&3 \\3&3&5&2&2 \end{pmatrix}\begin{pmatrix} u \\ v \\ w \\ x \\ y \end{pmatrix}=\begin{pmatrix} 6.3 \\ 6.7 \\ 7.7 \\ 9.8 \\10.9 \end{pmatrix}$$

Since $A$ is singular I cannot compute $X=A^{-1}B$, which is the method I would typically use to solve for the unknowns. Later parts of the book I am studying allude to a method one can use to solve for $X$, but do not elaborate further. Does anyone have an idea of how to solve this problem (or any problem in general where $A$ is singular) without the use of augmented matrices, or is that the only way?

E.O.

4 Answers


Even when the system of equations is singular, you can find a least-squares solution by solving the system $A^{T}Ax=A^{T}b$.
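As a sketch of this approach (using Python/NumPy, which is not part of the original answer): `numpy.linalg.lstsq` computes a least-squares solution directly, even for a singular coefficient matrix, and the result satisfies the normal equations $A^{T}Ax=A^{T}b$:

```python
import numpy as np

# Coefficient matrix and right-hand side from the question
A = np.array([[1, 2, 1, 1, 1],
              [2, 1, 2, 1, 1],
              [1, 2, 3, 1, 1],
              [2, 2, 1, 1, 3],
              [3, 3, 5, 2, 2]], dtype=float)
b = np.array([6.3, 6.7, 7.7, 9.8, 10.9])

# lstsq minimizes ||Ax - b||; for a rank-deficient A it returns the
# minimum-norm least-squares solution
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)

# Every least-squares solution satisfies the normal equations A^T A x = A^T b
print(np.allclose(A.T @ A @ x, A.T @ b))  # True
```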

Paul

Per my comment, Gaussian Elimination is perhaps the easiest thing to do.

However:

You don't have to perform row reduction as given by the Gaussian Algorithm.

Any two systems that have row equivalent augmented matrices have the same solution set.

There is a lot of cancellation that can occur in your matrix by using row operations other than those prescribed by Gaussian elimination; try taking advantage of that.

If you're lucky, you can obtain the solution, if any, through ad-hoc methods by examining the "sparse" row equivalent form of the augmented matrix.

Note here that the augmented matrix for your system is row equivalent to (adding multiples of row 1 to the others): $$\begin{pmatrix} 1&2&1&1&1&\ \ 6.3 \\ 1&-1&1&0&0&\ \ .4 \\ 0&0&2&0&0&\ \ 1.4 \\ 1&0&0&0&2&\ \ 3.5\\1&-1&3&0&0&\ \ -1.7 \end{pmatrix} $$

The above is row equivalent to (working with rows 2 and 5):

$$\begin{pmatrix} 1&2&1&1&1&\ \ 6.3 \\ 1&-1&1&0&0&\ \ .4 \\ 0&0&2&0&0&\ \ 1.4 \\ 1&0&0&0&2&\ \ 3.5\\0&0&2&0&0&\ \ -2.1 \end{pmatrix} $$

Comparing rows 3 and 5, we would need both $2w=1.4$ and $2w=-2.1$, a contradiction; so the system has no solution.
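The inconsistency found by row reduction can also be confirmed with a quick rank check (a sketch in Python/NumPy, not part of the original answer): by the Rouché–Capelli theorem, $Ax=b$ is consistent if and only if the augmented matrix has the same rank as the coefficient matrix.

```python
import numpy as np

A = np.array([[1, 2, 1, 1, 1],
              [2, 1, 2, 1, 1],
              [1, 2, 3, 1, 1],
              [2, 2, 1, 1, 3],
              [3, 3, 5, 2, 2]], dtype=float)
b = np.array([6.3, 6.7, 7.7, 9.8, 10.9])

rank_A = np.linalg.matrix_rank(A)
rank_aug = np.linalg.matrix_rank(np.column_stack([A, b]))

# Consistent iff rank_A == rank_aug; here the augmented matrix has
# strictly larger rank, so no exact solution exists
print(rank_A, rank_aug)  # 4 5
```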

David Mitra

Use the Singular Value Decomposition to obtain a low-rank approximation of the coefficient matrix. Example (using MATLAB for simplicity; one can easily obtain this solution by hand):

A = [1 1; 2 2];    % rank-deficient (singular) coefficient matrix
b = [5; 5];        % right-hand side as a column vector
[U, S, V] = svd(A);
A_approx = U(:,1) * S(1,1) * V(:,1)';   % best rank-1 approximation of A

Now one can use `A_approx` instead of `A` to obtain a solution in the least-squares sense, as @Paul mentioned above.

ashkan
  • the above answer is incorrect!! – ashkan May 28 '15 at 05:47
  • the above answer is incorrect! When $A$ is not invertible, $|A|=0$, and $Ax=b$ takes one of two forms: 1) $b$ is the zero vector, so the homogeneous system $Ax=0$ has non-zero solutions; 2) $Ax=b$ usually has no solution, though it has solutions for some $b$, and to obtain them one should perform Gaussian elimination. When does one get a solution using least squares? When $Ax=b$ has no solution because $b$ does not lie in the column space of $A$: project $b$ onto a vector $p$ in the column space of $A$ and solve $A\hat{x}=p$ instead of $Ax=b$. – ashkan May 28 '15 at 06:00
  • You should edit your answer to fix it, rather than pointing out what's wrong in the comments. – jvriesem Mar 22 '16 at 20:56

If the matrix is singular, the equation $Ax=b$ has no solution. This means that $b$ does not lie in the range of $A$. However, it is possible to find a vector $x_0$ such that the image $Ax_0$ is "closest" to $b$ in a sense that can be made precise. This requires a metric or norm on the target space.

One example of this is the Moore-Penrose pseudoinverse of $A$.
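As a sketch (using NumPy's `numpy.linalg.pinv`, which is not part of the original answer): applying the pseudoinverse to the system in the question yields the minimum-norm least-squares solution, for which $Ax_0$ is the projection of $b$ onto the column space of $A$.

```python
import numpy as np

A = np.array([[1, 2, 1, 1, 1],
              [2, 1, 2, 1, 1],
              [1, 2, 3, 1, 1],
              [2, 2, 1, 1, 3],
              [3, 3, 5, 2, 2]], dtype=float)
b = np.array([6.3, 6.7, 7.7, 9.8, 10.9])

# Moore-Penrose pseudoinverse: x0 minimizes ||Ax - b|| and, among all
# minimizers, has the smallest norm
x0 = np.linalg.pinv(A) @ b

# x0 satisfies the normal equations, i.e. A x0 is the orthogonal
# projection of b onto the column space of A
print(np.allclose(A.T @ A @ x0, A.T @ b))  # True
```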

Jerry
  • A square matrix $A$ is singular if and only if $Ax=b$ has no solution for *some* $b$; in general, the equation $Ax=b$ may have (infinitely) many solutions. – David Mitra Dec 21 '11 at 14:37
  • Minus one, as it is misleading to say it has no solution (tout court) in the accepted answer... it is actually fairly common to have infinitely many solutions. Perhaps for the particular matrix $A$ in the original question there is no solution, but as it stands the general statement is false... – eric Oct 02 '15 at 20:10