Questions tagged [matrix-decomposition]

Questions about matrix decompositions, such as the LU, Cholesky, SVD (Singular value decomposition) and eigenvalue-eigenvector decomposition.

In linear algebra, a matrix decomposition or matrix factorization is a factorization of a matrix into a product of matrices. There are many different matrix decompositions; each finds use among a particular class of problems.

  • For instance, when solving a system of linear equations $Ax=b$, the matrix $A$ can be decomposed via the LU decomposition, which factorizes a matrix into the product of a lower triangular matrix $L$ and an upper triangular matrix $U$.

  • Similarly, the QR decomposition expresses $A$ as $QR$, with $Q$ an orthogonal matrix and $R$ an upper triangular matrix (both routes are illustrated in the short sketch below).

Other decomposition techniques include block LU decomposition, LU reduction, rank factorization, Cholesky decomposition, etc.

Source: Wikipedia.
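To make the two routes above concrete, here is a minimal sketch of solving $Ax=b$ with both an LU and a QR factorization. It assumes NumPy and SciPy are available; the matrix $A$ and vector $b$ are arbitrary example values, not taken from any of the questions below.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Arbitrary example system Ax = b (values chosen only for illustration).
A = np.array([[4.0, 2.0, 1.0],
              [2.0, 5.0, 3.0],
              [1.0, 3.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])

# LU route: factor A once (with partial pivoting), then solve.
lu, piv = lu_factor(A)
x_lu = lu_solve((lu, piv), b)

# QR route: A = QR with Q orthogonal and R upper triangular,
# so Ax = b becomes the triangular system R x = Q^T b.
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

print(np.allclose(A @ x_lu, b))   # True: the LU solution satisfies the system
print(np.allclose(x_lu, x_qr))    # True: both factorizations give the same x
```

In practice the LU route is the usual choice for square systems, since the factorization can be reused for many right-hand sides, while QR is more common for least-squares problems with rectangular $A$.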

2679 questions
10 votes, 2 answers

What is the Cholesky Decomposition used for?

It seems like a very niche case, where a matrix must be Hermitian positive semi-definite. In the case of reals, it simply must be symmetric. How often does one have a positive semi-definite matrix in which taking its Cholesky Decomposition has a…
Axoren
  • 2,303
8 votes, 2 answers

Is there any connection between QR and SVD of a matrix?

Is it possible to draw any parallels between the SVD and QR decomposition of a matrix? Moreover, for a given matrix $\mathbf{A}\in\mathbb{R}^{n\times m}$, under what conditions, the $\mathbf{U}$ matrix coming from singular value decomposition of…
NAASI
  • 997
7 votes, 2 answers

How to use LU decomposition to solve Ax = b

Using LU decomposition, how can I solve for the vector $x$ in the system $Ax = b$, given $A$ and $b$? For simplicity's sake, assume $A$ is a 3x3 matrix and $b$ is a vector of size 3. For example, how to find $x$ when: $$A= \begin{pmatrix} 3 & 1 & 2\\ 5 & 7…
John
  • 231
5 votes, 2 answers

What can we say about $A$ if $AA^T$ is singular?

I have a matrix $A$ of dimensions $100 \times 50$. It turns out $AA^T$ is actually singular. I obtained the matrix $A$ as the load matrix in a Factor Analysis output. This is the first time I have encountered such a problem in Factor Analysis. What special…
honeybadger
  • 1,125
4 votes, 0 answers

Why positive diagonal entries in Cholesky decomposition?

I don't understand why $L$ must have strictly positive diagonal entries in the Cholesky decomposition $A = LL^{T}$.
3 votes, 1 answer

Householder reduction to Hessenberg form

I've read somewhere that the Hessenberg decomposition is not unique unless the first column of $Q$ in $Q^{T}AQ = H$ is specified. But then, if I am given a matrix $A \in \mathbb{R}^{n \times n}$, I can apply the Householder reduction algorithm to reduce $A$ to…
Elnaz
  • 629
3 votes, 2 answers

Decomposing a symmetric matrix into a sum of products

I just realized that if a single vector generates a matrix, say, $A=aa^T$ so that $A=\begin{bmatrix} a_1^2&a_1a_2&a_1a_3\\a_1a_2&a_2^2&a_2a_3\\a_1a_3&a_2a_3&a_3^2\end{bmatrix}$ I can just look at it and say that it is going to have two zero…
2 votes, 1 answer

What is the name of this matrix decomposition?

I have used a matrix decomposition in my work that a different author calls “asymmetric Schur decomposition”. Now I am trying to find a reference for this and found that the Schur decomposition is usually of a different form, namely with three…
2 votes, 3 answers

Why does $H^2U=UH^2$ imply that $H$ and $U$ commute?

Why does $H^2U=UH^2$ imply that $H$ and $U$ commute, where $H$ is a Hermitian matrix and $U$ is a unitary matrix? This comes from the book 'Theory of Matrices', p. 277: http://www.maths.ed.ac.uk/~aar/papers/gantmacher1.pdf Now I know the implication is…
DDaren
  • 421
2 votes, 0 answers

Matrix Factorization - how to initialise factor vectors

Using this paper (Wayback Machine) I have been attempting to make a matrix factorization recommender system. However, it is not clear what the two factor matrices should be initialised to, which leads to my next problem: if I initialise every entry in…
monster
  • 123
1 vote, 1 answer

$QR$ factorization: why do $A$ and $R$ have the same rank?

In the $A=QR$ factorization, why does $R$ always have the same rank as $A$?
1 vote, 1 answer

Cholesky factorization

Could someone help me better understand Algorithm 23.1 of "Numerical Linear Algebra" (by Lloyd Trefethen)? You can see it here. What's being calculated at each step of the outer loop? Given that $A_{k-1}=R_k^*A_kR_k$, I suppose that the answer is…
1 vote, 1 answer

Is Cholesky the same as QR for this matrix?

For a symmetric positive-definite matrix $A=\begin{bmatrix} a & b\\ b & c\\ \end{bmatrix}$ with $a\geq c$ and eigenvalues $\lambda_1\geq \lambda_2 > 0$, can we say that Cholesky factorization with a lower triangular form is the same as QR…
Elnaz
  • 629
1 vote, 1 answer

Recovering one factor of a symmetric matrix product

Suppose you observe some matrix $\Sigma$ and you know it is of the form $$\Sigma = H\sigma \sigma^T H^T,$$ where $H^TH=I$ and $HH^T=P$ satisfies $P^2=P$, $P^T=P$ (i.e. it is an orthogonal projection), $H\in \mathbb{R}^{D\times d}$, $\sigma\in…
Nap D. Lover
  • 1,207
1 vote, 2 answers

If $X \in \mathbb{R}^{n \times n}$ is invertible, does it always have an eigendecomposition?

How can I prove or disprove this claim? Since $X$ is invertible, its eigenvalues satisfy $\lambda_i \neq 0$ for all $i = 1, \cdots, n$. However, according to Wikipedia (https://en.wikipedia.org/wiki/Eigendecomposition_of_a_matrix), we need $n$ linearly…
Ada
  • 179