
I'm looking for an efficient algorithm to generate large positive semidefinite matrices.

One possible way I know of is:

  • generate a random square matrix
  • multiply it by its transpose.
  • compute all eigenvalues of the resulting matrix and check that they are all non-negative.

However, this approach is infeasible for large matrices, say $1000 \times 1000$ or larger.

Could anyone please suggest an efficient way to generate a positive semidefinite matrix? Is there any MATLAB function for this job?

Thanks,

chepukha
    P.S. For checking positive (semi)definiteness, one never needs to compute an eigendecomposition. (Pivoted) Cholesky can be modified to check positive (semi)definiteness of your matrix, based on the sign of the quantity to be rooted within the algorithm. – J. M. ain't a mathematician Oct 04 '11 at 05:59
  • 1
    One question that comes up is: random according to what distribution? Different methods of generating PSD will most likely generate them according to different distributions. – Raskolnikov Oct 04 '11 at 11:02
  • 1
    "this approach is infeasible" : Of your three generastion steps, the third is a rather a test, not a generation, it's by far the most computationally intensive and it's not the way to test (see JM comment). You should take that step out of the list to make it clear whether your performance problem is tied to it or not. – leonbloy Oct 04 '11 at 12:12
  • Over any field: $A^{T}DA$ is PSD, where $A$ is any matrix, and $D$ is a random diagonal matrix (given that the field is large enough so you can choose $n$ random elements for the diagonal; otherwise, you'll have to construct an extension). –  Dec 31 '11 at 04:57
  • Any Hermitian diagonally dominant matrix with real non-negative diagonal entries is positive semidefinite. – Sudix Jul 17 '19 at 19:15

4 Answers

7
  1. Generate a random symmetric matrix, determine eigenvalue bounds via, say, Gerschgorin, and then shift the diagonal elements by an appropriate amount determined from the Gerschgorin bound of the leftmost eigenvalue. (Methods 1 and 2 are sketched in MATLAB after this list.)

  2. Generate a diagonal matrix with random nonnegative elements from a distribution of your choice, and perform a similarity transformation with a Haar-distributed pseudorandom orthogonal matrix.

  3. Generate a diagonal matrix with random nonnegative elements from a distribution of your choice, and perform a few sweeps of the (cyclic) Jacobi algorithm, with randomly generated rotation matrices $\begin{pmatrix}c&-s\\s&c\end{pmatrix}$ (e.g., randomly generate a $c\in [-1,1]$ and calculate a corresponding $s$ through $c^2+s^2=1$).
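For concreteness, here is a rough MATLAB/Octave sketch of methods 1 and 2 (my own illustration, not code from this answer; the variable names and the `rand`/`randn` distributions are arbitrary choices):

    n = 1000;

    % Method 1: random symmetric matrix, with the diagonal shifted so that
    % the leftmost Gershgorin bound on the spectrum becomes non-negative.
    S = randn(n);  S = (S + S')/2;            % random symmetric matrix
    radii = sum(abs(S), 2) - abs(diag(S));    % Gershgorin radii
    shift = max(0, -min(diag(S) - radii));    % how far the leftmost disc dips below zero
    A1 = S + shift*eye(n);                    % every Gershgorin disc now lies in [0, Inf)

    % Method 2: random non-negative diagonal conjugated by a Haar-distributed
    % orthogonal matrix (QR of a Gaussian matrix, with a sign correction).
    d = rand(n, 1);                           % non-negative eigenvalues, any distribution
    [Q, R] = qr(randn(n));
    Q = Q*diag(sign(diag(R)));                % sign fix makes Q Haar-distributed
    A2 = Q*diag(d)*Q';
    A2 = (A2 + A2')/2;                        % symmetrize against round-off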

  • +1 Thanks a lot for your input. However, I'm not a mathematician, so it's too hard for me to grasp this without a proof. So I stick with Jay's solution as it's easy for me to understand. – chepukha Oct 04 '11 at 14:34
6

Generate some "random" vectors $\mathbf v_1,\dots, \mathbf v_m$ and "random" non-negative scalars $c_1, \dots, c_m$ and compute

$$\mathbf P=c_1 \mathbf v_1\mathbf v_1^\top+\cdots+c_m \mathbf v_m\mathbf v_m^\top$$
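A quick MATLAB/Octave illustration of this construction (my sketch, not dmuir's code; the value of $m$ and the distributions are arbitrary):

    n = 1000;  m = 50;        % with m < n the result is rank-deficient but still PSD
    P = zeros(n);
    for k = 1:m
        v = randn(n, 1);      % "random" vector
        c = rand;             % "random" non-negative scalar
        P = P + c*(v*v');     % each term c*v*v' is PSD, and a sum of PSD matrices is PSD
    end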

dmuir
  • +1 Thanks for your input dmuir. I'm not able to prove that P is PSD, but I'll let the community vote up your answer. Thank you. – chepukha Oct 04 '11 at 14:37
  • @chepukha To show that $v v^\mathsf{T}$ is PSD for any $v \in \mathbb{R}^n$, see user13838's comment above, except replace $A$ with $v$. Then note that the sum of PSD matrices is still PSD. (The link I shared is for positive definite matrices, but the same proof works for PSD matrices.) – kanso37 Aug 24 '20 at 16:02
5

Pick an inner product on $\mathbb R^m$ or on $\mathbb C^m$ and a set of vectors $v_1$, $\dots$, $v_n$ in that space, and consider the $n\times n$ matrix $A=(a_{i,j})$ with $a_{i,j}=\langle v_i,v_j\rangle$. This is called the Gramian matrix of the vectors you started with; it is always positive semidefinite, and in fact every positive semidefinite matrix is the Gramian matrix of some set of vectors.
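For example, with the standard inner product on $\mathbb R^m$ this reduces to a single matrix product (an illustrative MATLAB/Octave sketch, not part of the original answer):

    m = 20;  n = 1000;
    V = randn(m, n);   % columns of V are the vectors v_1, ..., v_n in R^m
    A = V'*V;          % a_{ij} = <v_i, v_j>; A is n-by-n, PSD, and of rank at most m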

4

Any matrix multiplied by its transpose is going to be PSD; you don't have to check it. On my computer, raw Octave without SSE takes 2 seconds to multiply a 1000x1000 matrix by itself. So it's not all that infeasible.

If you don't like that, you can always just generate a random diagonal matrix. That's sort of the trivial way, though :) What do you need the matrix for? If it's as test input to another algorithm, I'd just spend some time generating random PSD matrices using the above matrix-matrix multiplication and save the results off to disk.
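In Octave/MATLAB terms, that amounts to something like the following (a sketch of my own, not the answerer's code; the file name in the save step is made up):

    n = 1000;
    A = randn(n);
    M = A'*A;                        % PSD by construction: x'*(A'*A)*x = norm(A*x)^2 >= 0
    % save('psd_1000.mat', 'M');     % optionally cache the result, as suggested above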

Jay Lemmon