
Is there a way to sample from a probability distribution on the space of positive definite $3\times3$ matrices with some constraints? I'm looking for any examples of such schemes.

In particular, I'd be interested in looking at matrices with $\alpha_1 < \det(M) < \alpha_2$ and $ \beta_1 < \text{tr}(M) < \beta_2 $, where $\alpha_i,\beta_i >0$. I am aware these may not be sufficient to bound the space.

It would be nice to have not only a uniform distribution on such a space, but also a Gaussian-like distribution (with some matrix $M$ as the mean, so that one could sample around it).

But any thoughts or pointers to literature on the topic would be welcome. I suspect it might be easier if $M$ were symmetric.


Initial ideas: since $ \text{tr}(M) = \sum_i\lambda_i $ and $\det(M)= \prod_i\lambda_i$, I was thinking of drawing random eigenvalues such that the constraints are satisfied (a third condition may be needed to pin down all three eigenvalues). Then I would sample an orthogonal matrix $P$ and take $P\,\text{diag}(\lambda)P^{-1}$ (not sure this is the right thing to do; since $P$ is orthogonal this equals $P\,\text{diag}(\lambda)P^T$, which is symmetric). However, I have no idea how "uniform" such a distribution would be on the space of PD matrices, nor does this seem very efficient, and it is not clear whether it covers the whole space.
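
To make this concrete, here is a rough sketch of that idea in Python/NumPy (the exponential proposal for the eigenvalues is an arbitrary choice, and nothing here is claimed to be uniform on the constrained set):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_eig_then_rotate(a1, a2, b1, b2, max_tries=10_000):
    """Draw eigenvalues until a1 < prod(lam) < a2 and b1 < sum(lam) < b2,
    then conjugate by a Haar-random orthogonal matrix."""
    for _ in range(max_tries):
        # Arbitrary positive proposal for the eigenvalues (not uniform!).
        lam = rng.exponential(scale=(b1 + b2) / 6.0, size=3)
        if a1 < np.prod(lam) < a2 and b1 < lam.sum() < b2:
            # Haar-random orthogonal matrix via QR of a Gaussian matrix,
            # with the usual sign fix on the diagonal of R.
            Q, R = np.linalg.qr(rng.standard_normal((3, 3)))
            Q = Q * np.sign(np.diag(R))
            # P diag(lam) P^{-1} = P diag(lam) P^T since P is orthogonal,
            # so the result is symmetric PD with the prescribed spectrum.
            return Q @ np.diag(lam) @ Q.T
    raise RuntimeError("no eigenvalue triple found; widen the bounds")

M = sample_eig_then_rotate(0.5, 2.0, 3.0, 4.0)
print(np.linalg.det(M), np.trace(M))  # both lie inside the requested bounds
```

This only produces symmetric matrices with the prescribed spectrum; what measure it induces on the constrained set is exactly the part I am unsure about.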


Edit 1: the "Related" questions bar suggested this question, which is similar but restricted to symmetric matrices. It links to the Wishart distribution, which led me to the matrix Gamma distribution and inverse matrix Gamma distribution. It would be nice to find something allowing sampling from asymmetric PD matrices.

On the other hand, a (not necessarily symmetric) matrix $M$ is PD iff its symmetrized version $M+M^T$ is PD (e.g. here). Maybe this can come in handy? E.g., one could generate an SPD matrix with specified eigenvalues and then perturb it in a principled way.
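
One principled perturbation along these lines: add a random skew-symmetric matrix $A$ to an SPD matrix $S$. The symmetric part of $S+A$ is just $S$, so $x^T(S+A)x = x^TSx > 0$ for all $x\neq 0$, and the trace is unchanged because $A$ has zero diagonal (the determinant does change, so that constraint would need re-checking). A minimal sketch, with an arbitrary Gaussian scale for the skew part:

```python
import numpy as np

rng = np.random.default_rng(1)

def skew_perturb(S, scale=0.1):
    """Add a random skew-symmetric matrix to an SPD matrix S.
    The symmetric part of the result is S itself, so the quadratic
    form stays positive definite, and the trace is preserved because
    skew-symmetric matrices have zero diagonal."""
    Z = scale * rng.standard_normal(S.shape)
    return S + (Z - Z.T)   # Z - Z.T is skew-symmetric
```

Combined with the eigenvalue construction above, this gives non-symmetric PD matrices with the prescribed trace.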

user3658307
  • Here's a demotivating example: The space of $2\times2$ s.p.d. matrices $M=\begin{bmatrix}a+b&c\\c&a-b\end{bmatrix}$ with $\frac12\le\det M\le2$ and $3\le\operatorname{tr}M\le4$ looks like this: https://i.stack.imgur.com/lIEl3.png, so sampling it uniformly may not be easy. And you can forget about a Gaussian distribution: the mean of the set is not even in the set. – Rahul Jun 03 '17 at 04:56
  • @Rahul That's really cool, thank you. It's only a little demotivating :) When I said Gaussian distribution I really meant a distribution "centered around" a given matrix $M$, which would be in the set (obviously this would have issues near the boundary...). Is there not a pleasant analytic expression for that cone-like set, though? – user3658307 Jun 03 '17 at 05:47
  • Sure, the volume is determined by the inequalities and the facts that $\det M=a^2-b^2-c^2$ and $\operatorname{tr}M=2a$. But again, I don't know how to sample from this set uniformly, except maybe by rejection sampling. – Rahul Jun 03 '17 at 06:21
  • If you're just interested in doing this in practice, you can sample over all matrices (say, a Gaussian for each coefficient), check whether the draw meets your constraints, and if not, throw it out and repeat. Not guaranteed to halt in any fixed amount of time, but it should sample like a Gaussian intersected with the space (see the sketch after the comments). – Artimis Fowl Jun 03 '17 at 07:02
  • @ArtimisFowl Thanks for the suggestion! It's not really clear to me how distributions on the components map to distributions on the space, but that certainly seems reasonable. The main issue is computational efficiency, since I am indeed interested in doing it in practice, so I was hoping to avoid rejection sampling :) – user3658307 Jun 03 '17 at 12:11
  • @Rahul Yeah, I was thinking there might be a more geometric representation (e.g., in some other coordinate system it might be clearer how to sample it). – user3658307 Jun 03 '17 at 12:24
  • I don't know what this would do to the distribution, but you could also sample all but a $2 \times 2$ minor of the matrix, and then assign values to that region so that the matrix satisfies your constraints. You should be able to meet both conditions, and any further degrees of freedom could also be randomly selected. The only trouble would be a $0$ eigenvalue, but that happens with absurdly low probability, IIRC. – Artimis Fowl Jun 03 '17 at 21:48
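
A minimal sketch of the entrywise-Gaussian rejection idea from the comments (proposal centered at a given matrix $M_0$; the noise scale and the bound values are placeholders):

```python
import numpy as np

rng = np.random.default_rng(2)

def rejection_sample(M0, sigma=0.2, a1=0.5, a2=2.0, b1=3.0, b2=4.0, max_tries=100_000):
    """Propose M0 + Gaussian noise on every entry, rejecting until the
    draw is PD (symmetric part has positive eigenvalues) and the
    det/trace bounds hold.  Handles non-symmetric matrices, but the
    acceptance rate can be tiny when the bounds are tight."""
    for _ in range(max_tries):
        X = M0 + sigma * rng.standard_normal((3, 3))
        if ((np.linalg.eigvalsh(X + X.T) > 0).all()
                and a1 < np.linalg.det(X) < a2
                and b1 < np.trace(X) < b2):
            return X
    raise RuntimeError("no accepted sample; loosen the bounds or shrink sigma")
```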

0 Answers