
A Gaussian Process (GP) is a collection of random variables $X_t : \Omega \to R_t$ over an index set $T$ such that for every finite collection $t_1, \dots, t_n \in T$, the vector $(X_{t_1}, \dots, X_{t_n})$ has a multivariate Gaussian distribution. We can associate with it the mean function $\mu : T \to \mathbb{R}$ and the covariance function $C : T \times T \to \mathbb{R}$ given by $$\mu(t) = \mathbb{E}[X_t] ~~\text{ and }~~ C(t,r) = \operatorname{Cov}(X_t, X_r).$$
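To make the finite-dimensional picture concrete: given $\mu$ and $C$, any finite collection of indices $t_1, \dots, t_n$ determines a multivariate Gaussian one can sample from. Here is a minimal numerical sketch in Python/NumPy; the zero mean and squared-exponential covariance are illustrative choices of mine, not part of the question:

```python
import numpy as np

def finite_marginal_sample(mu, C, ts, n_samples, rng=None):
    """Sample from the finite-dimensional marginal (X_{t_1}, ..., X_{t_n})
    of a GP with mean function mu and covariance function C."""
    rng = np.random.default_rng(rng)
    ts = np.asarray(ts, dtype=float)
    mean = np.array([mu(t) for t in ts])                  # mean vector (mu(t_i))_i
    cov = np.array([[C(s, t) for t in ts] for s in ts])   # Gram matrix (C(t_i, t_j))_ij
    return rng.multivariate_normal(mean, cov, size=n_samples)

# Illustrative choices: zero mean, squared-exponential covariance.
mu = lambda t: 0.0
C = lambda s, t: np.exp(-0.5 * (s - t) ** 2)

samples = finite_marginal_sample(mu, C, ts=[0.0, 0.5, 1.0], n_samples=5000, rng=0)
print(samples.shape)      # (5000, 3)
print(np.cov(samples.T))  # empirical covariance, close to the 3x3 Gram matrix
```

The empirical covariance of the samples recovers the matrix $(C(t_i, t_j))_{ij}$ up to Monte Carlo error, which is exactly the "finite marginal" the question is about.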

The question I have is: Why is the Gaussian Process uniquely determined by (the values of) the functions $\mu$ and $C$?

Before you scream 'Duplicate', let me say that I know this question has been asked many times already, for example here and here. However, the people who asked it accepted answers that were quite incomplete. My question is not why a finite collection of random variables $(X_1, \dots, X_n)$ with a multivariate Gaussian distribution is uniquely determined by its mean $\mu$ and covariance matrix $\Sigma$, but rather why the whole Gaussian Process is uniquely determined by its finite subsets. I even happen to think that the answer to the question

Is a Gaussian Process uniquely determined by its mean and covariance function?

is either 'no' or 'the question is not precise'. For what does it mean for two Gaussian Processes $X = (X_t)_{t}$ and $Y = (Y_t)_{t}$ to be 'equal', or at least 'equal almost everywhere'? It means that the measures induced by the total random variables $X = (X_t)_t$ and $Y$ on the space $\prod_t R_t$ (a possibly uncountably infinite product) are equal.

Question: Can somebody explain why the total induced measures are uniquely determined by their finite-dimensional marginals?

  • Oops... I think I was wrong... Seems that the answer to the much stronger question below is 'yes' as well due to a thing called Kolmogorov Extension Theorem (see https://math.stackexchange.com/questions/1465414/is-a-markov-process-uniquely-determined)... Still a little unsure – Fabian Werner Jul 12 '17 at 07:50
  • Not every Gaussian process is a Markov process, but the answer is still "Kolmogorov's extension theorem", which basically says that the law of any jointly measurable collection of random variables is uniquely determined by the finite-dimensional marginals. – Jason Jul 12 '17 at 17:44
  • Also, we're not talking about equality almost everywhere - we're talking about equality in distribution. For example, you could easily define two different Wiener processes $W_1$ and $W_2$ on the same space, and you can even make them independent. They have the same distribution, in the sense that the random functions $W_i:t\mapsto W_i(t)$ satisfy $\mathbb P(W_1\in A)=\mathbb P(W_2\in A)$ for any measurable subset $A$ of a suitable function space. But they do not have to be equal almost everywhere. In fact, if $W_1$ and $W_2$ are independent, then $\mathbb P(W_1(t)=W_2(t))=0$ for all $t$. – Jason Jul 12 '17 at 17:47
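The distinction in the last comment can be seen in simulation: two independent Brownian motions have the same finite-dimensional distributions, yet at any fixed $t$ they are almost surely unequal. A sketch, assuming the standard construction of Wiener paths as cumulative sums of i.i.d. $N(0, \Delta t)$ increments (step size and path count are arbitrary choices here):

```python
import numpy as np

def wiener_paths(rng, n_paths, n_steps, dt):
    """Simulate Wiener paths on a grid via cumulative sums of iid N(0, dt) increments."""
    increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    return np.cumsum(increments, axis=1)

rng = np.random.default_rng(42)
n_paths, n_steps, dt = 2000, 1000, 0.001

W1 = wiener_paths(rng, n_paths, n_steps, dt)
W2 = wiener_paths(rng, n_paths, n_steps, dt)  # independent of W1

# Equal in distribution: at time t = 1, both marginals are N(0, 1)...
print(W1[:, -1].var(), W2[:, -1].var())  # both close to 1
# ...but not pathwise equal: exact coincidences W1(t) = W2(t) essentially never occur.
print(np.mean(W1[:, -1] == W2[:, -1]))
```

So 'uniquely determined' in the question can only sensibly mean equality of the induced laws, not almost-everywhere equality of the processes themselves.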

0 Answers