A centered Gaussian process is Markov if and only if its covariance function $\Gamma: \mathbb{R}\times\mathbb{R} \to \mathbb{R}$ satisfies the equality:
$$\Gamma(s,u)\Gamma(t,t)=\Gamma(s,t)\Gamma(t,u)\ \ \ \ (1)$$
for all $s<t<u$.
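As a sanity check (my own illustration, not from the question), identity (1) can be tested numerically on two standard kernels: Brownian motion, $\Gamma(s,t)=\min(s,t)$, which is Markov, and $X_t = A + tB$ with $A,B$ independent standard normals, $\Gamma(s,t)=1+st$, which is Gaussian but not Markov:

```python
# Numeric sanity check of identity (1) on two standard covariance kernels.
# The kernel choices are illustrative examples, not taken from the question.

def satisfies_identity(gamma, triples, tol=1e-12):
    """Check Gamma(s,u)*Gamma(t,t) == Gamma(s,t)*Gamma(t,u) on given s<t<u."""
    return all(
        abs(gamma(s, u) * gamma(t, t) - gamma(s, t) * gamma(t, u)) < tol
        for (s, t, u) in triples
    )

triples = [(0.5, 1.0, 2.0), (0.1, 0.7, 3.0), (1.0, 1.5, 1.6)]

# Brownian motion: Gamma(s,t) = min(s,t); the process is Markov.
bm = lambda s, t: min(s, t)

# X_t = A + t*B with A, B independent N(0,1): Gamma(s,t) = 1 + s*t.
# Observing X at two past times pins down (A, B), so this is not Markov.
line = lambda s, t: 1 + s * t

print(satisfies_identity(bm, triples))    # expect True
print(satisfies_identity(line, triples))  # expect False
```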
My Question: How can you prove this?
It turns out to be difficult to find a clear proof of this fact. The one proof I did find, I did not understand well: Example 4.5, p. 119 (or Example 4.7, pp. 120-121) of Random Processes for Engineers by Bruce Hajek. The notation used in that book made very little sense to me.
Background: Inspired by my answer to this question. I had to try to prove the statement for a homework assignment, but was not very successful. A solution was explained in class, but it relied on the rather implausible
Claim: A function $h: \mathbb{R}\times\mathbb{R} \to \mathbb{R}$ satisfies equation (1) if and only if it is of the form $$h(s,t)=f(\min(s,t))\,g(\max(s,t))$$ for some two functions $f,g:\mathbb{R} \to \mathbb{R}$.
The sufficiency direction of the claim is clear to me, but the necessity is not, nor was it proven in class. Moreover, I did not understand the subsequent proof anyway, which was supposed to show that the desired result follows from this claim.
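For what it's worth, sufficiency is easy to verify when the claim is read in the factorized form $h(s,t)=f(\min(s,t))\,g(\max(s,t))$: for $s<t<u$, both sides of (1) reduce to $f(s)g(u)f(t)g(t)$. A quick numeric check with arbitrary choices of $f$ and $g$ (my choices, purely illustrative):

```python
import math
import random

# Sufficiency check: h(s,t) = f(min(s,t)) * g(max(s,t)) satisfies
#   h(s,u)h(t,t) = h(s,t)h(t,u) for all s < t < u,
# because both sides reduce to f(s)g(u)f(t)g(t).
f = lambda x: math.exp(x)          # arbitrary illustrative choice
g = lambda x: 1.0 / (1.0 + x * x)  # arbitrary illustrative choice

def h(s, t):
    return f(min(s, t)) * g(max(s, t))

random.seed(0)
for _ in range(1000):
    s, t, u = sorted(random.uniform(-5, 5) for _ in range(3))
    lhs = h(s, u) * h(t, t)
    rhs = h(s, t) * h(t, u)
    assert math.isclose(lhs, rhs, rel_tol=1e-9, abs_tol=1e-12)
print("identity (1) holds on all sampled triples")
```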
What I Have Tried: (Copy-pasted from the TeX file from my homework "solution", which I already handed in and got a grade for)
Let $X_t$ be a centered Gaussian process. Specifically, $\forall t$, $\mathbb{E}(X_t)=0$ and $X_t \sim \mathscr{N}(0,\sigma_t^2)$, where $\sigma_t^2=\Gamma(t,t)$.
Now assume ab initio that $\Gamma(s,u)\Gamma(t,t)=\Gamma(s,t)\Gamma(t,u)$ for all $s<t<u$, without assuming that $X_t$ is Markov.
Then, provided $\Gamma(t,t)\neq 0$, $\Gamma(s,u)=\frac{\Gamma(s,t)\Gamma(t,u)}{\Gamma(t,t)}$.
Because the process is centered Gaussian, this holds if and only if $X_u$ and $X_s$ are conditionally independent given $X_t$ (see Example 4.5, p. 119, or Example 4.7, pp. 120-121, of Random Processes for Engineers by Bruce Hajek).
This equivalence holds because, for jointly Gaussian variables, the conditional covariance is $$\operatorname{Cov}(X_s,X_u \mid X_t)=\Gamma(s,u)-\frac{\Gamma(s,t)\Gamma(t,u)}{\Gamma(t,t)},$$ so the above equation says precisely that $$\mathbb{E}\big[(X_s-\mathbb{E}[X_s \mid X_t])(X_u-\mathbb{E}[X_u \mid X_t])\big]=0,$$ i.e. the residuals of $X_s$ and $X_u$ after conditioning on $X_t$ are uncorrelated; since everything is jointly Gaussian, uncorrelated implies conditionally independent.
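The step from the covariance identity to uncorrelated residuals can be checked by simulation. A sketch for Brownian motion, $\Gamma(s,t)=\min(s,t)$ (my own example, using the standard conditioning formula $\mathbb{E}[X_s \mid X_t]=\frac{\Gamma(s,t)}{\Gamma(t,t)}X_t$ for centered jointly Gaussian variables):

```python
import numpy as np

# For jointly Gaussian (X_s, X_t, X_u):
#   Cov(X_s, X_u | X_t) = Gamma(s,u) - Gamma(s,t)Gamma(t,u)/Gamma(t,t),
# so identity (1) says exactly that this conditional covariance is zero.
# Monte Carlo check for Brownian motion, Gamma(s,t) = min(s,t): the
# residuals X_s - E[X_s|X_t] and X_u - E[X_u|X_t] should be uncorrelated.
rng = np.random.default_rng(0)
s, t, u = 0.5, 1.0, 2.0
cov = np.array([[min(a, b) for b in (s, t, u)] for a in (s, t, u)])
X = rng.multivariate_normal(np.zeros(3), cov, size=200_000)
Xs, Xt, Xu = X[:, 0], X[:, 1], X[:, 2]

# E[X_. | X_t] = (Gamma(., t) / Gamma(t, t)) * X_t for centered Gaussians.
res_s = Xs - (min(s, t) / t) * Xt
res_u = Xu - (min(t, u) / t) * Xt
print(np.corrcoef(res_s, res_u)[0, 1])  # should be close to 0
```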
This is equivalent to the Markov property (past and future are independent given the present).