Questions tagged [markov-process]

A stochastic process satisfying the Markov property: the distribution of the future states given the value of the current state does not depend on the past states. Use this tag for general state space processes (in both discrete and continuous time); use (markov-chains) for countable state space processes.

A Markov process is a stochastic process satisfying the Markov property: the distribution of the future states given the value of the current state does not depend on the past states. This tag is used for general state space processes, both in discrete and continuous time; for countable state spaces use (markov-chains).

2574 questions
8
votes
5 answers

Have any discrete-time continuous-state Markov processes been studied?

I have seen discrete-time discrete-state Markov processes (such as random walks), continuous-time discrete-state Markov processes (such as Poisson processes), and continuous-time continuous-state Markov processes (such as Brownian motions). I was…
Tim
  • 47,382
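Such processes are indeed standard; an AR(1) autoregression is a textbook discrete-time, continuous-state Markov process. A minimal simulation sketch (the function name and parameters are illustrative, not from the question):

```python
import random

def simulate_ar1(n_steps, a=0.5, sigma=1.0, x0=0.0, seed=0):
    """Simulate an AR(1) process X_{n+1} = a*X_n + eps_n with eps_n ~ N(0, sigma^2).

    This is a discrete-time Markov process on the continuous state space R:
    the distribution of X_{n+1} depends only on the current value X_n,
    not on the earlier history.
    """
    rng = random.Random(seed)
    x = x0
    path = [x]
    for _ in range(n_steps):
        x = a * x + rng.gauss(0.0, sigma)
        path.append(x)
    return path

path = simulate_ar1(1000)
```

For |a| < 1 the chain has a Gaussian stationary distribution, which is why AR(1) serves as the usual first example of this class.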
4
votes
0 answers

Markov operators

Transition probability functions can always be used to generate Markov operators, correct? So is it correct to say that a Markov process is a collection of Markov operators? On the other hand, are there Markov operators NOT generated by transition…
user110342
  • 41
  • 1
4
votes
1 answer

Markov chain: probability of rain

The probability of rain today given rain yesterday is $0.6$, and $0.2$ otherwise. (a) What is the probability of rain on the day after tomorrow, given that it rained today? (b) What is the mean number of days in a rainy period? (c) On what fraction of days does rain fall? (b) Really not…
user704971
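All three parts of this question reduce to elementary calculations on the $2\times 2$ transition matrix; a quick sketch under the question's stated probabilities (state labels and variable names are my own):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# State 0 = rain, state 1 = no rain; rows index "today", columns "tomorrow".
P = [[0.6, 0.4],
     [0.2, 0.8]]

# (a) Two-step transition: rain today -> rain on the day after tomorrow.
P2 = mat_mul(P, P)
p_rain_in_two_days = P2[0][0]          # 0.6*0.6 + 0.4*0.2 = 0.44

# (b) A rainy period is geometric with "exit" probability 1 - 0.6 = 0.4,
#     so its mean length is 1 / 0.4 = 2.5 days.
mean_rainy_period = 1 / (1 - P[0][0])

# (c) The long-run fraction of rainy days is the stationary probability
#     of state 0, solving pi = pi P: pi_rain = 0.2 / (0.4 + 0.2) = 1/3.
pi_rain = P[1][0] / (P[0][1] + P[1][0])
```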
4
votes
3 answers

How to understand explosion in continuous-time Markov chains?

It is well known that for example in a pure-birth process, explosion occurs when $$ \sum_{n=1}^\infty \lambda_n^{-1} < \infty $$ where $\lambda_n$ is the birth-rate of a new individual when the population size is $n$. For instance, explosion…
p-value
  • 474
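The convergence criterion in the question can be illustrated numerically: $\lambda_n = n^2$ gives a convergent sum (explosion in finite expected time), while the Yule process $\lambda_n = n$ gives a divergent sum (no explosion). A small sketch, with illustrative names:

```python
import math

def partial_sum(rate, n_terms):
    """Partial sum of sum_{n>=1} 1/rate(n); a finite limit signals explosion."""
    return sum(1.0 / rate(n) for n in range(1, n_terms + 1))

# lambda_n = n^2: sum 1/n^2 -> pi^2/6 < infinity, so the pure-birth
# process explodes (infinitely many births in finite time, a.s.).
quadratic = partial_sum(lambda n: n * n, 100_000)

# lambda_n = n (Yule process): sum 1/n diverges like log(n), so the
# population stays finite for all time.
linear = partial_sum(lambda n: n, 100_000)
```

Intuitively, $1/\lambda_n$ is the expected holding time at population $n$, so the sum is the expected total time to reach infinity.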
3
votes
1 answer

For a generator $G$ of a Markov process in continuous time and finite state space, how would one prove that the entries of $e^{tG}$ are non-negative?

I have a generator matrix $G$ for a Markov chain in continuous time with finite state space, and I am looking to prove that the entries of $e^{tG}$ are $\geq 0$. By definition $G = P'(0)$, with entries $g_{ij} = p_{ij}'(0)$, where $p_{ij}(t) = Prob[X_j = t |…
Siz
  • 33
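One standard argument, sketched: shift the generator so its entries become non-negative, then expand the exponential as a series.

```latex
% Choose \lambda \ge \max_i |g_{ii}| and set A = G + \lambda I.
% A generator satisfies g_{ij} \ge 0 for i \ne j, so every entry of A is \ge 0.
\[
  e^{tG} \;=\; e^{-\lambda t}\, e^{tA}
         \;=\; e^{-\lambda t} \sum_{n=0}^{\infty} \frac{t^{n} A^{n}}{n!},
  \qquad t \ge 0.
\]
% Each power A^n has non-negative entries, hence so does the series,
% and the positive scalar e^{-\lambda t} preserves the sign.
```

The two exponentials factor because $G$ and $\lambda I$ commute.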
3
votes
1 answer

Question on the Markov property for Itô diffusions: does it imply time homogeneity as well?

Let $(X_t)$ be a diffusion process. I denote by $\mathbb P^x$ the measure $$\mathbb P(C(t_1,...,t_n,A_1,...,A_n)\mid X_0=x),$$ for all cylinders $$C(t_1,...,t_n,A_1,...,A_n)=\{X_{t_1}\in A_1,...,X_{t_n}\in A_n\}.$$ So, I know that $(X_t)$ has the Markov…
kola
  • 174
3
votes
0 answers

Conditional probability, a question concerning Kipnis and Cocozza paper from 1977

In the paper Existence de processus Markoviens pour des systèmes infinis de particules by Cocozza, C. and Kipnis, C. (Ann. Inst. H. Poincaré, Sect. B, 13, 239-257, 1977), one reads: A transition probability from $H_1$ to $H_2$ is a function $P: H_1…
3
votes
0 answers

Why is this semigroup not strongly continuous on $C(\mathcal{E})$?

In the article of Liggett and Spitzer, "Ergodic Theorems for Coupled Random Walks and Other Systems with Locally Interacting Components" (Z. Wahrscheinlichkeitstheorie, 1981), on page 445 one reads: I don't see why $S(t)$ is not a strongly continuous…
3
votes
1 answer

Random variables in non-separable spaces (a question from Ethier & Kurtz 1986, p. 128)

On page 128 of Ethier and Kurtz (1986, Markov Processes: Characterization and Convergence) one reads: What is the converse here? A stochastic process with sample paths in $D_{E}[0,\infty)$ is a $D_E[0,\infty)$-valued random variable. If so…
3
votes
1 answer

Stationary distribution of a birth and death process

I'm supposed to determine the stationary distribution, when it exists, for a birth and death process having constant parameters $\lambda_n=\lambda$ for $n=0,1,2,...$ and $\mu_n=\mu$ for $n=1,2,...$ My attempt: This looks like a steady-state…
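For constant rates this is the M/M/1-type chain: detailed balance $\pi_n \lambda = \pi_{n+1}\mu$ gives a geometric stationary distribution $\pi_n = (1-\rho)\rho^n$ with $\rho = \lambda/\mu$, which exists only when $\lambda < \mu$. A small sketch of that answer (function name is my own):

```python
def stationary_birth_death(lam, mu, n_states=50):
    """Stationary distribution of a birth-death chain with constant rates.

    Detailed balance pi_n * lam = pi_{n+1} * mu gives pi_n = (1 - rho) * rho**n
    with rho = lam / mu, valid when lam < mu; otherwise the chain is not
    positive recurrent and no stationary distribution exists.
    Returns the first n_states probabilities.
    """
    rho = lam / mu
    if rho >= 1:
        raise ValueError("stationary distribution exists only for lam < mu")
    return [(1 - rho) * rho ** n for n in range(n_states)]

pi = stationary_birth_death(lam=1.0, mu=2.0)
```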
2
votes
0 answers

Advanced reference in Markov processes

I am interested in a book that covers the more in-depth material on continuous-time Markov processes, e.g. semigroups, generators... Preferably such a book would also contain a list of the analysis results needed (in the appendix or as a stand…
Lost1
  • 7,895
2
votes
2 answers

Expected number of jumps of a Markov chain

Let $X = (X_t)_{t\geq 0}$ be a Markov Chain with states $s_1$ and $s_2$. Suppose $X_0=s_1$. The times $X$ stays in $s_1$ before jumping to $s_2$ are independent and exponentially distributed with parameter $\mu_1$. Likewise, the times $X$ stays in…
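By renewal theory, one full cycle $s_1 \to s_2 \to s_1$ has mean length $1/\mu_1 + 1/\mu_2$ and contains two jumps, so the expected number of jumps in $[0,T]$ grows like $2T/(1/\mu_1 + 1/\mu_2)$. A simulation sketch of this two-state chain (names and seeding are my own):

```python
import random

def count_jumps(T, mu1, mu2, seed=0):
    """Count the jumps of the two-state CTMC on [0, T].

    Holding times in s1 are Exp(mu1) and in s2 are Exp(mu2); we draw
    holding times until the accumulated time exceeds T.
    """
    rng = random.Random(seed)
    t, state, jumps = 0.0, 1, 0
    while True:
        rate = mu1 if state == 1 else mu2
        t += rng.expovariate(rate)
        if t > T:
            return jumps
        jumps += 1
        state = 2 if state == 1 else 1

n_jumps = count_jumps(T=100.0, mu1=1.0, mu2=2.0)
```

For $T = 100$, $\mu_1 = 1$, $\mu_2 = 2$ the renewal approximation predicts about $2 \cdot 100 / 1.5 \approx 133$ jumps.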
2
votes
1 answer

Markov Process with Stationary Distribution

I have the following problem: I have a Markov process with a stationary distribution, whose state space is the integers. I also know that $P_{i,j}>0$ for all $i$ and $j$. It is also given that $X_0=0$. The question is how to show that…
Solver
  • 173
2
votes
0 answers

Transition operator in MDP

This is a screenshot taken from a lecture about reinforcement learning: Why is the equation marked in green true? I can see (from the law of total probability) that $$\mu_{t+1,i} = \sum_{j,k}p(s_{t+1}=i|s_t=j,a_t=k)p(s_t=j,a_t=k)$$ but to get…
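The remaining step is the factorization $p(s_t=j, a_t=k) = \pi(a_t=k \mid s_t=j)\,\mu_{t,j}$, which holds when actions are drawn from a policy that depends only on the current state. A toy numerical check on a hypothetical 2-state, 2-action MDP (all numbers are illustrative, not from the lecture):

```python
# mu_{t+1, i} = sum_{j,k} p(s'=i | s=j, a=k) * pi(a=k | s=j) * mu_{t, j},
# i.e. the law of total probability with p(s_t=j, a_t=k) = pi(k|j) * mu_{t,j}.

# p[j][k][i] = p(s_{t+1} = i | s_t = j, a_t = k)
p = [[[0.9, 0.1], [0.2, 0.8]],
     [[0.5, 0.5], [0.3, 0.7]]]
# pi[j][k] = probability the policy picks action k in state j
pi = [[0.4, 0.6], [1.0, 0.0]]
mu = [0.7, 0.3]   # current state distribution mu_t

mu_next = [sum(p[j][k][i] * pi[j][k] * mu[j]
               for j in range(2) for k in range(2))
           for i in range(2)]
```

Because each row of `p` and each row of `pi` sums to one, `mu_next` is again a probability distribution.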
2
votes
1 answer

Problem with Markov decision process in Reinforcement Learning

I don't know if I understand this correctly. The base situation is explained in the following image: notation: $s$ = state, $a$ = action, $A$ = set of actions, $S$ = set of states, $s'$ = next state, $P_{ss'}^a$ = matrix of transition probabilities. $$V_{\pi}(s) =…
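The truncated formula is the Bellman expectation equation for $V_\pi$, which can be solved by fixed-point iteration. A minimal policy-evaluation sketch on a toy MDP (the function, the MDP, and all numbers are my own illustration, not the lecture's):

```python
def policy_evaluation(P, R, pi, gamma=0.9, tol=1e-10):
    """Iterate V(s) <- sum_a pi[s][a] * sum_s' P[s][a][s'] * (R[s][a] + gamma * V(s')).

    P[s][a][s'] are transition probabilities, R[s][a] expected rewards,
    pi[s][a] the policy. A plain fixed-point iteration, not an optimized solver.
    """
    n = len(P)
    V = [0.0] * n
    while True:
        V_new = [sum(pi[s][a] * sum(P[s][a][s2] * (R[s][a] + gamma * V[s2])
                                    for s2 in range(n))
                     for a in range(len(pi[s])))
                 for s in range(n)]
        if max(abs(V_new[s] - V[s]) for s in range(n)) < tol:
            return V_new
        V = V_new

# Toy 2-state, 1-action MDP: reward 1 in state 0, reward 0 in state 1,
# both states move to a uniform random next state.
P = [[[0.5, 0.5]], [[0.5, 0.5]]]
R = [[1.0], [0.0]]
pi = [[1.0], [1.0]]
V = policy_evaluation(P, R, pi)
```

For this toy chain the equation can be solved by hand: $V(0) = 5.5$ and $V(1) = 4.5$, which the iteration reproduces.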
1
2 3 4 5