Questions tagged [markov-chains]

Stochastic processes (with either discrete or continuous time dependence) on a discrete (finite or countably infinite) state space in which the distribution of the next state depends only on the current state. For Markov processes on continuous state spaces please use (markov-process) instead.

A Markov chain is a stochastic process on a discrete (finite or countably infinite) state space in which the distribution of the next state depends only on the current state. These objects show up in probability and computer science, in both discrete-time and continuous-time models. For Markov processes on continuous state spaces please use (markov-process) instead.

A discrete-time Markov chain is a sequence of random variables $\{X_n\}_{n\geq1}$ with the Markov property, namely that the probability of moving to the next state depends only on the present state and not on the previous states, i.e. $$\mathbb P(X_{n+1}=x\mid X_{1}=x_{1},X_{2}=x_{2},\ldots ,X_{n}=x_{n})=\mathbb P(X_{n+1}=x\mid X_{n}=x_{n}),$$ if both conditional probabilities are well defined, i.e. if $\mathbb P(X_{1}=x_{1},\ldots ,X_{n}=x_{n})>0.$
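For instance, here is a minimal Python sketch (the states and transition probabilities are invented for illustration) of how such a chain is simulated: the next state is drawn from a distribution that depends only on the current state, exactly as in the display above.

    import random

    # Hypothetical two-state chain; states and probabilities are
    # invented for this illustration.
    P = {"sunny": [("sunny", 0.9), ("rainy", 0.1)],
         "rainy": [("sunny", 0.5), ("rainy", 0.5)]}

    def step(current):
        # The draw uses only `current`, never the earlier history:
        # this is the Markov property.
        r, acc = random.random(), 0.0
        for state, p in P[current]:
            acc += p
            if r < acc:
                return state
        return state

    x, path = "sunny", ["sunny"]
    for _ in range(10):
        x = step(x)
        path.append(x)
    print(path)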

5938 questions
34
votes
3 answers

Is an ergodic Markov chain both irreducible and aperiodic, or just irreducible?

Some definitions I find say: ergodic = irreducible. And then irreducible + aperiodic + positive gives a regular Markov chain. A Markov chain is called an ergodic chain if it is possible to go from every state to every state (not necessarily in one…
colinfang
  • 807
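Whatever convention one adopts, the gap between "irreducible" and "irreducible + aperiodic" is already visible in a two-state example:
$$P=\begin{pmatrix}0&1\\1&0\end{pmatrix},\qquad P^{2n}=I,\qquad P^{2n+1}=P,$$
so the chain is irreducible (each state reaches the other in one step) but has period $2$: returns to a state can happen only at even times, and $P^n$ does not converge.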
14
votes
2 answers

What does the steady state represent to a Markov Chain?

I'm a little confused as to the interpretation of the steady state in the context of a Markov chain. I know Markov chains are memoryless, in that each state only depends on its immediate predecessor, but doesn't that mean the system is in a sort of…
user29351
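A short numerical sketch (transition matrix invented for illustration) of what the steady state is: iterating $\mu \mapsto \mu P$ from two different starting distributions gives the same limit $\pi$ with $\pi P=\pi$; this is the long-run proportion of time spent in each state, not a state in which the chain stops moving.

    import numpy as np

    # Invented 3-state transition matrix (rows sum to 1).
    P = np.array([[0.5, 0.3, 0.2],
                  [0.2, 0.6, 0.2],
                  [0.1, 0.4, 0.5]])

    for mu in (np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])):
        for _ in range(200):      # iterate mu <- mu P
            mu = mu @ P
        print(mu)                 # same limit from either start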
13
votes
2 answers

Intuition behind positive recurrent and null recurrent Markov Chains

I cannot understand how there can be positive recurrent and null recurrent Markov chains. The states of a Markov chain can be split into transient and recurrent states, where recurrent means the chain will be able to return to that state sooner or later, as…
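For a worked contrast: writing $\tau_x:=\inf\{n\geq 1: X_n=x\}$ for the first return time, a recurrent state $x$ satisfies $\mathbb P_x(\tau_x<\infty)=1$; it is called positive recurrent if additionally $\mathbb E_x[\tau_x]<\infty$, and null recurrent if $\mathbb E_x[\tau_x]=\infty$. The simple symmetric random walk on $\mathbb Z$ is the standard null recurrent example: started at $0$ it returns to $0$ with probability $1$, yet the expected return time is infinite.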
12
votes
1 answer

Expected number of steps/probability in a Markov Chain?

Can anyone give an example of a Markov Chain and how to calculate the expected number of steps to reach a particular state? Or the probability of reaching a particular state after T transitions? I ask because they seem like powerful concepts to know…
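Both quantities reduce to linear algebra; a sketch with an invented three-state chain: the expected hitting times $h_i$ of a target state solve $h_{\text{target}}=0$, $h_i=1+\sum_j P_{ij}h_j$, and the probability of having reached the target within $T$ steps is read off from $P^T$ after making the target absorbing.

    import numpy as np

    # Invented chain on {0, 1, 2}; state 2 is made absorbing so that
    # P^T gives the probability of having reached it by time T.
    P = np.array([[0.5, 0.4, 0.1],
                  [0.3, 0.3, 0.4],
                  [0.0, 0.0, 1.0]])
    others = [0, 1]               # non-target states

    # (I - Q) h = 1, where Q restricts P to the non-target states.
    Q = P[np.ix_(others, others)]
    h = np.linalg.solve(np.eye(2) - Q, np.ones(2))
    print(h)                      # expected steps to state 2 from 0 and 1

    T = 5
    print(np.linalg.matrix_power(P, T)[0, 2])  # P(reach 2 within T steps)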
9
votes
2 answers

Can Markov Chain state space be continuous?

I looked for a formal definition of a Markov chain and was confused that all definitions I found restrict the chain's state space to be countable. I don't understand the purpose of such a restriction, and I have a feeling that it does not make any sense. So my…
8
votes
2 answers

Markov chain with infinitely many states

I understand that a Markov chain involves a system which can be in one of a finite number of discrete states, with a probability of going from each state to another, and of emitting a signal. Thus, an $N \times N$ transition matrix and an $N \times…
Superbest
  • 3,400
7
votes
1 answer

Markov chain $(X_n)$ has $X_n \rightarrow \infty$ a.s.

I have the following homework problem: Let $(X_n)_{n \geq 0}$ be a Markov chain on the state space $\lbrace0,1,...\rbrace$. Writing $p_i := p_{i,i+1}$ and $q_i := p_{i,i-1}$, the transition probabilities are $$p_0 = 1, \qquad \mathrm{and}\quad p_i +…
Iain
  • 107
7
votes
2 answers

What does it mean to observe a Markov Chain after a certain kind of transition?

I'm working on a problem concerning censoring of transitions in a Markov chain. For example, take a Markov chain that models a counter: it goes up or down but does not stay in place. A possible censoring could be to only observe the transitions…
Wim
  • 73
7
votes
1 answer

Markov chain capture times

Given the Markov process represented by the above diagram, where the transition probabilities $q$ and $p$ satisfy $p+q=1$, what is the probability $P_r$ of ending up in the sink state $r$, supposing that the initial state is $i$? The answer…
minmax
  • 877
7
votes
2 answers

Expected time till absorption in specific state of a Markov chain

This question is a follow-up to Expected number of steps for reaching a specific absorbing state in an absorbing Markov chain because I don't understand the answer given there. I think I need to see a concrete example. Suppose I play red and…
saulspatz
  • 53,131
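In lieu of the algebra in the linked answer, a brute-force Monte Carlo sketch (chain invented for illustration) makes the conditional quantity concrete: simulate, keep only the runs absorbed in the state of interest, and average their lengths.

    import random

    # Invented chain: transient states 0, 1; absorbing states 2, 3.
    P = [[0.4, 0.3, 0.2, 0.1],
         [0.2, 0.4, 0.1, 0.3],
         [0.0, 0.0, 1.0, 0.0],
         [0.0, 0.0, 0.0, 1.0]]
    absorbing = {2, 3}

    def run(start):
        x, steps = start, 0
        while x not in absorbing:
            x = random.choices(range(4), weights=P[x])[0]
            steps += 1
        return x, steps

    # Expected time to absorption in state 2, given absorption in 2:
    times = [s for a, s in (run(0) for _ in range(100_000)) if a == 2]
    print(sum(times) / len(times))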
6
votes
1 answer

Markov Chains - How to calculate the probability a state is visited at least N times? What about the expectation?

In Markov chains, if I were given a transition probability matrix with each of the probabilities specified, then how do I determine the following: 1- the probability that state y is visited at least n times given that you start in state x. I know that I…
Solver
  • 173
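The key structural fact (by the strong Markov property) is that successive visits to $y$ are independent trials. Writing $\rho_{xy}$ for the probability of ever reaching $y$ from $x$, and $\rho_{yy}$ for the return probability of $y$,
$$\mathbb P_x(\text{at least } n \text{ visits to } y)=\rho_{xy}\,\rho_{yy}^{\,n-1},\qquad \mathbb E_x[\#\{\text{visits to } y\}]=\frac{\rho_{xy}}{1-\rho_{yy}}\quad(\rho_{yy}<1),$$
and the reach probabilities $\rho$ themselves solve a linear system in the entries of the transition matrix.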
5
votes
1 answer

Levin & Peres 2017 Ex. 4.1: prove that $\overline{d}(t)=\sup_{\mu,\nu}\lVert\mu P^t-\nu P^t\rVert_{TV}$

Definition Let $S$ be a finite set. For any probability distributions $\mu,\nu$ on $S$, $$\lVert\mu-\nu\rVert_{TV}:=\max_{A\subset S}|\mu(A)-\nu(A)|\text{.}$$ Let $P$ be a transition kernel with state space $S$. For any positive integer…
user912011
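A quick numerical sanity check of the claimed identity (transition matrix invented; uses the standard fact that $\lVert\mu-\nu\rVert_{TV}=\tfrac12\sum_{x\in S}|\mu(x)-\nu(x)|$): random pairs $(\mu,\nu)$ never beat the best pair of point masses, whose rows of $P^t$ realize the supremum.

    import numpy as np

    rng = np.random.default_rng(0)
    P = np.array([[0.7, 0.2, 0.1],    # invented 3-state kernel
                  [0.1, 0.8, 0.1],
                  [0.3, 0.3, 0.4]])

    def tv(mu, nu):
        # ||mu - nu||_TV = (1/2) * sum_x |mu(x) - nu(x)|
        return 0.5 * np.abs(mu - nu).sum()

    t = 3
    Pt = np.linalg.matrix_power(P, t)

    # max over point masses: pairs of rows of P^t
    dbar = max(tv(Pt[i], Pt[j]) for i in range(3) for j in range(3))

    # random (mu, nu) pairs never exceed it
    rand = max(tv(rng.dirichlet(np.ones(3)) @ Pt,
                  rng.dirichlet(np.ones(3)) @ Pt)
               for _ in range(10_000))
    print(dbar, rand)                 # rand <= dbar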
5
votes
1 answer

Markov Chain on an infinite binary tree

Consider an infinite binary tree whose vertices can be represented as finite strings of 0's and 1's. Let $\emptyset$ denote the top vertex, a.k.a. the "root" of the tree. Let $0
5
votes
1 answer

What's the difference between stationary and invariant distribution of Markov chain?

What's the difference between the stationary and the invariant distribution of a Markov chain? The stationary distribution $\pi$ is defined by $$\pi=\pi P$$ for a transition matrix $P$, so by definition $\pi$ is invariant. But what's the difference…
mavavilj
  • 7,270
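In practice both words name the same computation; a sketch (matrix invented for illustration) of solving $\pi P=\pi$, $\sum_i\pi_i=1$ directly:

    import numpy as np

    # Invented transition matrix (a small birth-death chain).
    P = np.array([[0.50, 0.50, 0.00],
                  [0.25, 0.50, 0.25],
                  [0.00, 0.50, 0.50]])

    # pi P = pi with sum(pi) = 1: stack (P^T - I) with a row of ones.
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.concatenate([np.zeros(n), [1.0]])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi, pi @ P)             # pi and pi @ P agree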
5
votes
2 answers

Direct proof that the stationary distribution of an irreducible chain is unique

Exercise 1.13 in the Levin-Peres-Wilmer book, http://pages.uoregon.edu/dlevin/MARKOV/markovmixing.pdf asks for a direct proof that the stationary distribution of an irreducible chain is unique. Note that Corollary 1.17 ibid. gives a very slick proof…
Aryeh
  • 440