
What's the difference between a stationary and an invariant distribution of a Markov chain?

If the stationary distribution $\pi$ is defined as

$$\pi=\pi P$$

for a transition matrix $P$, then by definition $\pi$ is invariant. So what's the difference?

mavavilj
  • Are you getting the definitions of these from some sort of source that uses both terms? If so, what definitions does it give? – Misha Lavrov Feb 22 '18 at 05:36

1 Answer


Usually, these are just terms used by different people; some will call a vector $\pi$ with $\pi P = \pi$ and $\sum_i \pi_i = 1$ a stationary distribution, others will call it an invariant distribution.
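
As a concrete check (a minimal sketch with a made-up $2 \times 2$ transition matrix, not taken from the question), such a $\pi$ can be computed numerically as a left eigenvector of $P$ for eigenvalue $1$, normalized to sum to $1$:

```python
import numpy as np

# Made-up 2x2 transition matrix (rows sum to 1) -- purely illustrative.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# pi P = pi means pi is a left eigenvector of P with eigenvalue 1,
# i.e. an ordinary eigenvector of P transposed.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.isclose(eigvals, 1.0)].ravel())
pi = pi / pi.sum()             # normalize so that sum_i pi_i = 1

print(pi)                      # [0.83333333 0.16666667]
print(np.allclose(pi @ P, pi)) # True: pi is stationary/invariant
```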

However, there are some closely related concepts that are different:

  • An invariant measure (or maybe stationary measure) is sometimes a vector $\pi$ that satisfies $\pi P = \pi$, but not necessarily $\sum_i \pi_i = 1$. (This makes a difference for infinite Markov chains, where we can't necessarily divide by $\sum_i \pi_i$ to normalize.) But some sources like Wikipedia use this synonymously with a stationary distribution.
  • A limiting distribution is a stationary distribution $\pi$ with the property that for any distribution $\rho$, $\lim_{n \to \infty} \rho P^n = \pi$: after taking lots of steps starting from $\rho$, we converge to the distribution $\pi$. A limiting distribution does not necessarily exist for every Markov chain, and when the limit does exist it may depend on $\rho$.
  • A time-average distribution $\pi$ is defined by letting $\pi_i$ be the average fraction of time spent in state $i$ over $n$ steps, in the limit as $n \to \infty$. This can depend on the initial state. It is equal to the limiting distribution when that exists, but the time-average distribution exists in slightly more general cases (see the sketch after this list).
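
To make the last two bullets concrete, here is a minimal sketch (a made-up two-state chain of period $2$, not an example from the original answer) in which no limiting distribution exists while the time-average and stationary/invariant distributions do:

```python
import numpy as np

# Made-up periodic chain: it deterministically alternates between
# its two states, so it has period 2.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

rho = np.array([1.0, 0.0])                  # start in state 0

# rho P^n oscillates and never converges: no limiting distribution.
print(rho @ np.linalg.matrix_power(P, 10))  # [1. 0.]
print(rho @ np.linalg.matrix_power(P, 11))  # [0. 1.]

# The time-average distribution still exists: the chain spends half
# its time in each state, matching the stationary distribution (1/2, 1/2).
n = 10_000
avg = sum(rho @ np.linalg.matrix_power(P, k) for k in range(n)) / n
print(avg)                                                # [0.5 0.5]
print(np.allclose(np.array([0.5, 0.5]) @ P, [0.5, 0.5]))  # True
```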
Misha Lavrov
  • A limiting distribution is also often referred to as an equilibrium distribution – jII Apr 28 '22 at 03:41
  • @jII I imagine it is the stationary or invariant distribution that's referred to as the equilibrium distribution (though that's splitting hairs). An equilibrium is a state that you stay in, if you are in it. You do not necessarily approach an equilibrium. (Not everyone may agree, of course.) – Misha Lavrov Apr 28 '22 at 04:52
  • I agree that the terminology varies. For example, Tierney 1994 has: "The invariant distribution $\pi$ is an equilibrium distribution for the chain if for $\pi$-almost all $x$, $\lim_{n \to \infty} P^n(x,A) = \pi(A)$" ([Tierney 1994] Ann. Statist. 22(4): 1701–1728, December 1994, DOI: 10.1214/aos/1176325750). – jII Apr 28 '22 at 05:13
  • That's a very interesting definition because of the $\pi$-almost-all. If I were defining "limiting distribution", I would say "for any starting state" or "for every starting state" but this is definitely not the same as the second and might not be the same as the first, either. – Misha Lavrov Apr 28 '22 at 15:32