
It is well known that, for a dynamical system $T$ on a metric space $(X,d)$, the variational principle connects metric entropy and topological entropy. More precisely, if $$M(X,T) := \{ \mu\ \text{Borel probability measure} : \mu= T_*\mu \} $$ is the set of invariant measures for $T$, then

$$h_{top}(T)= \sup_{\mu \in M(X,T)} h_{\mu}(T) $$

where $h_{top}(T)$ is the topological entropy and $h_{\mu}(T)$ is the metric entropy relative to $\mu$.
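
For instance, for the full shift $\sigma$ on $X=\{0,1\}^{\mathbb{Z}}$ (a standard example), one has

$$h_{top}(\sigma)=\log 2 = h_{\mu}(\sigma) \quad \text{for } \mu \text{ the } (\tfrac12,\tfrac12)\text{-Bernoulli measure},$$

so in this case the supremum is attained, and moreover by an ergodic measure.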

I have seen somewhere that, if we denote by $E(X,T) \subset M(X,T)$ the set of invariant ergodic measures for $T$, then

$$h_{top}(T)= \sup_{\mu \in E(X,T)} h_{\mu}(T) $$

My questions are: is this true, and if it is, how can one prove it?


1 Answer


It is indeed true that for the purposes of the variational principle one may take the supremum over ergodic invariant probability measures (at least when $X$ is compact and $T$ is continuous, which are the standard assumptions for the variational principle). Below I present the outline of an argument from Walters' An Introduction to Ergodic Theory (p. 190), which contains further details.


Let $X$ be compact metrizable, $T:X\to X$ be continuous. Recall that we have an ergodic decomposition

$$\Pi_\bullet: M(X;T)\to M(E(X;T))$$

where the codomain is the space of Borel probability measures on the space of $T$-invariant ergodic Borel probability measures on $X$. $\Pi_\bullet$ is determined completely by the formula

$$\forall \phi\in C^0(X;\mathbb{R}): \int_X \phi(x)\, d\mu(x)=\int_{E(X;T)}\left(\int_X \phi(x) \, d\epsilon(x)\right)\, d\,\Pi_\mu(\epsilon).$$

(Here $C^0(X;\mathbb{R})$ is the space of all continuous real valued functions on $X$.)

This formula is abbreviated as $\mu=\int_{E(X;T)} \epsilon\, d\, \Pi_\mu(\epsilon)$, or alternatively in probabilistic notation $\mathbb{E}_\mu=\mathbb{E}_{\Pi_\mu}(\epsilon\mapsto \mathbb{E}_\epsilon)$, where $\mathbb{E}_\nu$ denotes expectation with respect to the measure $\nu$.
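
As a toy illustration of the decomposition (not from the book; just to fix the formula): if $a,b\in X$ are two distinct fixed points of $T$ and $\mu=\tfrac12\delta_a+\tfrac12\delta_b$, then $\delta_a,\delta_b\in E(X;T)$ and

$$\Pi_\mu=\tfrac12\delta_{\delta_a}+\tfrac12\delta_{\delta_b}, \qquad \int_X \phi\, d\mu=\tfrac12\int_X \phi\, d\delta_a+\tfrac12\int_X \phi\, d\delta_b\quad\text{for all }\phi\in C^0(X;\mathbb{R}).$$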

Theorem: For any $\mu\in M(X;T)$:

$$\operatorname{ent}_\mu(T)= \int_{E(X;T)} \operatorname{ent}_\epsilon(T)\, d\, \Pi_\mu (\epsilon).$$

In words, the metric entropy of $T$ w/r/t $\mu$ is the average of the metric entropies of $T$ w/r/t $\epsilon$, where $\epsilon$ runs over the ergodic components of $\mu$. This theorem is also proved in the aforementioned book (p. 186); the main idea is to prove it first for entropies w/r/t finite partitions and to use the upper-semicontinuity of entropy.
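
In the simplest case, when $\Pi_\mu$ is finitely supported, the theorem reduces to the affinity of metric entropy: if $\mu=t\mu_1+(1-t)\mu_2$ with $\mu_1\neq\mu_2$ ergodic and $t\in[0,1]$, then $\Pi_\mu=t\delta_{\mu_1}+(1-t)\delta_{\mu_2}$, and the theorem reads

$$\operatorname{ent}_\mu(T)=t\operatorname{ent}_{\mu_1}(T)+(1-t)\operatorname{ent}_{\mu_2}(T).$$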


Let us denote the (extended) nonnegative number

$$\sup_{\mu\in M(X;T)} \operatorname{ent}_\mu(T) =:\operatorname{topent}(T)\in [0,\infty].$$

As you cite, by the variational principle this is indeed the topological entropy of $T$, which has a separate definition using no measure theoretical structure whatsoever (one common formulation is recalled below). So $\operatorname{topent}(T)$ is not only the top (metric) entropy, it is also the topological entropy. We do not need this fact for the corollary below, which is what your question is about.
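
(For reference, one common measure-free formulation, via $(n,\varepsilon)$-spanning sets, is

$$\operatorname{topent}(T)=\lim_{\varepsilon\to 0}\,\limsup_{n\to\infty}\,\frac{1}{n}\log r_n(\varepsilon),$$

where $r_n(\varepsilon)$ denotes the minimal cardinality of a set $F\subseteq X$ such that for every $x\in X$ there is $y\in F$ with $\max_{0\leq i<n} d(T^i x, T^i y)\leq \varepsilon$; this is also covered in Walters' book.)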

Corollary (to the theorem above):

$$\sup_{\mu\in M(X;T)} \operatorname{ent}_\mu(T)\stackrel{\tiny \text{def (for our purposes)}}{=}\operatorname{topent}(T)= \sup_{\mu\in E(X;T)} \operatorname{ent}_\mu(T).$$

Proof: Assume that $\operatorname{topent}(T)$ is finite. Let $\delta\in\mathbb{R}_{>0}$, and take a $\mu\in M(X;T)$ such that

$$\operatorname{topent}(T)\geq \operatorname{ent}_{\mu}(T)> \operatorname{topent}(T)-\delta.$$

By the ergodic decomposition of $\mu$ and the theorem above, $\operatorname{ent}_\mu(T)$ is the $\Pi_\mu$-average of $\epsilon\mapsto\operatorname{ent}_\epsilon(T)$. Since $\Pi_\mu$ is a probability measure, if we had $\operatorname{ent}_\epsilon(T)\leq\operatorname{topent}(T)-\delta$ for $\Pi_\mu$-almost every $\epsilon$, then averaging would give $\operatorname{ent}_\mu(T)\leq\operatorname{topent}(T)-\delta$, a contradiction. Hence for at least one $\epsilon\in E(X;T)$:

$$\operatorname{topent}(T)\geq \operatorname{ent}_{\epsilon}(T)> \operatorname{topent}(T)-\delta.$$

Since $\delta>0$ was arbitrary, $\sup_{\epsilon\in E(X;T)}\operatorname{ent}_\epsilon(T)\geq\operatorname{topent}(T)$; the reverse inequality is immediate from $E(X;T)\subseteq M(X;T)$.

The case $\operatorname{topent}(T)=\infty$ is similar: for any $R\in\mathbb{R}$ take $\mu\in M(X;T)$ with $\operatorname{ent}_\mu(T)>R$ and argue as above to find an ergodic $\epsilon$ with $\operatorname{ent}_\epsilon(T)>R$.

Alp Uzman