I'd like to mention an alternative to Prof. Hamkins' answer. His answer seems to be natural from the point of view of lattices; what I mention below can be interpreted as natural from the point of view of measure/information/entropy/ergodic theory, and it goes back to Rohlin (at least in the way I present it here).
(I won't go into detail beyond this section.)
Let $(M,\mathcal{B},\mu)$ be a probability space that is measure theoretically isomorphic to the closed unit interval with Lebesgue $\sigma$-algebra and Lebesgue measure. (Such spaces are called Lebesgue or Lebesgue-Rohlin or standard or standard Borel.)
By a $\mu$-ae partition of $(M,\mathcal{B},\mu)$ I mean an $\alpha\subseteq \mathcal{B}$ such that
- $\forall A_1,A_2\in\alpha: A_1\neq A_2\implies\mu(A_1\cap A_2)=0 $,
- $\mu\left(M\setminus \bigcup_{A\in \alpha}A\right)=0$.
(So a $\mu$-ae partition is a collection of measurable subsets which partitions $M$ up to $\mu$-negligible sets.)
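To make the two axioms concrete, here is a quick sketch (names like `is_ae_partition` are mine, purely illustrative) checking them for partitions of $[0,1]$ by finite unions of intervals, with $\mu$ the Lebesgue measure:

```python
# Toy check of the two mu-ae partition axioms for partitions of [0, 1]
# whose members are finite unions of intervals, mu = Lebesgue measure.

def length(intervals):
    """Total length of a finite union of disjoint intervals [(a, b), ...]."""
    return sum(b - a for a, b in intervals)

def overlap(set1, set2):
    """Lebesgue measure of the intersection of two unions of intervals."""
    return sum(max(0.0, min(b, d) - max(a, c))
               for a, b in set1 for c, d in set2)

def is_ae_partition(alpha, tol=1e-12):
    # Axiom 1: distinct members intersect in a mu-null set.
    pairwise_null = all(overlap(alpha[i], alpha[j]) <= tol
                        for i in range(len(alpha))
                        for j in range(i + 1, len(alpha)))
    # Axiom 2: the union exhausts [0, 1] up to a mu-null set.  Since the
    # overlaps are null, mu of the union is just the sum of the lengths.
    full = abs(sum(length(A) for A in alpha) - 1.0) <= tol
    return pairwise_null and full

# {[0, 1/2], [1/2, 1]} is a mu-ae partition even though the two sets
# share the point 1/2: the overlap {1/2} is mu-null.
alpha = [[(0.0, 0.5)], [(0.5, 1.0)]]
print(is_ae_partition(alpha))  # True

# {[0, 1/2], [1/4, 1]} fails: the overlap [1/4, 1/2] has positive measure.
beta = [[(0.0, 0.5)], [(0.25, 1.0)]]
print(is_ae_partition(beta))   # False
```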
Denote by $\text{aePar}(M,\mathcal{B},\mu)$ the collection of all $\mu$-ae partitions of $(M,\mathcal{B},\mu)$, and by $\text{aefPar}(M,\mathcal{B},\mu)$, $\text{aecPar}(M,\mathcal{B},\mu)$ and $\text{aemPar}(M,\mathcal{B},\mu)$ the subcollections of finite, countable and measurable $\mu$-ae partitions of $(M,\mathcal{B},\mu)$, respectively. For the last collection, the adjective "measurable" signifies a certain separation property (and not just that each element of the partition is a measurable set): a partition $\alpha$ is measurable if the natural projection $M\to M/\alpha$ admits a Fubini Theorem.
Note that we are doing everything $\mu$-ae and $\mu$ is fixed, so we may safely consider only Borel subsets of $M$.
We can consider (ae-)refinement as an order relation on these collections, as in Prof. Hamkins' answer.
I should note that the gnarly notation above is due to me; Rohlin himself uses $Z_1$ for $\text{aefPar}(M,\mathcal{B},\mu)$ and does not use any notation for the other collections, and nowadays typically no notation for any of these is in common use as far as I know.
To any measurable ae-partition one can associate a (possibly infinite) nonnegative number called entropy, and similarly there is a notion of conditional entropy for any pair of measurable ae-partitions; it is traditionally denoted by
$$H(\bullet\vert\bullet)=H_\mu(\bullet\vert\bullet):\text{aemPar}(M,\mathcal{B},\mu)\times \text{aemPar}(M,\mathcal{B},\mu)\to [0,\infty],$$
and $H=H_\mu= H_\mu(\bullet\vert \{M\})$ recovers the unary entropy, $\{M\}$ being the indiscrete partition of $M$.
(See the question "Why is $h$ used for entropy?" for the claim that this letter is a capital eta.)
I will not give the definition of these numbers here; see e.g. https://www.merry.io/dynamical-systems/26-partitions-and-the-rokhlin-metric/ for an account that seems to follow very closely Katok & Hasselblatt's Introduction to the Modern Theory of Dynamical Systems, pp. 161-167 (though what they call a "measurable partition" there is what I call a finite or countable ae-partition here; my terminology is in sync with Rohlin's, and coincidentally also with the current common nomenclature in smooth ergodic theory, as far as I know).
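Though I defer the general definitions to the references, for finite ae-partitions the standard formulas are simple enough to sketch. Below $[0,1)$ is discretized into $N$ cells of mass $1/N$ each, and a finite partition is a labeling of cells (the function names are mine, not standard notation):

```python
import math

# Entropy and conditional entropy of finite ae-partitions of a
# discretized [0, 1): each of N cells carries mass 1/N, and a partition
# is a list of N labels.  Illustration only.

def H(labels, N):
    """Entropy H_mu(alpha) = -sum_A mu(A) log mu(A) over the parts A."""
    counts = {}
    for lab in labels:
        counts[lab] = counts.get(lab, 0) + 1
    return -sum((c / N) * math.log(c / N) for c in counts.values())

def H_cond(alpha, beta, N):
    """Conditional entropy
       H_mu(alpha | beta) = -sum mu(A ∩ B) log( mu(A ∩ B) / mu(B) )."""
    joint, marg = {}, {}
    for a, b in zip(alpha, beta):
        joint[(a, b)] = joint.get((a, b), 0) + 1
        marg[b] = marg.get(b, 0) + 1
    return -sum((c / N) * math.log(c / marg[b]) for (a, b), c in joint.items())

N = 4
alpha = [0, 0, 1, 1]           # the two halves of [0, 1)
beta  = [0, 1, 2, 3]           # the four quarters; beta refines alpha
print(H(alpha, N))             # log 2
print(H_cond(alpha, beta, N))  # 0: the refinement beta determines alpha
print(H_cond(beta, alpha, N))  # log 2: each half splits into two quarters
```

Note that conditioning on the indiscrete partition $\{M\}$ (a constant labeling) recovers the unary entropy, as in the formula $H_\mu = H_\mu(\bullet\,\vert\,\{M\})$ above.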
Conditional entropy has many nice properties compatible with the order relation we mentioned above. As its name would suggest, it is not symmetric; symmetrizing it (e.g. by considering $d_\mu:(\alpha,\beta)\mapsto H_\mu(\alpha\vert\beta)+H_\mu(\beta\vert\alpha)$) gives a (possibly infinite) distance function on $\text{aemPar}(M,\mathcal{B},\mu)$:
$$d_\mu:\text{aemPar}(M,\mathcal{B},\mu)\times \text{aemPar}(M,\mathcal{B},\mu)\to [0,\infty].$$
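As a sanity check on the claim that symmetrizing yields a distance function, here is a numeric spot-check (on finite partitions of a discretized $[0,1)$, with mass $1/N$ per cell; the helper names are mine) of symmetry, vanishing on the diagonal, and the triangle inequality:

```python
import math, random

# The symmetrized distance d_mu(alpha, beta) = H(alpha|beta) + H(beta|alpha),
# spot-checked on random finite partitions of a discretized [0, 1).

def H_cond(alpha, beta, N):
    joint, marg = {}, {}
    for a, b in zip(alpha, beta):
        joint[(a, b)] = joint.get((a, b), 0) + 1
        marg[b] = marg.get(b, 0) + 1
    return -sum((c / N) * math.log(c / marg[b]) for (a, b), c in joint.items())

def d(alpha, beta, N):
    return H_cond(alpha, beta, N) + H_cond(beta, alpha, N)

random.seed(0)
N = 60
for _ in range(200):
    a = [random.randrange(3) for _ in range(N)]
    b = [random.randrange(3) for _ in range(N)]
    c = [random.randrange(3) for _ in range(N)]
    assert abs(d(a, b, N) - d(b, a, N)) < 1e-9           # symmetry
    assert d(a, a, N) < 1e-9                             # d(alpha, alpha) = 0
    assert d(a, c, N) <= d(a, b, N) + d(b, c, N) + 1e-9  # triangle inequality
print("all checks passed")
```

(Of course $d_\mu(\alpha,\beta)=0$ only forces $\alpha$ and $\beta$ to coincide mod $\mu$-null sets, which is exactly the right notion of equality for ae-partitions.)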
The common practice is to start with the smaller and better-behaved subcollections and then verify that the definition admits extensions to the larger collections. (One can even go beyond measurable ae-partitions; it is not reasonable to expect the ae-partitions coming from dynamical systems to be measurable. I won't get into these matters here.) Following Rohlin, let us focus on the subspace $\{H_\mu<\infty\}$ of measurable ae-partitions with finite entropy (Rohlin denotes this subspace by $Z$).
Theorem (Rohlin):
- $(\{H_\mu<\infty\},d_\mu)$ is a complete separable metric space.
- $\text{aefPar}(M,\mathcal{B},\mu)$ is dense in $(\{H_\mu<\infty\},d_\mu)$.
- $\text{aemPar}(M,\mathcal{B},\mu)$ carries a Polish space structure compatible with $d_\mu$ on $\{H_\mu<\infty\}$. With respect to this structure, $\{H_\mu<\infty\}$ is dense in $\text{aemPar}(M,\mathcal{B},\mu)$.
- Both the unary and the binary $H_\mu$ are continuous with respect to these structures (to say the least).
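One quantitative instance of the continuity in the last item can even be spot-checked numerically: from $H_\mu(\alpha)\leq H_\mu(\beta)+H_\mu(\alpha\vert\beta)$ one gets $\vert H_\mu(\alpha)-H_\mu(\beta)\vert\leq d_\mu(\alpha,\beta)$, i.e. the unary entropy is $1$-Lipschitz for the entropy distance. A sketch on random finite partitions of a discretized $[0,1)$ (helper names mine):

```python
import math, random

# Check |H(alpha) - H(beta)| <= H(alpha|beta) + H(beta|alpha) = d_mu(alpha, beta)
# on random finite partitions of a discretized [0, 1) (cells of mass 1/N).

def H(labels, N):
    counts = {}
    for lab in labels:
        counts[lab] = counts.get(lab, 0) + 1
    return -sum((c / N) * math.log(c / N) for c in counts.values())

def H_cond(alpha, beta, N):
    joint, marg = {}, {}
    for a, b in zip(alpha, beta):
        joint[(a, b)] = joint.get((a, b), 0) + 1
        marg[b] = marg.get(b, 0) + 1
    return -sum((c / N) * math.log(c / marg[b]) for (a, b), c in joint.items())

random.seed(1)
N = 50
for _ in range(300):
    a = [random.randrange(4) for _ in range(N)]
    b = [random.randrange(4) for _ in range(N)]
    d_ab = H_cond(a, b, N) + H_cond(b, a, N)
    assert abs(H(a, N) - H(b, N)) <= d_ab + 1e-9
print("Lipschitz bound verified")
```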
I should mention that in practice the finiteness of entropy is often automatic so infinite entropy is often not a big problem.
For more details and proofs of the above statements see Rohlin's paper "Lectures On The Entropy Theory Of Measure-Preserving Transformations".
Finally, in the Katok-Hasselblatt book and in the notes linked above, another distance function is defined on $\text{aefPar}(M,\mathcal{B},\mu)$, which is essentially an extrapolation of the idea of metrizing the measure algebra by the measure of the symmetric difference. As noted both in the notes and in the Katok-Hasselblatt book, this measure distance is compatible in a certain sense with the entropy distance. Katok and Hasselblatt call the entropy distance the Rohlin distance.
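To illustrate the compatibility (with the caveat that the exact normalization below is my own guess, not necessarily the one in Katok-Hasselblatt): taking the measure distance on ordered finite partitions to be $\sum_i\mu(A_i\,\Delta\,B_i)$, both it and the entropy distance between $\{[0,\tfrac12),[\tfrac12,1)\}$ and $\{[0,t),[t,1)\}$ shrink together as $t\to\tfrac12$:

```python
import math

# Compare a symmetric-difference-based distance on ordered finite
# partitions with the entropy (Rohlin) distance, on a discretized [0, 1).
# The normalization of d_measure is an assumption for illustration.

def H_cond(alpha, beta, N):
    joint, marg = {}, {}
    for a, b in zip(alpha, beta):
        joint[(a, b)] = joint.get((a, b), 0) + 1
        marg[b] = marg.get(b, 0) + 1
    return -sum((c / N) * math.log(c / marg[b]) for (a, b), c in joint.items())

def d_entropy(alpha, beta, N):
    return H_cond(alpha, beta, N) + H_cond(beta, alpha, N)

def d_measure(alpha, beta, N):
    # sum over ordered parts i of mu(A_i symmetric-difference B_i)
    labels = set(alpha) | set(beta)
    return sum(sum(1 for a, b in zip(alpha, beta) if (a == i) != (b == i))
               for i in labels) / N

N = 1000
half = [0] * (N // 2) + [1] * (N // 2)
for cut in (700, 600, 550, 510):
    other = [0] * cut + [1] * (N - cut)
    print(d_measure(half, other, N), d_entropy(half, other, N))
# Both columns decrease as the cut point approaches 1/2.
```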
I claimed in the beginning that the topology described here is natural. This claim is based on the naturality of the notion of conditional entropy as used here: it is the unique (up to a normalization) function that satisfies certain properties one would expect from a quantification of surprise. (The notes cited above attribute the standard theorem along these lines to Khinchine; see also the exercise on p. 11 of https://math.huji.ac.il/~mhochman/preprints/info-theory.pdf.)
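One of the Khinchine-type axioms behind this uniqueness claim is the grouping (chain-rule) property: merging the first two parts of a partition satisfies $H(p_1,\dots,p_n)=H(p_1+p_2,p_3,\dots,p_n)+(p_1+p_2)\,H\!\left(\tfrac{p_1}{p_1+p_2},\tfrac{p_2}{p_1+p_2}\right)$. A numeric spot-check of this identity for Shannon entropy:

```python
import math

# Grouping axiom for Shannon entropy: merging two parts loses exactly the
# (weighted) entropy of the merged pair's relative proportions.

def H(ps):
    return -sum(p * math.log(p) for p in ps if p > 0)

ps = [0.1, 0.2, 0.3, 0.4]
p12 = ps[0] + ps[1]
lhs = H(ps)
rhs = H([p12] + ps[2:]) + p12 * H([ps[0] / p12, ps[1] / p12])
print(abs(lhs - rhs) < 1e-9)  # True
```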