I am confused by the way uniform integrability is defined in the context of random variables. In keeping with the idea from analysis (absolute continuity of the integral), I had expected a definition along these lines:
If $X$ is a random variable, then for every $\epsilon > 0$ there should be a $\delta > 0$ such that for every measurable $A \subseteq \Omega$ with $\mu(A) \le \delta$ we have $\int_{A} |X| \, d\mu \le \epsilon$.
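In probabilistic notation (if I am translating my expectation correctly, with $\mathbb{P} = \mu$), this would read:

$$\forall \epsilon > 0 \;\; \exists \delta > 0 : \quad \mathbb{P}(A) \le \delta \;\Longrightarrow\; \mathbb{E}\!\left[\,|X| \mathbf{1}_A\,\right] \le \epsilon .$$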
Instead, this is what I find in most places, including Wikipedia (for a family $K \subset L^1(\mu)$):
$$\lim_{c \to \infty} \sup_{X \in K} \int_{|X|\ge c} |X| d \mu = 0$$
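If I unpack the limit (the integral over $\{|X| \ge c\}$ is nonincreasing in $c$), I believe this condition is the same as saying:

$$\forall \epsilon > 0 \;\; \exists c > 0 : \quad \sup_{X \in K} \int_{\{|X| \ge c\}} |X| \, d\mu \le \epsilon ,$$

which looks quite different from the $\epsilon$–$\delta$ statement I expected above.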
What is the reason for this different definition? Why does a cutoff on the values of $X$ even make an appearance, rather than a bound on the measure of the set over which we integrate?