I have a given function $f(x,\omega)\geq 0$, where $x\in\mathbb{R}$ is a parameter and $\omega\in\Omega$ for some sample space $\Omega$. The function $f(x,\omega)$ is increasing in $x$ for every $\omega$. I am interested in showing that
$$\lim_{x\to\infty}\int_\Omega f(x,\omega)d\omega=\int_\Omega \lim_{x\to\infty} f(x,\omega)d\omega.$$
What I have done thus far:
Pick any sequence $\{x_n\}_{n\in\mathbb{N}}$ such that $x_n\uparrow\infty$ as $n\to\infty$. Define $f_n(\omega)\equiv f(x_n,\omega)$ for every $n\in\mathbb{N}$, and $f(\omega)\equiv \lim_{n\to\infty}f_n(\omega)=\lim_{x\to\infty}f(x,\omega)$. Then, since $f_n(\omega)\geq 0$ and $f_n(\omega)\uparrow f(\omega)$, the Monotone Convergence Theorem implies that
$$\lim_{n\to\infty}\int_\Omega f_n(\omega)d\omega=\int_\Omega f(\omega)d\omega.$$
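(For completeness, here is the check that the monotonicity hypothesis of the MCT holds under my assumptions: since $x_n\leq x_{n+1}$ and $f(\cdot,\omega)$ is increasing in its first argument,
$$f_n(\omega)=f(x_n,\omega)\leq f(x_{n+1},\omega)=f_{n+1}(\omega)\quad\text{for every }\omega\in\Omega,$$
so $f_n(\omega)\uparrow f(\omega)$ pointwise, where the limit exists in $[0,\infty]$ because monotone sequences always converge in the extended reals.)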
What I don't know is how to pass from the MCT conclusion, which holds for every sequence $\{x_n\}_{n\in\mathbb{N}}$ with $x_n\uparrow\infty$ as $n\to\infty$, to the statement for $x\to\infty$, since the latter is a limit over an uncountable index set. I found this post, but I find the answers there imprecise about the details. Any help is greatly appreciated.