This is not homework. I was reading a paper in which the authors proved a result for all continuous functions and then simply wrote "the usual limiting argument gives the result for all bounded functions", so I am asking myself what this "usual limiting argument" might be. I do not know whether they mean uniform or pointwise convergence; as I see it, pointwise convergence should suffice. :D
Thus I am wondering whether there is a theorem giving, or leading to, the following statement:
Let $K\subset\mathbb{R}^2$ be compact. Every bounded measurable function $f:K\to\mathbb{R}$ can be approximated (pointwise, say) by a sequence of continuous functions $(g_m)$ on $K$.
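To make precise why I think pointwise convergence should suffice, here is the dominated-convergence step I have in mind (this is my own sketch, and I am additionally assuming the approximating sequence is uniformly bounded, which is not something the paper states):
$$\sup_m \|g_m\|_\infty < \infty \ \text{ and } \ g_m \to f \ \text{pointwise on } K \quad\Longrightarrow\quad \int_K g_m \,d\mu \;\longrightarrow\; \int_K f \,d\mu$$
for every finite Borel measure $\mu$ on $K$, since the constant $\sup_m \|g_m\|_\infty$ is an integrable dominating function.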
Nate Eldredge suggested that I post an excerpt from the original to provide more context for the problem. Here I go:
The goal is to prove the existence of a weak limit for a tight sequence of probability measures on $\mathcal{C}^0([0,1]^2,\mathbb{R})$ associated with reflecting Brownian motions on the compact set $[0,1]^2$, which is a Lipschitz domain. Thus we already know that some weak limit must exist, and it remains to show that any two limit points agree. Weak convergence is generally defined via bounded measurable functions. Now let $P'$ and $P''$ be two subsequential limit points. The authors show that for $f \in \mathcal{C}^0([0,1]^2,\mathbb{R})$ the following holds (here $X_s$ denotes the canonical process):
$E'f(X_s)=E''f(X_s)$
And now comes the actual source of my question: "The usual limiting argument gives the result for bounded $f$ and hence the one-dimensional distributions agree." (I understand the second part; it is only the "usual limiting argument" that confuses me.)
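If such an approximating sequence exists, then the argument I would guess at (this is my own reconstruction, not a quote from the paper) is just the dominated-convergence step from above, applied under both laws:
$$E'f(X_s) \;=\; \lim_{m\to\infty} E'g_m(X_s) \;=\; \lim_{m\to\infty} E''g_m(X_s) \;=\; E''f(X_s),$$
where the middle equality uses the already established result for continuous functions. But this hinges on the existence of the bounded pointwise approximation, which is exactly the statement I am asking about.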
Any help is much appreciated. Thanks in advance! :D