
Here are some general probability questions that have come to mind:

  • Say I have a sequence of random variables $Y_n$, and I know $Y_n$ converges in probability to a constant $m$. What conditions must a function $f$ satisfy so that $f(Y_n)$ converges in probability to $f(m)$? Are continuity and boundedness of $f$ both necessary for this conclusion?
  • Here's a special case solution addressing the above situation that I've seen, which makes me believe continuity and boundedness of $f$ are not both necessary to make the conclusion, in general. Let $\{X_n\}$ be a sequence of i.i.d. random variables, with $X_1$ uniformly distributed on $(0,1)$. We need to prove that $Y_n = (\prod_{i=1}^n X_i)^{\frac{1}{n}}$ converges in probability to a constant $c$ as $n \longrightarrow \infty$, and find the constant $c$. Well, $\ln(Y_n) = \frac{1}{n}\sum_{i=1}^n \ln X_i$ converges in probability to $E(\ln X_1) = -1$ as $n \longrightarrow \infty$ by the Weak Law of Large Numbers $\Rightarrow$ $Y_n$ converges in probability to $e^{-1}$ as $n \longrightarrow \infty$. Yet $e^x$ is merely continuous, not bounded.
  • Consider $\theta_n = \frac{n + 1}{\sum_{i= 1}^n X_i}$, where $X_1, \dots , X_n$ are i.i.d. exponential random variables, each with parameter $\theta$. I'd like to show that $E(\theta_n)$ converges to $\theta$ as $n \longrightarrow \infty$. We can show that $E(\frac{1}{\theta_n})$ converges to $\frac{1}{\theta}$ without much trouble. From here, though, can we directly conclude that $E(\theta_n)$ converges to $\theta$ as $n \longrightarrow \infty$? In general, for what functions $f$ can I go from $E(Y_n) \longrightarrow m$ as $n \longrightarrow \infty$ to $E(f(Y_n)) \longrightarrow f(m)$ as $n \longrightarrow \infty$, where $(Y_n)$ is a sequence of random variables? Here, the function at play is $f(x) = \frac{1}{x}$, which we know to be continuous at all but one point, yet not bounded.
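Both of the specific claims above are easy to sanity-check numerically. Here is a small Monte Carlo sketch in Python (the sample sizes and tolerances are my own choices, and a simulation is of course not a proof): it draws many realizations of $Y_n$ for large $n$ and checks that most land near $e^{-1}$, then estimates $E(\theta_n)$ for $\theta = 2$.

```python
import math
import random

random.seed(0)

def geometric_mean_uniforms(n):
    """Y_n = (X_1 * ... * X_n)^(1/n) for X_i ~ Uniform(0,1), computed via logs for stability."""
    # 1 - random.random() lies in (0, 1], avoiding log(0)
    s = sum(math.log(1 - random.random()) for _ in range(n))
    return math.exp(s / n)

# Convergence in probability to e^{-1}: for large n, most draws of Y_n are near e^{-1}.
samples = [geometric_mean_uniforms(10_000) for _ in range(200)]
frac_close = sum(abs(y - math.exp(-1)) < 0.02 for y in samples) / len(samples)
print(frac_close)  # near 1

# E(theta_n) for theta_n = (n+1)/sum(X_i), X_i ~ Exp(theta): the exact value is
# (n+1)/(n-1) * theta, which tends to theta. Here theta = 2, n = 1000.
theta, n, reps = 2.0, 1000, 500
draws = [(n + 1) / sum(random.expovariate(theta) for _ in range(n)) for _ in range(reps)]
print(sum(draws) / reps)  # near theta = 2
```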

Thank you for your time!

1 Answer


The answer to your first question is that continuity alone (on a set that carries the limit with probability $1$) suffices; boundedness is not needed. Here's a general statement:

Let $(X_n)$ be a sequence of real-valued random variables such that $X_n \overset{P}\to X$ for some random variable $X$. Let $f : \mathbb R \to \mathbb R$ be a measurable function.

If there exists a set $A \in \mathcal B(\mathbb R)$ such that $P(X \in A) = 1$ and $f$ is continuous on $A$, then $f(X_n) \overset P\to f(X)$.

Proof, kinda: First we note that convergence in probability is metrizable. Therefore it suffices to show that every subsequence $(f(X_{n_k}))_{k \in \mathbb N}$ of $(f(X_{n}))_{n \in \mathbb N}$ has a further subsequence $(f(X_{n_{k_p}}))_{p \in \mathbb N}$ such that $f(X_{n_{k_p}}) \overset P\to f(X)$. So let $(X_{n_k})$ be a subsequence of $(X_n)$. Clearly it still holds that $X_{n_k} \overset P\to X$, so we may pass to a further subsequence such that $X_{n_{k_p}} \overset{a.s.}\to X$. But then $f(X_{n_{k_p}}) \overset{a.s.}\to f(X)$: for almost every $\omega$ we have $X_{n_{k_p}}(\omega) \to X(\omega)$ with $X(\omega) \in A$, and $f$ is continuous at $X(\omega)$. Since almost sure convergence implies convergence in probability, also $f(X_{n_{k_p}}) \overset{P}\to f(X)$. Therefore any subsequence of $(f(X_n))$ has a further subsequence which converges in probability to $f(X)$, and so $f(X_n) \overset P\to f(X)$.
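You can watch the theorem at work with your own $f(x) = 1/x$. The setup below is my own illustration (not part of your question): take $X_n = 2 + \bar Z_n$ with $\bar Z_n$ a mean of $n$ standard normals, so $X_n \overset P\to 2$, and $f$ is continuous on $A = \mathbb R \setminus \{0\}$ with $P(X \in A) = 1$. The estimated probability $P(|f(X_n) - f(2)| > \varepsilon)$ shrinks as $n$ grows:

```python
import random

random.seed(1)

def f(x):
    # f(x) = 1/x: continuous everywhere except 0; the limit X = 2 avoids 0 with probability 1
    return 1.0 / x

def prob_far(n, eps=0.05, trials=2000):
    """Estimate P(|f(X_n) - f(2)| > eps), where X_n = 2 + mean of n standard normals."""
    count = 0
    for _ in range(trials):
        x_n = 2 + sum(random.gauss(0, 1) for _ in range(n)) / n
        if abs(f(x_n) - f(2)) > eps:
            count += 1
    return count / trials

p_small, p_large = prob_far(10), prob_far(1000)
print(p_small, p_large)  # the second is much smaller than the first
```

At $n = 10$ the noise is large enough that $f(X_n)$ often misses $f(2) = 0.5$ by more than $\varepsilon$; at $n = 1000$ it essentially never does, exactly as $f(X_n) \overset P\to f(X)$ predicts.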