
I learned the definition of convergence in advanced calculus:

For any $\epsilon>0$, there is an $N$ such that for all $n>N$ we have $|X_n-x|<\epsilon$; then we say $X_n$ converges to $x$.

Then I learned about consistent estimators in statistics:

For every $\epsilon>0$ and every $\theta \in \Theta$, $\lim\limits_{n\to\infty}P_\theta(|W_n-\theta| \ge \epsilon) = 0$
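
To see the second definition in action, here is a quick simulation I put together (the setup, a sample mean of normal data, is just my own toy example, not from any textbook): the probability $P_\theta(|W_n-\theta|\ge\epsilon)$ shrinks as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, eps, trials = 2.0, 0.1, 1000

# W_n = sample mean of n i.i.d. N(theta, 1) observations.
# Consistency says P(|W_n - theta| >= eps) -> 0 as n -> infinity;
# we estimate that probability by Monte Carlo.
for n in (10, 100, 1000, 10000):
    W = rng.normal(theta, 1.0, size=(trials, n)).mean(axis=1)
    print(f"n={n:>5}:  P(|W_n - theta| >= {eps}) ~ "
          f"{np.mean(np.abs(W - theta) >= eps):.3f}")
```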

So my question is: is there any relationship between consistent estimators and convergent sequences?

Are the two definitions equivalent?

I believe the two definitions must have something in common, but I cannot find it.

Also, is there any relationship between convergence in distribution [$F_{X_n}\to F_X$] and the sequential definition of continuity of a function [$X_n\to x_0 \implies f(X_n) \to f(x_0)$]?

  • Convergence of sequences and convergence of functions (one example being estimators) work differently. Both still mean that things get closer and closer, but a numerical sequence gets closer to a single point, whereas for functions many points get closer to many other points at the same time, and there are various ways in which the points of a function can get closer and closer. – AspiringMathematician Mar 24 '19 at 16:31
  • There are about 500 (I'm exaggerating, of course) different definitions of convergence for random variables (I really recommend reading https://en.m.wikipedia.org/wiki/Convergence_of_random_variables). Currently you are talking about convergence in probability. Note that convergence in real analysis deals with numbers, whereas the convergence notions in probability theory deal with random variables. So many of the convergence notions are quite different. – Maximilian Janisch Mar 24 '19 at 16:31
  • Have a look at Wikipedia. Be aware that random variables are actually functions (not just numbers). – drhab Mar 24 '19 at 16:33
  • I see, so basically the main difference is between a function and a single value. – MolinYue Mar 25 '19 at 15:30

1 Answer


Perhaps this rephrasing will make the parallel clearer:

$X_n\to X$ in probability $\iff$ for any $\epsilon>0$, there is an $N$ so $n\ge N$ implies $E\left[\frac{|X_n-X|}{1+|X_n-X|}\right]<\epsilon.$

For a proof, see convergence in probability induced by a metric.

In other words, if we define $d_P(X,Y)$ to be the funny quantity $E\left[\frac{|X-Y|}{1+|X-Y|}\right]$, then $d_P(X_n,X)$ replaces $|X_n-X|$ in the usual definition of convergence. More generally, in any metric space with a distance function $d$, we have the following notion of convergence:

$x_n\to x$ $\iff$ for any $\epsilon>0$, there is an $N$ so $n\ge N$ implies $d(x_n,x)<\epsilon$.

Your calculus notion of convergence is the one from the metric space $\mathbb R$ with distance function $d(x,y)=|x-y|$, while convergence in probability comes from $d_P$.
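
As a numerical sanity check (a sketch of my own; the sequence $X_n = X + Z/n$ with standard normal $Z$ is just a convenient example), one can estimate $d_P(X_n,X)$ by Monte Carlo and watch it shrink exactly like $|x_n-x|$ does for a calculus sequence:

```python
import numpy as np

rng = np.random.default_rng(1)
Z = rng.normal(size=100_000)   # X_n = X + Z/n, so |X_n - X| = |Z|/n

# d_P(X_n, X) = E[ |X_n - X| / (1 + |X_n - X|) ] plays the role of
# |x_n - x| in the calculus definition; it should shrink to 0.
for n in (1, 10, 100, 1000):
    D = np.abs(Z) / n
    print(f"n={n:>4}:  d_P(X_n, X) ~ {np.mean(D / (1 + D)):.5f}")
```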

Finally, there is a relationship between convergence in distribution and metric convergence. Let $X_n$ have cdf $F_n$, and let $X$ have cdf $F$.

$X_n\to X$ in distribution $\iff$ for all $x$ such that $F$ is continuous at $x$, we have $F_n(x)\to F(x)$.
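
For a concrete illustration (my own, using the classical CLT setup of standardized sums of uniforms), one can compare the empirical cdf $F_n$ against the standard normal cdf $\Phi$ at a few points; since $\Phi$ is continuous everywhere, the pointwise convergence should be visible at every $x$:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)
trials = 50_000

def Phi(x):                     # standard normal cdf
    return 0.5 * (1 + erf(x / sqrt(2)))

# X_n = standardized sum of n Uniform(0,1) draws (mean n/2, variance n/12).
# By the CLT, X_n -> N(0,1) in distribution, so F_n(x) -> Phi(x).
for n in (1, 2, 10, 100):
    U = rng.uniform(size=(trials, n))
    Xn = (U.sum(axis=1) - n / 2) / sqrt(n / 12.0)
    row = "  ".join(f"F_n({x:+.0f}) = {np.mean(Xn <= x):.3f} vs Phi = {Phi(x):.3f}"
                    for x in (-1.0, 0.0, 1.0))
    print(f"n={n:>3}:  {row}")
```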

Also, convergence in distribution is induced by a metric in the same way convergence in probability is. In this case, the metric is $$ d_\text{Levy}(X,Y)=\inf\{\epsilon>0:P(X\le x-\epsilon)-\epsilon<P(Y\le x)\le P(X\le x+\epsilon)+\epsilon\text{ for all }x\in \mathbb R\} $$ See https://en.wikipedia.org/wiki/L%C3%A9vy_metric.
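
If it helps to see the infimum concretely, here is a rough numerical sketch (entirely my own: a simple scan over $\epsilon$ on a finite grid of $x$ values, with known normal cdfs standing in for $P(X\le \cdot)$ and $P(Y\le \cdot)$):

```python
import numpy as np
from math import erf, sqrt

def Phi(x, mu=0.0):             # cdf of N(mu, 1)
    return 0.5 * (1 + erf((x - mu) / sqrt(2)))

def levy(F, G, grid, step=1e-3):
    """Smallest eps (up to step size) with
    F(x - eps) - eps <= G(x) <= F(x + eps) + eps for all x in grid."""
    for eps in np.arange(0.0, 2.0, step):
        if all(F(x - eps) - eps <= G(x) <= F(x + eps) + eps for x in grid):
            return eps
    return float("inf")

grid = np.linspace(-8.0, 8.0, 1601)
# The distance shrinks as N(mu, 1) approaches N(0, 1).
for mu in (1.0, 0.5, 0.1, 0.01):
    d = levy(Phi, lambda x, m=mu: Phi(x, mu=m), grid)
    print(f"d_Levy(N(0,1), N({mu},1)) ~ {d:.3f}")
```

As $\mu\to 0$ the printed distance goes to $0$, matching the fact that this metric metrizes convergence in distribution for real-valued random variables.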

Mike Earnest