
I am working with a set of i.i.d. discrete random variables $\{\zeta_1, \ldots, \zeta_n\}$. Each of them takes one of the $m$ values $\{z_1, \ldots, z_m\}$ with corresponding probabilities $\{p_1, \ldots , p_m\}$.

I am trying to understand when I can apply the following approximation for the expectation (which I believe to be the first-order one): \begin{equation} \mathbb{E}\left[\min\left(A, \frac{1}{\sum_{k=1}^n \mathrm{I}[\zeta_k\geq \bar{z}]}\right)\right] \approx \min\left(A, \frac{1}{\mathbb{E}\left[\sum_{k=1}^n \mathrm{I}[\zeta_k\geq \bar{z}]\right]}\right) \end{equation}

Here $A$ and $\bar{z}$ are constants, and the indicator function $\mathrm{I}[\zeta_k\geq \bar{z}]$ equals $1$ if $\zeta_k\geq \bar{z}$ and $0$ otherwise.

I am working in the regime $n\to\infty$, and the question is whether the approximation above is good in this limit (if that can be judged at all).
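
For concreteness, here is the kind of numerical check I am running (a minimal Monte Carlo sketch in Python; the values of $n$, $A$, $\bar z$, the $z_j$ and the $p_j$ below are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the original question; chosen only for this sketch).
n = 500                            # number of i.i.d. variables zeta_k
A = 0.5                            # the constant A
z_values = np.array([1, 2, 3])     # possible values z_1, ..., z_m
probs = np.array([0.5, 0.3, 0.2])  # corresponding probabilities p_1, ..., p_m
z_bar = 2                          # threshold \bar{z}
trials = 20_000                    # Monte Carlo repetitions

# LHS: Monte Carlo estimate of E[ min(A, 1 / sum_k I[zeta_k >= z_bar]) ].
samples = rng.choice(z_values, size=(trials, n), p=probs)
counts = (samples >= z_bar).sum(axis=1)
inv = np.full(trials, np.inf)      # convention: 1/0 = +inf, so min(A, 1/0) = A
inv[counts > 0] = 1.0 / counts[counts > 0]
lhs_estimate = np.minimum(A, inv).mean()

# RHS: min(A, 1 / E[ sum_k I[zeta_k >= z_bar] ]) = min(A, 1 / (n*p)).
p = probs[z_values >= z_bar].sum()
rhs = min(A, 1.0 / (n * p))

print(f"LHS (Monte Carlo) = {lhs_estimate:.6f}")
print(f"RHS               = {rhs:.6f}")
```

In runs like this the two sides agree closely once $n$ is large, but I would like an argument for when and why this holds.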

I checked the following link; however, I am not sure how to make use of $\mathcal{L}_{X(t)}^n$ for large $n$ in the case of a discrete distribution.

Thank you in advance for any help.

mathisfun

1 Answer

Let $p=P(\zeta\ge \bar z)$ and write $B(n,p)$ for a binomial random variable, so that $\sum_{k=1}^n \mathrm{I}[\zeta_k\geq \bar{z}]\sim B(n,p)$. Then as $n\to\infty$ $$\mathrm{LHS}\approx E\left(\frac{1}{B(n,p)};\,B(n,p)\ge 1\right)\approx \frac{1}{np}+\frac{np(1-p)}{(np)^3}=\frac{1}{np}+\frac{1-p}{(np)^2}=\frac{1}{np}\left(1+\frac{1-p}{np}\right),$$ while $\mathrm{RHS}=\frac{1}{np}$. Hence $\mathrm{LHS}\sim \mathrm{RHS}$.
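
For reference, here is where the second term comes from, spelled out via the second-order delta-method heuristic $E[g(X)]\approx g(\mu)+\frac 1 2 g''(\mu)\sigma^2$ (quoted in the comments below): with $g(x)=1/x$, $g''(x)=2/x^3$, $\mu=np$ and $\sigma^2=np(1-p)$, $$E\left[\frac{1}{B(n,p)}\right]\approx \frac{1}{\mu}+\frac{1}{2}\cdot\frac{2}{\mu^3}\,\sigma^2=\frac{1}{np}+\frac{np(1-p)}{(np)^3}=\frac{1}{np}+\frac{1-p}{(np)^2}.$$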

A.S.
  • Nice one. Can you elaborate on where the second-order term of $E[1/B]$ comes from? – Justpassingby Dec 02 '15 at 17:43
  • $E(g(X))\approx g(\mu)+\frac 1 2g''(\mu)\sigma^2$ – A.S. Dec 02 '15 at 17:45
  • @A.S. that looks nice, thank you. Could you explain how you dealt with $A$ and the $\min$? – mathisfun Dec 02 '15 at 18:40
  • @A.S. in general, I think the $\min$ makes the LHS non-differentiable, so the Taylor expansion is problematic. Still, I ran some simulations, and the RHS indeed approximates the LHS well. – mathisfun Dec 02 '15 at 22:29
  • @math $P(B(n,p)<\frac 1 A)\to 0$, so you can choose an arbitrary $A$; it won't affect the asymptotic behaviour, so just choose $1$. – A.S. Dec 03 '15 at 00:52
  • @A.S. but what if $A$ is very small? Then there is quite a lot of weight in this probability. Or am I misunderstanding something? – mathisfun Dec 03 '15 at 09:47
  • @math It doesn't matter how large fixed $1/A$ is. $P(B(n,p)<\frac 1 A)\to 0$ for all $A>0$. – A.S. Dec 03 '15 at 09:54
  • Thank you @A.S. Do I understand correctly that the partial expectation on the LHS, $\mathrm{E}[\frac{1}{B(n,p)}; B(n,p)\geq 1/A]$, also does not depend on $A$ in the limit $n\to\infty$? – mathisfun Dec 03 '15 at 10:21
  • Yes. If $1/A<1$ you can instead use condition $B(n,p)\ge 1$ as I did above. – A.S. Dec 03 '15 at 10:24
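
A quick numerical check of the points above (a sketch in Python with purely illustrative $n$, $p$, $A$): it computes the exact truncated expectation $E[\frac{1}{B(n,p)};\,B(n,p)\geq 1/A]$, compares it with the expansion $\frac{1}{np}(1+\frac{1-p}{np})$, and prints $P(B(n,p)<\frac 1 A)$ to show how quickly the excluded event becomes negligible.

```python
import numpy as np
from scipy.stats import binom

# Illustrative parameters (not from the original question).
n, p, A = 1000, 0.3, 0.1           # truncation threshold is 1/A = 10
cutoff = int(np.ceil(1.0 / A))

k = np.arange(n + 1)
pmf = binom.pmf(k, n, p)           # exact Binomial(n, p) pmf

# Exact truncated expectation E[1/B ; B >= 1/A] and excluded probability P(B < 1/A).
mask = k >= cutoff
truncated_mean = np.sum(pmf[mask] / k[mask])
tail_prob = pmf[~mask].sum()

# Second-order expansion from the answer: (1/(np)) * (1 + (1-p)/(np)).
expansion = (1.0 / (n * p)) * (1.0 + (1.0 - p) / (n * p))

print(f"E[1/B ; B >= 1/A] = {truncated_mean:.8f}")
print(f"expansion         = {expansion:.8f}")
print(f"P(B < 1/A)        = {tail_prob:.3e}")
```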