
Give an upper bound for $E \left[ \exp \left(\frac{nt}{\sum_{i=1}^n k_i}\right)\right]$, where

  • $k_i$'s are independent random variables, each denoting the number of Bernoulli trials performed up to and including the first failure, where the probability of failure in each trial is $p$
  • $n$ is the number of such Bernoulli trial experiments performed
  • $E[X]$ denotes the expected value of a random variable $X$
  • $e = \exp(1)$ is the base of the natural logarithm
  • $t > 0$ is some arbitrary parameter
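
Equivalently (as used in the computation below), each $k_i$ is geometrically distributed, with $Pr(k_i = j) = (1-p)^{j-1}p$ for $j = 1, 2, \ldots$ and $E[k_i] = 1/p$.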

What I have done so far:

My original task was to derive an upper bound for

$$Pr \left( \frac{n}{ \sum_{i=1}^n k_i} - p \ge \delta \right)$$

$$ = Pr \left( e^{t \left( \frac{n}{\sum_{i=1}^n k_i} - p \right)} \ge e^{t\delta} \right)$$

Here $t > 0$. I applied Markov's Inequality to get

$$Pr \left( e^{t \left( \frac{n}{\sum_{i=1}^n k_i} - p \right)} \ge e^{t\delta} \right)$$

$$ \le \frac{E\left[e^{t \frac{n}{\sum_{i=1}^n k_i}}\right]}{e^{t(\delta + p)}}$$
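
To spell out the last step: Markov's inequality bounds the probability by the expectation divided by $e^{t\delta}$, and the constant $e^{-tp}$ then factors out of the expectation,

$$E\left[e^{t \left( \frac{n}{\sum_{i=1}^n k_i} - p \right)}\right] = e^{-tp}\, E\left[e^{t \frac{n}{\sum_{i=1}^n k_i}}\right],$$

which yields the denominator $e^{t(\delta + p)}$.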

Thus, to continue from here, I have to compute $E \left[ \exp \left(\frac{nt}{\sum_{i=1}^n k_i}\right)\right]$. I attempted the following steps:

$$\frac{n}{\sum_{i=1}^{n} k_i} \le \sum_{i=1}^{n}\frac{1}{k_i}$$

$$\implies E\left[\exp\left(\frac{nt}{\sum_{i=1}^n k_i}\right)\right] \le E\left[e^{t\sum_{i=1}^{n}\frac{1}{k_i}}\right] = \prod_{i=1}^{n} E\left[e^{\frac{t}{k_i}}\right],$$

where the last equality uses the independence of the $k_i$'s.
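
The first inequality holds because every $k_j \ge 1 > 0$, so $\sum_{i=1}^{n} k_i \ge k_j$ for each $j$, and therefore

$$\frac{n}{\sum_{i=1}^{n} k_i} = \sum_{j=1}^{n} \frac{1}{\sum_{i=1}^{n} k_i} \le \sum_{j=1}^{n} \frac{1}{k_j}.$$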

Now I focused on computing $E\left[e^{\frac{t}{k_i}}\right]$, which can be written as $$p \sum_{j=0}^{\infty}(1-p)^j e^{\frac{t}{j+1}}$$

And this is where I am stuck. I tried to see if I could use the Taylor series expansion of $e^x$ to extract some geometric progression from the overall sum, but without luck.
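
One partial observation: since $e^{t/(j+1)} \le e^{t/2}$ for every $j \ge 1$, splitting off the $j = 0$ term gives the crude bound

$$E\left[e^{\frac{t}{k_i}}\right] = p e^{t} + \sum_{j=1}^{\infty} p(1-p)^{j} e^{\frac{t}{j+1}} \le p e^{t} + (1-p) e^{\frac{t}{2}},$$

but each such factor still exceeds $1$, so the resulting bound on the product grows exponentially in $n$.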

  • For a crude upper bound, the expected value is at most the largest possible value. Since $k_i \ge 1$, that is $\exp(t)$. Do you need something better than that? – Robert Israel Mar 02 '16 at 22:18
  • Yes. I am hoping that I get a bound in terms of $n,t$ and $p$ – Banach Tarski Mar 02 '16 at 22:19
  • Since $e^{t/k_i} > 1$, your product will be greater than $\exp(t)$ if $n$ is large. There's no point in finding a bound depending on $n$ if it's worse than a bound that doesn't depend on $n$. – Robert Israel Mar 02 '16 at 22:31
  • Actually there is a point. You see $t$ is actually arbitrary with the constraint that $t > 0$. If I get some exponential bound in terms of $n,t$ and $p$, I can minimize this quantity by finding a minima for the exponent w.r.t. $t$. – Banach Tarski Mar 02 '16 at 22:37
  • Taylor expand $g(x)=e^{t/x}$ around $x=\frac 1{1-p}$ for small $t$ we get: $$=E\exp\left(\frac t{1+\frac {NB(n,p)}n}\right)\le e^{(1-p)t}+\frac 1 22te^{(1-p)t}(\frac 1{1-p}+\frac t 2)(1-p)^4\frac {pr}{n^2(1-p)^2}$$ after noting that $g^{(3)}<0$ and $\mu_3>0$. Large $t$ can be treated separately basically giving the bound mentioned by Robert times a prefactor in $n,p$. – A.S. Mar 02 '16 at 22:41

1 Answer


Let me write $\sum_{i=1}^n k_i = S_n$. Since $E(k_i) = 1/p$ and $\exp(t/x)$ is a convex function of $x$, Jensen's inequality gives $E[\exp(nt/S_n)] \ge \exp(tp)$, so no upper bound can do better than that. On the other hand, for $0 < \alpha < 1$, the theory of large deviations gives you a rate function $I(\alpha) > 0$ and an asymptotic formula

$$\frac{\log P(S_n < \alpha n/p)}{n} \to -I(\alpha),$$

so that, splitting on the event $\{S_n < \alpha n/p\}$ and using $\exp(nt/S_n) \le e^t$ there (recall $S_n \ge n$),

$$\begin{aligned} E[\exp(nt/S_n)] &\le e^{tp/\alpha} + P(S_n < \alpha n/p)\,(e^t - e^{tp/\alpha})\\ &\sim e^{tp/\alpha} + e^{-n I(\alpha)}\,(e^t - e^{tp/\alpha}). \end{aligned}$$
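
As a quick numerical sanity check of these bounds, here is a minimal Monte Carlo sketch assuming NumPy; the parameter values $n = 50$, $p = 0.3$, $t = 1$ are arbitrary illustrations:

```python
import numpy as np

# Monte Carlo sanity check (illustrative only): estimate E[exp(nt / S_n)]
# for i.i.d. geometric k_i >= 1 with failure probability p, and compare
# against the Jensen lower bound exp(tp) and the crude upper bound exp(t).
rng = np.random.default_rng(0)
n, p, t = 50, 0.3, 1.0          # arbitrary illustrative parameters
trials = 200_000

# Generator.geometric counts trials up to and including the first
# "success" (here: the first failure), so every sample is >= 1.
k = rng.geometric(p, size=(trials, n))
S = k.sum(axis=1)               # S_n for each simulated experiment
mgf_est = np.exp(n * t / S).mean()

print(f"Monte Carlo E[exp(nt/S_n)] ~ {mgf_est:.4f}")
print(f"Jensen lower bound exp(tp)  = {np.exp(t * p):.4f}")
print(f"Crude upper bound  exp(t)   = {np.exp(t):.4f}")
```

The estimate should land between $\exp(tp)$ and $\exp(t)$, consistent with the Jensen lower bound and the crude upper bound above.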

For a more explicit inequality you might try Chernoff bounds.

Robert Israel