12

Is there any way I can calculate the expected value of the geometric distribution without differentiation? All the other approaches I have seen here use differentiation.

Thanks in advance!

user21312
  • 966
  • 1
    You are basically asking how to calculate $\sum_{k=1}^\infty x^{k-1}\cdot k$ without using differentiation term by term of $\sum x^k$. –  Jul 22 '17 at 18:45
  • 2
    Can you clarify what you mean by a Geometric distribution? Say I am trying to toss a coin until I get an $H$. If my sequence is $TTH$ do you see the length as $2$ or $3$? – lulu Jul 22 '17 at 19:06
  • @lulu : "Geometric distribution" can mean the distribution of the number of independent trials needed to get one success, with probability $p$ of success on each trial, or sometimes with probability $p$ of failure on each trial (so the probability of success is $1-p$). In either case it is a distribution supported on the set $\{1,2,3,4,\ldots\}.$ But it can also mean the distribution of the number of failures before the first success, so that it's supported on the set $\{0,1,2,3,4,\ldots\}.$ – Michael Hardy Jul 22 '17 at 19:41
  • @lulu It would be better to have some analytical expressions to clarify the post. – Felix Marin Jul 22 '17 at 21:10
  • https://math.stackexchange.com/q/605083/321264, https://math.stackexchange.com/q/30732/321264. – StubbornAtom Jun 02 '20 at 08:22
  • https://math.stackexchange.com/q/1299465/321264, https://math.stackexchange.com/q/301751/321264, https://math.stackexchange.com/q/1426233/321264 – StubbornAtom Jun 03 '20 at 08:02

5 Answers

20

The equivalent question outlined in the comments is to find the value of $$S = \sum_{k=1}^\infty kx^{k-1}$$

We can write out the first few terms:

$$S = 1 + 2x + 3x^2 + 4x^3 + 5x^4 + 6x^5 + \cdots$$

Multiply by $x$ to get

$$xS = 0 + x + 2x^2 + 3x^3 + 4x^4 + 5x^5 + \cdots$$

Now subtract $xS$ from $S$:

$$S - xS = 1 + x + x^2 + x^3 + x^4 + x^5 + \cdots$$

The right-hand side is a standard geometric series, so for $|x|<1$,

$$S - xS = \frac{1}{1-x}$$

$$S(1-x) = \frac{1}{1-x}$$

$$ S = \boxed{\frac{1}{\left(1-x\right)^2}}$$

The original series' terms are an arithmetico-geometric sequence, and this trick of multiplying by the common ratio and subtracting can be used for many similar series.
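
To tie this back to the expectation (a short completion, assuming the convention in which $X$ counts the trials up to and including the first success, with success probability $p$): setting $x = 1-p$ in the boxed formula gives

$$E[X]=\sum_{k=1}^\infty k\,p\,(1-p)^{k-1}=p\cdot\frac{1}{\bigl(1-(1-p)\bigr)^2}=\frac{p}{p^2}=\frac{1}{p}.$$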

5

The geometric distribution is memoryless: either you succeed on the initial attempt, with probability $p$, or you fail, with probability $1-p$, and the process starts over after that one failed attempt.

If succeeding on the first attempt counts as $1$ attempt:

$$E[X]=p\times 1+(1-p)\times (1+E[X])$$ so $$p\times E[X]=1$$ so $$E[X]=\frac{1}{p} \text{ attempts}$$

while if succeeding on the first attempt counts as $0$ failures:

$$E[X]=p\times 0+(1-p)\times (1+E[X])$$ so $$p\times E[X]=1-p$$ so $$E[X]=\frac{1-p}{p} \text{ failures}$$

and naturally $\frac1p = \frac{1-p}p + 1$, since the number of attempts is the number of failures plus the single successful attempt at which you stop.
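
Not part of the original answer, but if a numerical sanity check helps, here is a short Monte Carlo sketch in Python (the function name and parameter values are just illustrative) that estimates both quantities:

```python
import random

def simulate_geometric(p, trials=100_000, seed=0):
    """Estimate the mean number of attempts until the first success."""
    rng = random.Random(seed)
    total_attempts = 0
    for _ in range(trials):
        attempts = 1
        while rng.random() >= p:  # failure with probability 1 - p: toss again
            attempts += 1
        total_attempts += attempts
    return total_attempts / trials

p = 0.25
mean_attempts = simulate_geometric(p)
print(mean_attempts, 1 / p)            # attempts convention, ~4 vs 4
print(mean_attempts - 1, (1 - p) / p)  # failures convention, ~3 vs 3
```

With $p=0.25$ the simulated means come out close to $4$ attempts and $3$ failures, matching $1/p$ and $(1-p)/p$.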

Henry
  • 157,058
3

$$\Pr(X=x)=p(1-p)^{x-1},\qquad x\in\{1,2,3,\cdots\}$$
$$\mu_X=\sum_{x=1}^{\infty}x\,p(1-p)^{x-1}$$
Changing variable $1-p=q$:
$$\mu_X=\sum_{x=1}^{\infty}x\,(1-q)q^{x-1}=\sum_{x=1}^{\infty}\left(x\,q^{x-1}-x\,q^{x}\right)=\sum_{x=0}^{\infty}\left((x+1)\,q^{x}-x\,q^{x}\right)=\sum_{x=0}^{\infty}q^{x}=\frac{1}{1-q}=\frac1p$$
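
A brief expansion of the middle step (not in the original answer), since the re-indexing is easy to miss: for $|q|<1$ both pieces converge absolutely, so the sum may be split and the first piece shifted by one index,

$$\sum_{x=1}^{\infty}x\,q^{x-1}=\sum_{x=0}^{\infty}(x+1)\,q^{x},\qquad\sum_{x=1}^{\infty}x\,q^{x}=\sum_{x=0}^{\infty}x\,q^{x},$$

and subtracting term by term leaves $\sum_{x=0}^{\infty}\bigl((x+1)-x\bigr)q^{x}=\sum_{x=0}^{\infty}q^{x}$.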

msm
  • 7,147
3

I know of at least two ways offhand, and there are probably others.

First I'll show you a concrete way to do it. After that I'll show you how to express the same thing exactly. (Together these make up only one of those "two ways". The other one now appears in the answer posted by "Henry".) $$ \begin{array}{cccccccccccccccccccccccc} & 0 & & 1 & & 2 & & 3 & & 4 & & 5 & & 6 \\ \hline & & & p^1 & + & 2p^2 & + & 3p^3 & + & 4p^4 & + & 5p^5 & + & 6p^6 & + & \cdots & {} \\[12pt] = & & & p^1 & + & p^2 & + & p^3 & + & p^4 & + & p^5 & + & p^6 & + & \cdots \\ & & & & + & p^2 & + & p^3 & + & p^4 & + & p^5 & + & p^6 & + & \cdots \\ & & & & & & + & p^3 & + & p^4 & + & p^5 & + & p^6 & + & \cdots \\ & & & & & & & & + & p^4 & + & p^5 & + & p^6 & + & \cdots \\ & & & & & & & & & & + & p^5 & + & p^6 & + & \cdots \\ & & & & & & & & & & & & + & p^6 & + & \cdots \\ & & & & & & & & & & & & & & + & \cdots \\ & & & & & & & & & & & & & & \vdots \end{array} $$ First sum each (horizontal) row. Each is a geometric series. Then sum the remaining series, which is also geometric.
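
Carrying out those sums explicitly (a short completion, not in the original answer): the row that starts at $p^{k}$ sums to $\frac{p^{k}}{1-p}$, and adding up the row totals gives another geometric series,

$$\sum_{k=1}^{\infty}\frac{p^{k}}{1-p}=\frac{1}{1-p}\cdot\frac{p}{1-p}=\frac{p}{(1-p)^{2}},$$

so $\displaystyle\sum_{x=1}^{\infty}x\,p^{x}=\frac{p}{(1-p)^{2}}$; multiplying by the factor $1-p$ that appears in the probabilities below gives $\frac{p}{1-p}$.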

Here is the same method expressed abstractly in the language of algebra: \begin{align} \sum_{x=0}^\infty x (1-p) p^x & = \sum_{x=1}^\infty x (1-p) p^x = \sum_{x=1}^\infty \sum_{j=1}^x (1-p)p^x \\[10pt] & = \sum_{ x,j\, : \, 1 \,\le\, j \, \le \, x} (1-p) p^x = \sum_{j=1}^\infty \sum_{x=j}^\infty (1-p)p^x \end{align} Now you're summing a geometric series as $x$ goes from $j$ to $\infty,$ and then the outer sum, as $j$ goes from $1$ to $\infty,$ also turns out to be geometric.

(In the very first step above I put $\displaystyle\sum_{x=0}^\infty = \sum_{x=1}^\infty.$ That is justified by the fact that when $x=0,$ the actual term being added is $0$ so it can be dropped.)
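
For completeness (not part of the original answer), carrying out those two geometric sums:

$$\sum_{x=j}^{\infty}(1-p)p^{x}=(1-p)\,\frac{p^{j}}{1-p}=p^{j},\qquad\sum_{j=1}^{\infty}p^{j}=\frac{p}{1-p},$$

which is the expected number of failures when each trial fails with probability $p$ (equivalently, succeeds with probability $1-p$), consistent with the $\frac{1-p}{p}$ form in Henry's answer after swapping the roles of $p$ and $1-p$.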

1

The problem can be viewed from a different perspective to understand it more intuitively. Consider the following setup.

"A person tosses a coin; if heads comes up he stops, otherwise he passes the coin to the next person. The next person follows the same process: if heads comes up he stops, otherwise he passes the coin to the next person, and so on."

So the process can be modelled as,

$$ X = \begin{cases} 1, & \text{if $head$ occurs} \\ 1 + Y, & \text{if $tail$ occurs} \end{cases} $$

where $Y$ denotes the number of tosses made from the next person onward.

Similarly for $Y$, the equation can be written as $$ Y = \begin{cases} 1, & \text{if $head$ occurs} \\ 1 + Y', & \text{if $tail$ occurs} \end{cases} $$ where $Y'$ denotes the corresponding count for the person after $Y$, and so on.

We see that there is no difference between $X$ and $Y$: both toss the coin, stop if heads comes up, and otherwise pass the coin to the next person, who does the same. If the two processes were run separately and independently (i.e. $Y$ already had a coin and did not know it came from $X$), their average (expected) values would be the same.

Let's calculate the average value of $X$ using the formula for expected value. Here $p$ is the probability of heads and $q$ is the probability of tails. $$ E(X) = p \times 1 + q \times (1 + Y) \tag{1}\label{eq1} $$

Let's read the formula intuitively. Suppose we ask $X$: "What is the average number of tosses you need to get heads?" He answers: "With probability $p$, heads comes up on my first toss and I count $1$; otherwise, with probability $q$, I count $1$ plus however many tosses $Y$ had to make." And $Y$ says: "On average I had to make $E(Y)$ tosses to get heads." So $X$'s final statement is $1$ with probability $p$, and $1 + E(Y)$ with probability $q$.

So the equation reduces to

$$ \begin{align} &E(X) = p \times 1 + q \times (1 + E(Y)) \tag{2}\label{eq2} \\ \Rightarrow \ &E(X) = p \times 1 + q \times (1 + E(X)) && \text{since $E(X) = E(Y)$} \\ \Rightarrow \ &(1-q)\,E(X) = p + q = 1 \\ \Rightarrow \ &E(X) = 1/p \end{align} $$


Note that $\eqref{eq1}$ is very different from $\eqref{eq2}$.
$(2)$ can be solved as shown above, but $(1)$ just keeps unfolding, like $E(X) = 1 \times p + q \times (1 + 1) \times p + q \times (1 + Y') \times q$ and so on. Although this can also be reduced to $E(X) = 1 \times p + q \times (1 + E(X))$, I wanted to follow the process intuitively.


Since it can seem off to just substitute the average value of $Y$ into $(1)$, improvements with stronger reasoning are welcome.
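
One standard way to justify that substitution (an addition, not part of the original answer) is the law of total expectation: conditioning on the first toss,

$$E(X)=E(X\mid\text{heads})\Pr(\text{heads})+E(X\mid\text{tails})\Pr(\text{tails})=1\cdot p+(1+E(Y))\cdot q,$$

since, given a tail on the first toss, the remaining number of tosses has the same distribution as $Y$, and $E(Y)=E(X)$ because the two processes are identical in structure.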

Dhruv
  • 11
  • 1