I'm wondering if somebody could explain how my professor's solution for the expected number of coin flips until you get heads works.
He goes as follows:
Let N = the number of coin flips until you get heads. He then defines E[N] in terms of itself:
E[N] = 1 + 1/2 * E[N]
He says that the 1 comes from the fact that you need at least one coin flip for sure (the first flip), which I understand. Then he says that the 1/2 represents the probability of tails, and that the E[N] on the right-hand side of the equation represents the expected number of further coin flips needed in that case. He said that this is a "memoryless process, because when you start anew on the second coin flip after getting tails, it's as if you're at time 1 all over again."

I... don't really understand. Why exactly does the 1/2 have to be multiplied by E[N], and why exactly does an equation defined in terms of itself like that work? Why are we allowed to do this?
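The furthest I can get on my own is writing out what I *think* he means using the law of total expectation, conditioning on the first flip (I'm not sure this is exactly his argument):

$$E[N] = \underbrace{\tfrac{1}{2}}_{P(\text{heads})} \cdot 1 \;+\; \underbrace{\tfrac{1}{2}}_{P(\text{tails})} \cdot \bigl(1 + E[N]\bigr) \;=\; 1 + \tfrac{1}{2}E[N],$$

and then solving this linear equation gives $E[N] = 2$. So the algebra I can follow; it's the justification for conditioning like this that I'm stuck on.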
I actually found the proof in this answer (Expected value of the number of flips until the first head) easier to understand than whatever my professor was trying to do.
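To at least convince myself that $E[N] = 2$ is the right answer, I ran a quick Monte Carlo simulation (the code and names below are my own, not from the lecture):

```python
import random

def flips_until_heads() -> int:
    """Flip a fair coin until it lands heads; return the number of flips."""
    flips = 1
    while random.random() < 0.5:  # treat < 0.5 as tails, so flip again
        flips += 1
    return flips

trials = 1_000_000
average = sum(flips_until_heads() for _ in range(trials)) / trials
print(f"Estimated E[N] over {trials} trials: {average:.4f}")  # comes out near 2.0
```

The average does come out very close to 2, so the equation clearly gives the right value; I just don't understand *why* it's valid to set it up that way.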