I am not very good with probability theory and related stuff, so I would very much appreciate your help regarding a problem I have.
- Imagine arbitrarily picking an integer $n\in [1,N]$
- I want to guess $n$ by running a random number generator several times
- The random number generator spits out numbers $\in [1,N]$, with uniform distribution
- The random number generator does not keep track of previous guesses (i.e. it can produce the same number more than once)
My question is: how can I estimate how many times I have to run the random number generator, on average, before it hits the chosen number $n$?
I searched for an answer and found other posts, such as "Average number of guesses to guess a number between 1 and 1000?", but there the guesser gets hints (higher/lower) after each guess, so it's not really the same problem.
Thank you so much
EDIT: I wrote a Python script to do some experiments and get some intuition. It looks like it takes $N$ guesses on average, but I am not sure whether this holds in general, or why it is the case.
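(My tentative reasoning, which I'm not sure is right: each run hits $n$ with probability $p = 1/N$, independently of all previous runs, so the number of runs until the first hit should be geometrically distributed, with mean

$$E[X] = \sum_{k=1}^{\infty} k\,p\,(1-p)^{k-1} = \frac{1}{p} = N.$$

Does that argument hold?)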
Here is the script:
    import random
    import numpy

    noTests = 100000
    maxNumber = 100

    noGuesses = []
    targetNo = random.randint(1, maxNumber)  # the number to guess, fixed across all tests
    for i in range(noTests):
        guessNo = 1
        # keep running the generator until it hits the target
        while random.randint(1, maxNumber) != targetNo:
            guessNo = guessNo + 1
        noGuesses.append(guessNo)

    avg = numpy.mean(noGuesses)
    print("Average number of guesses to guess a number in [1,%d] is %.2f" % (maxNumber, avg))
    print("Estimated over %d tests" % noTests)
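As a cross-check (just a sketch, assuming numpy is available and my geometric-distribution guess is right), the same experiment can be run much faster by sampling the waiting times directly, since each independent run hits the target with probability $1/N$:

```python
import numpy as np

N = 100           # the generator draws uniformly from [1, N]
trials = 100_000  # number of simulated experiments

rng = np.random.default_rng(0)
# number of runs until the first success, success probability 1/N per run
guesses = rng.geometric(p=1.0 / N, size=trials)

print("Mean number of guesses:", guesses.mean())  # close to N
```

The mean comes out very close to $N = 100$, matching what the slower loop-based script reports.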