The algorithm is as follows:
    a = rand;        % a random number between 0 and 1
    b = a;
    while b == a     % keep drawing until b differs from a
        b = rand;
    end
Here rand is a function that returns a random number, generated uniformly, between 0 and 1. Let us say that this is the MATLAB function rand.
What is the time complexity of this algorithm?
It looks like the best-case and average-case complexities are $O(1)$, while the worst-case complexity is unbounded, since nothing stops rand from returning the value a over and over again.
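For a quick sanity check on the average case, here is a minimal MATLAB sketch (mine, not part of the original question) that estimates the mean number of loop iterations empirically; with a continuous uniform rand a repeated value is vanishingly unlikely, so the measured mean comes out essentially equal to 1. The names trials and iters are purely illustrative.

    trials = 1e5;                 % number of independent runs (illustrative)
    iters  = zeros(trials, 1);
    for t = 1:trials
        a = rand;
        b = a;
        k = 0;
        while b == a              % the loop under analysis
            b = rand;
            k = k + 1;
        end
        iters(t) = k;
    end
    mean(iters)                   % prints a value essentially equal to 1

This only illustrates the expected behaviour; the worst case remains unbounded in principle.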
What does rand do? In most actual programming languages, it gives a pseudorandom number, so there are actually deterministic bounds on exactly how long it will take to get back to the initial value. – David Richerby Jun 23 '16 at 15:05
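To make one part of that comment concrete, here is a small sketch under the assumption that rand is swapped for a generator with only N distinct, equally likely values (N = 16 is purely hypothetical); it does not model the period argument, only the fact that a real rand has finitely many possible outputs, so a repeat b == a actually occurs with probability 1/N per draw and the number of draws is geometrically distributed with mean N/(N-1).

    N = 16;                       % assumed number of distinct values (hypothetical)
    a = randi(N);                 % randi draws uniformly from 1..N
    b = a;
    k = 0;
    while b == a                  % repeats are now genuinely possible
        b = randi(N);
        k = k + 1;
    end
    fprintf('took %d draws; expected about %.3f\n', k, N/(N-1));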