I am running a program whose runtime follows a normal distribution, $T_i \sim N(t,\sigma^2)$ (mean $t$, standard deviation $\sigma$). Now I run $n$ programs in parallel ($T_1, T_2, \cdots, T_n$ are i.i.d.), and the whole batch finishes when the last program finishes, i.e. $T' = \max(T_1, T_2, \cdots, T_n)$. What is the expectation of $T'$? I am asking because I encountered this problem in my computational work.
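A quick reduction to the standard-normal case (a sketch, assuming as above that $\sigma$ is the standard deviation): write $T_i = t + \sigma Z_i$ with $Z_i \sim N(0,1)$ i.i.d., so that
$$T' = \max_i (t + \sigma Z_i) = t + \sigma \max_{1 \le i \le n} Z_i, \qquad \mathbb{E}[T'] = t + \sigma\, \mathbb{E}\Big[\max_{1 \le i \le n} Z_i\Big].$$
The question therefore comes down to the expected maximum of $n$ standard normals, which has no simple closed form but satisfies $\mathbb{E}[\max_i Z_i] \leq \sqrt{2 \log n}$.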
- Does this answer your question? https://math.stackexchange.com/questions/89030/expectation-of-the-maximum-of-gaussian-random-variables – VTand May 05 '23 at 04:30
- That's what I am looking for. In my case the runtime is bounded by $\mathbb{E}(T') = \mathbb{E}(T'-t)+t \leq \sigma \sqrt{2 \log n}+t$, since $t$ is positive. – Eleven Chen May 05 '23 at 06:49
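A minimal Monte Carlo sketch of this bound (illustrative only; the values of $t$, $\sigma$, and $n$ below are hypothetical), comparing a simulated estimate of $\mathbb{E}[T']$ with $t + \sigma\sqrt{2\log n}$:

```python
import numpy as np

rng = np.random.default_rng(0)

t, sigma, n = 10.0, 2.0, 50   # hypothetical mean runtime, std dev, number of programs
trials = 100_000              # number of simulated parallel batches

# Each row is one batch of n i.i.d. runtimes T_i ~ N(t, sigma^2).
samples = rng.normal(loc=t, scale=sigma, size=(trials, n))

# The batch finish time T' is the slowest program in that batch.
t_prime = samples.max(axis=1)

estimate = t_prime.mean()
bound = t + sigma * np.sqrt(2 * np.log(n))
print(f"Monte Carlo E[T'] ~ {estimate:.3f}, bound t + sigma*sqrt(2 log n) = {bound:.3f}")
```

The simulated mean should sit below the bound, with the gap growing slowly as $n$ increases, since $\sqrt{2\log n}$ overshoots the true expected maximum for moderate $n$.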