
Assume I take two real numbers, each drawn independently and uniformly at random from the range $[a,b]$, where $a, b\in \mathbb R$ and $a < b$. I then drop the lower of the two. What is the expected value of the remaining number?

All of the approaches I've seen to the version of this problem where $a, b\in \mathbb N$ have been brute-forced. A piece of software that calculates a similar scenario assumes the answer is $\frac{a}{3}+\frac{2b}{3}$, but after repeatedly simulating it on my own I'm convinced that answer is slightly off.
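Not the asker's code, but a minimal sketch of the kind of simulation described above, assuming Python with NumPy and two i.i.d. uniform draws on $[a,b]$ (the function name and trial count are illustrative):

```python
import numpy as np

def expected_max_uniform(a, b, n_trials=1_000_000, seed=0):
    """Estimate E[max(X, Y)] for X, Y i.i.d. uniform on [a, b] by Monte Carlo."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(a, b, size=n_trials)   # first draw of each pair
    y = rng.uniform(a, b, size=n_trials)   # second draw of each pair
    return np.maximum(x, y).mean()         # drop the lower of each pair, average the rest

# For a = 1, b = 20 this lands near 41/3 ≈ 13.67, i.e. a/3 + 2b/3.
print(expected_max_uniform(1, 20))
```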

OrdiNeu
  • Apologies if this question is too basic or my terminology is off -- I've forgotten most of my math classes from undergrad. Unsure if this is proper etiquette here, but the best related question I found is: Expectation of minimum of two random numbers, which is for natural numbers and fixes $a$ at $1$. – OrdiNeu Sep 08 '23 at 20:27
  • I think what you are looking for is the expected value of the maximum of two independent uniformly distributed random variables (i.e. $\mathbb{E}[\max\{X,Y\}]$ with $X,Y \sim U(a,b)$). If so, this post should answer your question. – GraffL Sep 08 '23 at 20:41
  • If you can solve the problem for the uniformly sampled interval $[0,1]$, the expected maximum of two numbers can be found for any finite interval. It isn't clear what being "brute-forced" means to you, but a straightforward derivation would involve integrating over the unit square. – hardmath Sep 08 '23 at 20:42
  • @GraffL ah yup, that's the answer I was looking for. Thanks! I guess there must be something wrong with my code -- my numerical approach for $a=1, b=20$ came out to something close but kept on being above $13.\overline{3}$, closer to $13.6$. @hardmath Oh, I see. However, I wasn't sure how to solve it over a uniformly sampled interval $[0, 1]$ either. – OrdiNeu Sep 08 '23 at 21:16 (a derivation sketch follows these comments)
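A sketch of the direct derivation hinted at in the comments, assuming $X, Y$ are i.i.d. uniform on $[a,b]$: the CDF of the maximum is the product of the two individual CDFs, and integrating against the resulting density recovers the formula quoted in the question.

$$
P(\max\{X,Y\}\le m)=\left(\frac{m-a}{b-a}\right)^{2},\qquad
f_{\max}(m)=\frac{2(m-a)}{(b-a)^{2}},\quad a\le m\le b,
$$
$$
\mathbb{E}[\max\{X,Y\}]=\int_a^b m\,\frac{2(m-a)}{(b-a)^{2}}\,dm
= a+\frac{2}{3}(b-a)=\frac{a}{3}+\frac{2b}{3}.
$$

For $a=1$ and $b=20$ this evaluates to $41/3\approx 13.67$.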

0 Answers