1

For this question, let us define "closed form" as an expression restricted to addition, subtraction, multiplication, and division; exponents and logarithms, including $e^x$ and $\ln{x}$; trigonometric functions and inverse trigonometric functions. Using an infinite number of terms or operations is not permitted.

True or false: For continuous random variable $X$, if $E(X)$ has a closed form, then $P(X<E(X))$ has a closed form.

I don't see why this would be true, but I can't think of a counter-example.

Context: $(1+u_1)(1+u_1 u_2)(1+u_1 u_2 u_3)...$, where the $u$'s are i.i.d. $\text{Uniform}(0,1)$-variables, has a closed form expectation, $e$. I have tried unsuccessfully to find the probability that it is greater than its expectation. I wonder, since the expectation has a closed form, should I, or shouldn't I, expect a closed form for the probability? (My definition of closed form does not allow limits, but admits $e$ as a closed form; this is not a contradiction.)
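Not part of the original question, but here is a quick Monte Carlo sketch of the product described above, truncating the infinite product (the partial products $u_1\cdots u_k$ shrink rapidly, so later factors are numerically $1$). It checks the stated expectation $e$ and estimates the probability in question empirically:

```python
import math
import random

random.seed(0)

def sample_product(depth=60):
    """One sample of (1+u1)(1+u1*u2)(1+u1*u2*u3)..., truncated."""
    prod, u = 1.0, 1.0
    for _ in range(depth):
        u *= random.random()      # u = u1*u2*...*uk
        prod *= 1.0 + u
        if u < 1e-18:             # remaining factors are numerically 1
            break
    return prod

n = 200_000
samples = [sample_product() for _ in range(n)]
mean = sum(samples) / n
p_above = sum(s > math.e for s in samples) / n
print(f"mean ≈ {mean:.4f} (claimed E = e ≈ {math.e:.4f})")
print(f"P(product > e) ≈ {p_above:.4f}")
```

The empirical mean should land near $e \approx 2.7183$; the probability estimate has no known closed form, which is the point of the question.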

Dan
    Even for finite distributions this seems like a big ask. The expected number of fixed points for a (uniformly) random permutation on $n$ letters is $1$, but the probability of $0$ fixed points is the ratio of the number of derangements to the total number...hardly a pleasant expression. – lulu Feb 23 '23 at 10:33
  • @lulu I thought in your example $P(X < E[X]) = P(X=0) = \dfrac{[n!/e]}{n!}$ where $[x]$ is rounding to the nearest integer. – Henry Feb 23 '23 at 11:32
  • @Henry Sure...does that count as a closed form? – lulu Feb 23 '23 at 11:34
  • @lulu I have no idea – Henry Feb 23 '23 at 11:34

2 Answers

5

This is my best attempt at a counter-example. I do not have one for your specific example, but as was already explained, proofs that no closed form exists are really, really hard. I think the best you can do is a proof like this, which reduces the probability to a quantity with a known lack of closed-form expression.

Take a standard log-normal distribution, which has mean $E(X) = e^{1/2} = \sqrt{e}$, closed-form according to your definition. Since $X$ is standard log-normally distributed, $Y = \log X$ is a standard Gaussian variable.

Now, $P(X < \sqrt{e}\,) = P(e^Y < \sqrt{e}\,) = P\!\left(Y < \tfrac{1}{2}\right) = \Phi\!\left(\tfrac{1}{2}\right) = \tfrac{1}{2}\left(1 + \operatorname{erf}\!\left(\tfrac{1}{2\sqrt{2}}\right)\right)$, which is a non-trivial value of the error function, and hence has no closed-form expression (from how I remember it, with the exception of $\operatorname{erf}(0)$, no closed-form input gives a closed-form output).
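This is not in the original answer, but the log-normal claim is easy to check numerically: sample $X = e^Y$ with $Y \sim N(0,1)$, estimate the mean and the probability of falling below it, and compare against the normal CDF $\Phi(t) = \tfrac12(1 + \operatorname{erf}(t/\sqrt{2}))$ via `math.erf`:

```python
import math
import random

random.seed(1)

# X = e^Y with Y ~ N(0,1): estimate E(X) and P(X < E(X)) empirically.
n = 200_000
xs = [math.exp(random.gauss(0.0, 1.0)) for _ in range(n)]
mean_X = sum(xs) / n                        # should approach e^{1/2} ≈ 1.6487
p_below = sum(x < mean_X for x in xs) / n

# Comparison value: P(X < m) = P(Y < log m) = Phi(log m),
# with Phi(t) = (1 + erf(t / sqrt(2))) / 2.
phi = 0.5 * (1.0 + math.erf(math.log(mean_X) / math.sqrt(2.0)))
print(f"mean ≈ {mean_X:.4f}, P(X < mean) ≈ {p_below:.4f}, Phi ≈ {phi:.4f}")
```

The two probability estimates should agree to within Monte Carlo error, near $\Phi(\tfrac12) \approx 0.6915$.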

student13
  • This answers my question (+1), thanks. (I'm not quite sure why you refer to it as a "best attempt", as if it could be better or something.) – Dan Feb 23 '23 at 12:31
-1

(self-answering)

Here is a counter-example.

Let $X=\sum\limits_{k=1}^\infty \left(\prod\limits_{i=1}^k a_i \right)$ where $a_i$ are i.i.d. uniform$(0,1)$-variables.

$E(X)=\sum\limits_{k=1}^\infty \left(\prod\limits_{i=1}^k E(a_i) \right)=\sum\limits_{k=1}^\infty \left(\prod\limits_{i=1}^k \frac12 \right)=1$

It can be shown that $P(X<E(X)) = P(X<1) = e^{-\gamma}$, where $\gamma$ is the Euler–Mascheroni constant; this has no closed form according to the definition in the OP.

(In the OP, my definition of closed form excludes $e^{-\gamma}$, but normally I would consider $e^{-\gamma}$ to be a closed form, which is probably why this counter-example didn't come to mind initially.)
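Not part of the original answer, but a quick Monte Carlo sketch of this counter-example, truncating the series once the partial products $a_1 \cdots a_k$ are negligible; for reference, $e^{-\gamma} \approx 0.5615$:

```python
import math
import random

random.seed(2)

def sample_X(tol=1e-12):
    """One sample of X = a1 + a1*a2 + a1*a2*a3 + ..., truncated."""
    total, prod = 0.0, 1.0
    while prod > tol:             # partial products shrink geometrically
        prod *= random.random()
        total += prod
    return total

n = 200_000
samples = [sample_X() for _ in range(n)]
mean = sum(samples) / n
p_below = sum(s < 1.0 for s in samples) / n
gamma = 0.5772156649015329        # Euler–Mascheroni constant
print(f"mean ≈ {mean:.4f} (expected 1)")
print(f"P(X < 1) ≈ {p_below:.4f}, e^(-gamma) ≈ {math.exp(-gamma):.4f}")
```

The empirical mean should be near $1$ and the probability near $e^{-\gamma} \approx 0.5615$.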

Dan