
Consider iid random variables $(X_j)_{j\in\mathbb{N}_0}$ uniformly distributed on $[0,1]$. For $j\in\mathbb{N}$ define $V_j:=X_0X_j$ and the recursively defined estimator $W_j:=\max (W_{j-1},V_j)$ with $W_0:=0$.

I want to compute

1. $E[X_0|\mathcal{F}_j^V]$ with $\mathcal{F}_j^V:=\sigma(V_1,\ldots,V_j)$, the MMSE estimator,

2. the mean square error $E[(X_0-W_j)^2]$,

3. the convergence rate of $E[(X_0-W_j)^2]$.

My attempt:

1. This is the part I struggle with most. I have done the following:

From the product distribution of two independent uniform random variables, it holds that $f_{V_1}(x)=-\log(x)$ for $x\in(0,1)$. Then for $[a,b]\subset(0,1)$, I find

$$ \begin{aligned} E[X_01_{X_0\in[a,b]}] =\int_a^bxdx \end{aligned} $$

and

$$ \begin{aligned} E[-\frac{V_1}{\log(V_1)}1_{V_1\in[a,b]}]=\int_a^b-\frac{x}{\log(x)}(-\log(x))dx=\int_a^bxdx \end{aligned} $$

I am not sure if this is even useful, or how to connect the integrals such that $1_A$ shows up on both sides for $A\in\sigma(V_1,\ldots, V_j)$. Does the MMSE estimator even coincide with the conditional expectation?
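As a quick sanity check on the density claim $f_{V_1}(x)=-\log(x)$, one can compare the empirical CDF of a simulated product of uniforms against $F(x)=x-x\log(x)$. This is only a numerical sketch I wrote for verification (NumPy assumed):

```python
import numpy as np

# Check: the density of V_1 = X_0 * X_1 for independent U[0,1] variables
# should be f(x) = -log(x), hence the CDF is F(x) = x - x*log(x) on (0,1).
rng = np.random.default_rng(0)
n = 200_000
v = rng.uniform(size=n) * rng.uniform(size=n)

x = np.linspace(0.01, 0.99, 99)
ecdf = np.searchsorted(np.sort(v), x) / n   # empirical CDF of V_1 at x
cdf = x - x * np.log(x)                     # claimed CDF

max_dev = np.max(np.abs(ecdf - cdf))
print(max_dev)  # should be on the order of 1/sqrt(n)
```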

2. With this part I am quite confident.

By observation, I find $$W_j=X_0\max (X_1,\ldots, X_j),$$ and by independence of $g(X_0)=X_0^2$ and $f(X_1,\ldots,X_j)=(1-\max(X_1,\ldots, X_j))^2$, I find $$E[(X_0-W_j)^2]=E[X_0^2(1-\max (X_1,\ldots, X_j))^2]=E[X_0^2]E[(1-\max (X_1,\ldots, X_j))^2].$$

Using the distribution of $\max\{X_1,\ldots,X_j\}$ for iid uniforms (its density is $jm^{j-1}$ on $[0,1]$), I get $E[(1-\max(X_1,\ldots,X_j))^2]=\frac{2}{(j+1)(j+2)}$, and with $E[X_0^2]=\frac{1}{3}$ I end up with

$$E[(X_0-W_j)^2]=\frac{2}{3}\cdot\frac{1}{(j+1)(j+2)}.$$
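The closed form can be sanity-checked by simulation; this is a sketch of my own (NumPy assumed), with the constant worked out as $2/3$ from $E[X_0^2]=1/3$ and $E[(1-\max(X_1,\ldots,X_j))^2]=2/((j+1)(j+2))$:

```python
import numpy as np

# Monte Carlo check of E[(X_0 - W_j)^2] = 2 / (3 (j+1) (j+2)), here for j = 3.
rng = np.random.default_rng(1)
j, n = 3, 500_000
x0 = rng.uniform(size=n)
xs = rng.uniform(size=(n, j))
w = x0 * xs.max(axis=1)              # W_j = X_0 * max(X_1, ..., X_j)

mse_sim = np.mean((x0 - w) ** 2)
mse_formula = 2 / (3 * (j + 1) * (j + 2))
print(mse_sim, mse_formula)          # the two values should nearly agree
```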

3. The rate of convergence is therefore of order $j^{-2}$.

It would be great if someone could check this over. I can add further details if necessary!

Any hint or help is appreciated! Thank you in advance!

user408858
  • For (1), your indicator $1_{[a,b]}$ is meaningless/ambiguous. More pointedly, in the same equation you have used it to mean both $1_{X_0 \in [a,b]}$ (alt., $1_{X_0^{-1}([a,b])}$) and $1_{V_1 \in [a,b]}$ (alt., $1_{V_1^{-1}([a,b])}$) . These are not the same at all. – Brian Moehring Oct 24 '23 at 21:57
  • Right, thank you for pointing that out! In this case, I most probably have to think about some other candidate. – user408858 Oct 24 '23 at 22:16

1 Answer


The conditional expectation must be a function of $X_0X_1,...,X_0X_n$. In other words, there must exist a measurable function $g:[0,1]^n\to\mathbb{R}$ such that $$\mathbb{E}(X_0|X_0X_1,...,X_0X_n)=g(X_0X_1,...,X_0X_n)$$

and for every bounded measurable function $f:[0,1]^n\to\mathbb{R}$, we must have:

$$\mathbb{E}\big(f(X_0X_1,...,X_0X_n)(X_0-g(X_0X_1,...,X_0X_n))\big)=0 \tag{1}$$

We transform the LHS of $(1)$ by conditioning on $X_1,...,X_n$ (since $X_0$ is uniform on $[0,1]$ and independent of $X_1,...,X_n$, the inner conditional expectation becomes an integral over $t\in[0,1]$) and then interchanging the integral and the expectation. $$\begin{align} 0 &= \mathbb{E}\big(f(X_0X_1,...,X_0X_n)(X_0-g(X_0X_1,...,X_0X_n))\big)\\ &=\mathbb{E}\big(\color{blue}{\mathbb{E}\big(f(X_0X_1,...,X_0X_n)(X_0-g(X_0X_1,...,X_0X_n))\,\big|\,X_1,...,X_n\big)}\big)\\ &=\mathbb{E}\left(\color{blue}{\int_0^1f(tX_1,...,tX_n)(t-g(tX_1,...,tX_n))\,dt}\right)\\ &=\int_0^1\mathbb{E}\left(f(tX_1,...,tX_n)(t-g(tX_1,...,tX_n))\right)dt \hspace{0.3cm}\text{(interchange integral and expectation)}\\ &=\int_0^1\left(\int_{[0,1]^n}f(tx_1,...,tx_n)(t-g(tx_1,...,tx_n))\,dx_1\cdots dx_n\right)dt\\ &=\int_0^1\left(\frac{1}{t^n}\int_{[0,1]^n}f(tx_1,...,tx_n)(t-g(tx_1,...,tx_n))\,d(tx_1)\cdots d(tx_n)\right)dt\tag{2} \end{align}$$ Make a change of variables $(y_1,...,y_n) = (tx_1,...,tx_n)$ in the inner integral of $(2)$; we have: $$\begin{align} 0 &=\int_0^1\left(\frac{1}{t^n}\int_{[0,t]^n}f(y_1,...,y_n)(t-g(y_1,...,y_n))\,dy_1\cdots dy_n\right)dt\\ &=\int_0^1\left(\frac{1}{t^n}\int_{[0,1]^n}\left( \prod_{i=1}^n \mathbf{1}_{\{y_i \le t \}}\right)f(y_1,...,y_n)(t-g(y_1,...,y_n))\,dy_1\cdots dy_n\right)dt\\ &=\int_{[0,1]^n}\left(\int_0^1\frac{1}{t^n}\left( \prod_{i=1}^n \mathbf{1}_{\{y_i \le t \}}\right)f(y_1,...,y_n)(t-g(y_1,...,y_n))\,dt\right)dy_1\cdots dy_n \\ &=\int_{[0,1]^n}f(y_1,...,y_n)\left(\color{red}{\int_0^1\frac{1}{t^n}\left( \prod_{i=1}^n \mathbf{1}_{\{y_i \le t \}}\right)(t-g(y_1,...,y_n))\,dt}\right)dy_1\cdots dy_n \tag{3} \end{align}$$

$(3)$ holds for every such function $f$, so we must have, for (almost) every $(y_1,...,y_n)\in [0,1]^n$: $$\int_0^1\left(\frac{1}{t^n}\left( \prod_{i=1}^n \mathbf{1}_{\{y_i \le t \}}\right)(t-g(y_1,...,y_n))\right)dt=0 \tag{4}$$

We notice that $\prod_{i=1}^n \mathbf{1}_{\{y_i \le t \}} = \mathbf{1}_{\{t \ge \max\{y_1,...,y_n\} \}}$. From $(4)$, we have: $$\begin{align} 0&=\int_0^1\left(\frac{1}{t^n}\left( \prod_{i=1}^n \mathbf{1}_{\{y_i \le t \}}\right)(t-g(y_1,...,y_n))\right)dt\\ &=\int_0^1\mathbf{1}_{\{t \ge \max\{y_1,...,y_n\} \}}\,\frac{1}{t^n}\,(t-g(y_1,...,y_n))\,dt\\ &=\int_{\max\{y_1,...,y_n\}}^1\left(\frac{1}{t^{n-1}}-\frac{g(y_1,...,y_n)}{t^{n}}\right)dt \tag{5} \end{align}$$

From $(5)$, we deduce that: $$\color{red}{g(y_1,...,y_n) = \frac{\int_{\max\{y_1,...,y_n\}}^1 t^{-(n-1)}\,dt}{\int_{\max\{y_1,...,y_n\}}^1 t^{-n}\,dt} = \frac{n-1}{n-2}\cdot \frac{\left( \max\{y_1,...,y_n\}\right)^{2-n} -1 }{\left( \max\{y_1,...,y_n\}\right)^{1-n} -1} \tag{6}} $$ (The closed form on the right assumes $n\ge 3$; for $n=1,2$ the antiderivatives produce logarithms, but the ratio of integrals on the left still gives $g$.)

We can conclude that: $$\mathbb{E}(X_0|X_0X_1,...,X_0X_n)=g(X_0X_1,...,X_0X_n)$$ where $g$ is defined by $\color{red}{(6)}$.
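A numerical sanity check of this answer (my own sketch, NumPy assumed): since the conditional expectation is the MMSE estimator and the naive estimator $W_n=\max(V_1,\ldots,V_n)$ is itself a function of the data, $g$ should achieve a strictly smaller mean squared error than $W_n$:

```python
import numpy as np

# Compare the MSE of g(W_n) from formula (6) (valid for n >= 3) against the
# MSE of the naive estimator W_n = max(V_1, ..., V_n) = X_0 * max(X_1, ..., X_n).
rng = np.random.default_rng(2)
n_vars, n_samples = 5, 400_000
x0 = rng.uniform(size=n_samples)
xs = rng.uniform(size=(n_samples, n_vars))
w = x0 * xs.max(axis=1)              # W_n, the max of the products V_i

def g(m, n):
    # formula (6); m = max{y_1, ..., y_n}, assumes n >= 3
    return (n - 1) / (n - 2) * (m ** (2 - n) - 1) / (m ** (1 - n) - 1)

err_g = np.mean((x0 - g(w, n_vars)) ** 2)
err_w = np.mean((x0 - w) ** 2)
print(err_g, err_w)                  # err_g should be strictly smaller
```

Note that $g$ depends on the sample only through $W_n=\max\{V_1,\ldots,V_n\}$, so it can be evaluated directly on `w`.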

NN2
  • Thank you for your answer! What do you exactly mean with $f(x_1,\ldots, x_n)\in \mathcal{F}_n^V$? – user408858 Oct 25 '23 at 00:33
  • Oh, you mean any $\sigma(X_0X_1,\ldots, X_0X_n)$-measurable random variable $Z$ can be written in the form $Z=f(X_0X_1,\ldots, X_0X_n)$ for some measurable $f:\mathbb{R}^n\rightarrow \mathbb{R}$? – user408858 Oct 25 '23 at 00:52
  • @user408858 exactly – NN2 Oct 25 '23 at 04:09