
Let $(M,W)$ be a continuous probability space (take $M=(a,b)$), and let $x:M \rightarrow \mathbb{R}$ be a bounded random variable. Prove that

$ \inf_{t\in M} x(t) \le E(x) \le \sup_{t \in M} x(t)$

Hello. We only got a quick introduction to the elements of probability for our modelling course. We say that $W$ has two properties: $W(M)=1$, and $A \cap B = \emptyset \Rightarrow W(A \cup B)=W(A)+W(B)$. As an example of a continuous probability space we consider the case $M=(a,b)$. Here the probability density function (when it exists) is defined as $f(t):= \lim_{\Delta t \rightarrow 0} \frac{W((t, t + \Delta t))}{\Delta t}$, and for a real random variable $x:M \rightarrow \mathbb{R}$, $x=x(t)$, we define the expected value of $x$ (when the density function exists) as $E(x)=\int_a^b x(t) f(t)\, dt$.
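As a quick numerical sanity check of these definitions (my own illustration; the uniform density on $(0,1)$ and the choice $x(t)=t^2$ are assumed here, they are not from the course):

```python
# Sanity check of E(x) = ∫_a^b x(t) f(t) dt for an assumed example:
# M = (0, 1) with uniform density f(t) = 1 and x(t) = t**2.

def expected_value(x, f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of x(t) * f(t) over (a, b)."""
    h = (b - a) / n
    return sum(x(a + (i + 0.5) * h) * f(a + (i + 0.5) * h) for i in range(n)) * h

a, b = 0.0, 1.0
x = lambda t: t ** 2      # bounded random variable with inf = 0, sup = 1
f = lambda t: 1.0         # uniform density; integrates to 1 over (0, 1)

E = expected_value(x, f, a, b)
print(E)                  # ≈ 1/3, so inf x ≤ E(x) ≤ sup x holds in this example
```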

Back to the exercise: since $x$ is bounded, $\exists K \ge 0: -K \le x(t) \le K \ \forall t \in M$.

It follows that $-K \le \inf_{t \in M} x(t) \le x(t) \le \sup_{t\in M} x(t) \le K \ \forall t \in M$.

So, multiplying by $f(t) \ge 0$ and integrating, $\inf_{t \in M} x(t) \int_a^b\lim_{\Delta t \rightarrow 0} \frac{W((t, t + \Delta t))}{\Delta t}\,dt \le E(x) \le \sup_{t\in M} x(t)\int_a^b\lim_{\Delta t \rightarrow 0} \frac{W((t, t + \Delta t))}{\Delta t}\,dt$

We never discussed that the integral of the density function over the whole interval should be $1$; we only defined it as above.

Now I must prove: $\int_a^b\lim_{\Delta t \rightarrow 0} \frac{W((t, t + \Delta t))}{\Delta t}\,dt=1$

For this I want to use the fundamental theorem of calculus: $\lim_{\Delta t \rightarrow 0} \frac{W((t,t+\Delta t))}{\Delta t}= \lim_{\Delta t \rightarrow 0} \frac{1- W((a,t] \cup [t+ \Delta t,b))}{\Delta t}= \lim_{\Delta t \rightarrow 0} \frac{1- W((a,t]) - W([t+ \Delta t,b))}{\Delta t}=\lim_{\Delta t \rightarrow 0} \frac{W((a,t + \Delta t))-W((a,t])}{\Delta t}$

Now I define $p:t \mapsto W((a,t))$.

Fundamental theorem of calculus: $\int_a^b p'(t)\, dt = p(b)-p(a)=W((a,b))-W((a,a))=1- W((a,a))$
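This step can be checked numerically under an assumed concrete measure (density $f(t)=2t$ on $(0,1)$, so $p(t)=t^2$; the example is mine, not part of the exercise):

```python
# Sanity check of ∫_a^b p'(t) dt = p(b) - p(a) for p(t) = W((a, t)).
# Assumed example: W with density f(t) = 2t on (0, 1), so p(t) = t**2.

def p(t):
    return t ** 2            # W((0, t)) for the assumed measure

def p_prime(t, dt=1e-6):
    """Central-difference approximation of the derivative p'(t)."""
    return (p(t + dt) - p(t - dt)) / (2 * dt)

a, b, n = 0.0, 1.0, 10_000
h = (b - a) / n
integral = sum(p_prime(a + (i + 0.5) * h) for i in range(n)) * h
print(integral)              # ≈ p(1) - p(0) = 1
```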

But my problem is that I must show that the probability of a single point is $0$. How can I do this?

  • What you are trying to prove is simply the fundamental theorem of calculus (the Newton–Leibniz theorem); for a proof, see any calculus textbook, or wikipedia – Zoran Loncarevic Apr 16 '16 at 11:34
  • I edited my post with your hint! My problem is that I must show in this way that the probability of a single point is $0$ in a continuous probability space. How can I do this with only the knowledge above? – Lauren Veganer Apr 16 '16 at 13:35
  • Now, you want to prove that $F(x)=W(a,x)=\int_a^x f(t) dt$ is a continuous function; again, see any calculus textbook, or this answer, for example. – Zoran Loncarevic Apr 16 '16 at 14:17

1 Answer


By definition, probability measure $W$ has density $f: [a,b] \to \mathbb R$ if and only if $$W(a,x)=\int_a^x f(t) dt,\quad x \in (a,b)$$

Your course may be using a slightly different (but equivalent) definition of the density of a probability measure; your question does not contain this definition, but see more about this at the end of this answer.

If probability measure $W$ does have density $f$, then from this definition, it immediately follows that $$\int_a^b f(t) dt=W(a,b)=1$$

Let us prove, nevertheless, that $W(\{t\})=0$ for every $t \in (a,b)$ whenever probability measure $W$ has density $f$. Well, then $$W(\{t\})=\int_a^b \delta_t(x) f(x)\, dx=0$$ where $\delta_t(x)$ is $1$ when $x=t$ and zero otherwise. Why did I previously mention that the function $$F(x)=W(a,x)=\int_a^x f(t) dt$$ is continuous on $(a,b)$? I was thinking about the following proof that uses the $\sigma$-additivity of probability measures.

You said that all you know about probability measure $W$ is that

  1. $W(a,b)=1$
  2. $A \cap B=\emptyset \implies W(A \cup B)=W(A)+W(B)$

Actually, properties 1-2 are not enough for $W$ to be a probability measure; you need something more, namely countable additivity (also called $\sigma$-additivity):

  3. $A_i \cap A_j = \emptyset, \ i \neq j \implies W ( \bigcup_{i = 1}^{\infty} A_i) = \sum_{i = 1}^{\infty} W (A_i)$

for any sequence $A_1,A_2,\ldots$ of measurable sets. I will use this property of probability measure $W$ to prove that $W(\{t\})=0$. Indeed, if $t_1,t_2,\ldots$ is a decreasing sequence such that $t_i \to 0$ and $t+t_1 < b$, we can write the singleton set $\{t\}$ as $$ \{t\}= \bigcap_{i = 1}^{\infty} [t, t + t_i) = [t, b) \backslash \bigcup_{i = 1}^{\infty} [t + t_i, b) = [t, t + t_1) \backslash \bigcup_{i = 1}^{\infty} [t + t_{i + 1}, t + t_i) $$

The sets forming the last union above are disjoint, and from 3. we have that $$ W\{t\}= W [t, t + t_1) - \sum_{i = 1}^{\infty} W [t + t_{i + 1}, t + t_i) = 0 $$ since, by the continuity of $F$, $$\begin{aligned} \sum_{i = 1}^k W [t + t_{i + 1}, t + t_i) &= W [t + t_{k + 1}, t + t_1) \\ &= F (t + t_1) - F (t + t_{k + 1}) \to W [t, t + t_1) \end{aligned}$$
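The telescoping sum in the last display can be checked numerically; here is a minimal sketch assuming the uniform measure on $(0,1)$ (so $F(s)=s$), with $t=0.25$ and $t_i = 2^{-(i+2)}$, all of which are my own choices:

```python
# Check of Σ_{i=1}^k W[t+t_{i+1}, t+t_i) = F(t+t_1) - F(t+t_{k+1}) → W[t, t+t_1).
# Assumed example: W uniform on (0, 1), so F(s) = s and W[u, v) = v - u;
# take t = 0.25 and t_i = 2**-(i+2), so that t + t_1 < 1.

F = lambda s: s                      # CDF of the uniform measure
W = lambda u, v: F(v) - F(u)         # W[u, v) for this measure
t = 0.25
ts = [2.0 ** -(i + 2) for i in range(1, 60)]   # t_1 = 0.125, t_2, ...

k = 50
partial = sum(W(t + ts[i + 1], t + ts[i]) for i in range(k))
print(partial)                       # telescopes to t_1 - t_{k+1} ≈ 0.125
print(W(t, t + ts[0]))               # W[t, t+t_1) = t_1 = 0.125
```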

Let us now return to the definition of density. If the function $F(x)$ is differentiable on $(a,b)$, then probability measure $W$, by the fundamental theorem of calculus, has density $$f(t)=F'(t)=\lim_{\Delta t \to 0} \frac{F(t+\Delta t)-F(t)}{\Delta t} =\lim_{\Delta t \to 0} \frac{W[t,t+\Delta t)}{\Delta t} =\lim_{\Delta t \to 0} \frac{W(t,t+\Delta t)}{\Delta t}$$ The last equation holds because we now know that $W(\{t\})=0$. From the question, I got the impression that you were using the last limit as the definition of density, but the existence of this limit for every $t \in (a,b)$ does not necessarily mean that the measure $W$ has a density. For example, if we define $W$ to be the Dirac measure $$W(A)=\begin{cases} 1, & \text{if}\ \frac{1}{2} \in A \\ 0, & \text{otherwise} \end{cases}$$ then $W(t,t+\Delta t)=0$ for every $t \in (a,b)$ and sufficiently small $\Delta t$, so the limit above exists for every $t \in (a,b)$ and is equal to zero, but this measure does not have a density.
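The Dirac counterexample can also be probed numerically; in this sketch (the sample point $t=0.3$ and the $\Delta t$ grid are my own illustrative choices) the difference quotient vanishes at a point $t \neq \frac12$ even though $W$ is not given by a density:

```python
# The Dirac measure at 1/2 on (0, 1), as in the answer: the limit defining
# the "density" exists and equals 0 at every fixed t != 1/2, yet no density
# exists, since such an f would integrate to 0, not 1.

def W(u, v):
    """W((u, v)) for the Dirac measure concentrated at 1/2 (open interval)."""
    return 1.0 if u < 0.5 < v else 0.0

t = 0.3
quotients = [W(t, t + dt) / dt for dt in (1e-1, 1e-3, 1e-6)]
print(quotients)          # all 0.0: the difference quotient is already 0
print(W(0.4, 0.6))        # 1.0: the measure itself is certainly not zero
```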

  • Thanks! I'm not sure if I understand the structure of the proof right. To sum up without details: 1) Define $F: t \mapsto W((a,t))$, 2) Under the assumption that $F$ is differentiable: $F'(t)=\lim_{\Delta t \rightarrow 0} \frac{F(t + \Delta t) - F(t)}{\Delta t}= \lim_{\Delta t \rightarrow 0} \frac{W([t, t+ \Delta t))}{\Delta t}= \lim_{\Delta t \rightarrow 0} \frac{W((t, t+ \Delta t))}{\Delta t} =f(t)$. 3) $\int_a^b f(t) dt= F(b)-F(a)=W((a,b))-W((a,a))=1- W(\emptyset)=1$ – Lauren Veganer Apr 18 '16 at 00:29
  • @LaurenVeganer I'm afraid my previous answer was not very clear (I was in a hurry yesterday, when I wrote that). I edited the answer to improve it - take a look at the new version. – Zoran Loncarevic Apr 18 '16 at 15:24
  • Thanks for everything. But we define the last limit as the definition of density (the one you showed we defined). We don't have the first definition which you gave me! It isn't a probability course and shouldn't be general, only enough that we can apply it to our models. We call the function $f(t)= \lim_{\Delta t \rightarrow 0} \frac{W((t, t+\Delta t))}{\Delta t}$, if it exists, the density function of the probability. – Lauren Veganer Apr 18 '16 at 16:53
  • Shouldn't the index on the sums/unions be $i$? – Lauren Veganer Apr 18 '16 at 17:12
  • Sorry, one question: how did you get these two steps: $\bigcap_{i=1}^{\infty} [t, t + t_i)=[t,b)\setminus \bigcup_{i=1}^\infty [t+t_i,b)=[t,t+t_1) \setminus \bigcup_{i=1}^\infty [t + t_{i+1}, t+ t_i)$? It isn't De Morgan's law. – Lauren Veganer Apr 18 '16 at 17:32
  • @LaurenVeganer The first equation is De Morgan: $\bigcap I_i=[t,b) \backslash \bigcup ([t,b) \backslash I_i)$; in the second I changed $[t+t_{i+1},b)$ into $[t+t_{i+1},t+t_i)$ to make the union disjoint, since the $[t+t_i,b)$ part was already included in the previous members of the union. – Zoran Loncarevic Apr 18 '16 at 17:54
  • Thanks. Am I right that your proof shows that the density function is the defined limit when the function $F$ is differentiable? – Lauren Veganer Apr 18 '16 at 19:32
  • @LaurenVeganer Yes, that's right. – Zoran Loncarevic Apr 18 '16 at 22:25