
Suppose one has an Itô process of the form:

$$dX_t = b(X_t)dt + \sigma(X_t)dW_t$$

The following is an excerpt from Wikipedia (the original image is lost; it gave the definition of the infinitesimal generator,

$$Af(x) = \lim_{t \downarrow 0} \frac{\mathbb{E}^x[f(X_t)] - f(x)}{t},$$

together with the formula for an Itô diffusion,

$$Af(x) = \sum_i b_i(x)\frac{\partial f}{\partial x_i} + \frac{1}{2}\sum_{i,j}(\sigma\sigma^T)_{ij}(x)\frac{\partial^2 f}{\partial x_i \partial x_j}).$$

My question is how to derive this operator. It looks very similar to what one obtains from Itô's Lemma, so I start by applying Itô's Lemma to $f$:

$$df = \frac{\partial f}{\partial t}dt + \sum_i\frac{\partial f}{\partial x_i}dx_i + \frac{1}{2}\sum_{i,j}\frac{\partial^2 f}{\partial x_i \partial x_j}\,d[x_i,x_j]$$

Which then becomes:

$$df = \left[ \frac{\partial f}{\partial t} + \sum_i b_i(X_t)\frac{\partial f}{\partial x_i} + \frac{1}{2}\sum_{i,j}(\sigma\sigma^T)_{ij}\frac{\partial^2 f}{\partial x_i \partial x_j} \right]dt + \sum_i \sigma_i(X_t) \frac{\partial f}{\partial x_i}dW_t$$

Hopefully I got that correct. What I'm unsure about is how to proceed in order to compute $Af(x)$. I would think the next step is to integrate $df$, but it's not clear to me what happens after (I know these infinitesimal generators are rooted in semigroup theory, but I have very little experience in that).
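(As a numerical aside: the operator can be checked against its probabilistic definition $Af(x) = \lim_{t\to 0}\frac{\mathbb{E}^x f(X_t)-f(x)}{t}$ by simulation. The sketch below uses my own illustrative 1-D choices, not anything from the question: drift $b(x)=-x$, constant diffusion $\sigma(x)=0.5$, and test function $f(x)=x^2$, for which $Af(x) = 2x\,b(x) + \sigma(x)^2$.)

```python
import numpy as np

# Illustrative 1-D example (my own choices, not from the question):
#   dX_t = b(X_t) dt + sigma(X_t) dW_t  with  b(x) = -x,  sigma(x) = 0.5,
# and test function f(x) = x^2, so Af(x) = b(x) f'(x) + (1/2) sigma(x)^2 f''(x).
def b(x):
    return -x

def sigma(x):
    return 0.5 * np.ones_like(x)

def f(x):
    return x ** 2

def generator_exact(x):
    # Af = b f' + (1/2) sigma^2 f''  for f(x) = x^2
    return b(x) * 2.0 * x + 0.5 * sigma(x) ** 2 * 2.0

def generator_mc(x0, t=0.05, n_steps=50, n_paths=500_000, seed=0):
    """Estimate (E^x[f(X_t)] - f(x)) / t by Euler-Maruyama simulation."""
    rng = np.random.default_rng(seed)
    dt = t / n_steps
    X = np.full(n_paths, float(x0))
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=n_paths)
        X = X + b(X) * dt + sigma(X) * dW
    return (f(X).mean() - f(x0)) / t

print(generator_exact(1.0))  # -1.75
print(generator_mc(1.0))     # close to -1.75 (a finite-t bias of order t remains)
```

Since $t$ is small but fixed, the Monte Carlo estimate carries an $O(t)$ bias; shrinking $t$ trades that bias for more simulation noise.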

Brenton

1 Answer


Hints:

  1. Note that $f$ does not depend on time $t$; therefore the term $\frac{\partial}{\partial t} f$ is superfluous.
  2. Take expectation on both sides, then the stochastic integral $\dots dW_t$ vanishes, because it is a martingale.
  3. Use Fubini's theorem and the fundamental theorem of calculus, $$\frac{1}{t} \int_0^t \mathbb{E}^xg(X_s) \, ds \stackrel{t \to 0}{\to} \mathbb{E}^xg(X_0)= g(x).$$
saz
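Carrying out the three hints (a sketch; assume $f \in C^2$ with compact support and no explicit $t$-dependence, so that the stochastic integral is a true martingale): by Itô's formula and taking $\mathbb{E}^x$ on both sides,

$$\mathbb{E}^x f(X_t) - f(x) = \mathbb{E}^x \int_0^t \left( \sum_i b_i(X_s)\frac{\partial f}{\partial x_i}(X_s) + \frac{1}{2}\sum_{i,j} (\sigma\sigma^T)_{ij}(X_s)\frac{\partial^2 f}{\partial x_i \partial x_j}(X_s) \right) ds = \int_0^t \mathbb{E}^x\big[(Af)(X_s)\big]\, ds,$$

where the last equality is Fubini. Dividing by $t$ and letting $t \to 0$, the fundamental theorem of calculus gives

$$\lim_{t \to 0} \frac{\mathbb{E}^x f(X_t) - f(x)}{t} = \mathbb{E}^x\big[(Af)(X_0)\big] = (Af)(x).$$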
  • Thanks for the help! Doing steps 1 and 2 gives me this: $E[f] = E\left[ \int_0^t \left[ \sum_i b_i(X_t)\frac{\partial f}{\partial x_i}dt + \frac{1}{2}\sum_{i,j}(\sigma\sigma^T)_{i,j}\frac{\partial^2 f}{\partial x_i \partial x_j} \right] ds \right]$. I'm not sure I'm understanding how to use Step 3 though. – Brenton Jul 03 '15 at 15:39
  • 1
    @Brenton ... because you didn't write up Itô's formula properly. You can use this shorthand notation if you know what you are doing, but at the beginning it's much better to write it out in full. By Itô's formula, $$f(X_t)-f(X_0) = \int_0^t \dots \, dW_s + \int_0^t \dots \, ds.$$ Fill in the "$\dots$" and then take the expectation on both sides. Then you'll see that e.g. the left-hand side in your comment reads $\mathbb{E}^x f(X_t)-\mathbb{E}^x f(X_0)$. – saz Jul 03 '15 at 16:27
  • So if I do it this way (to my understanding since I've never taken an SDE course):

    $$f(X_t) = f(X_0) + \int_0^t \sum_i \frac{\partial f}{\partial x_i}b_i + \frac{1}{2}\sum_{i,j}\frac{\partial^2 f}{\partial x_ix_j}(\sigma \sigma^T)_{ij} ds + \int_0^t \sum_i \frac{\partial f}{\partial x_i}\sigma_i dW_s$$

    So taking expectations should give:

    $$E^x \left[f(X_t)\right] = E^x \left[f(X_0)\right] + E\left[\int_0^t \sum_i \frac{\partial f}{\partial x_i}b_i + \frac{1}{2}\sum_{i,j}\frac{\partial^2f}{\partial x_ix_j}(\sigma \sigma^T)_{ij} ds\right]$$

    I feel like I made a mistake

    – Brenton Jul 03 '15 at 17:38
  • @Brenton Actually, it should read $$f(X_t) = f(X_0) + \int_0^t \left( \sum_i \frac{\partial f(X_s)}{\partial x_i}b_i(X_s) + \frac{1}{2}\sum_{i,j}\frac{\partial^2 f(X_s)}{\partial x_i \partial x_j}(\sigma(X_s) \sigma(X_s)^T)_{ij} \right) ds + \int_0^t \sum_i \frac{\partial f(X_s)}{\partial x_i}\sigma_i(X_s) dW_s.$$ – saz Jul 03 '15 at 17:57
  • Ok that looks essentially what I have. So taking expectations should give the following:

    $$E^x[f(X_t)] = f(X_0) + E\left[ \int_0^t A(X_s) ds \right]$$ where $A(X_s)$ is everything in the $ds$ integral above.

    By Fubini, I should have: $$E^x[f(X_t)] = f(X_0) + \int_0^t E\left[A(X_s)\right] ds $$.

    Is that correct?

    – Brenton Jul 05 '15 at 00:24
  • @Brenton Yeah; however, I recommend writing $f(x)$ instead of $f(X_0)$ (as long as you keep in mind that $X_0 = x$ $\mathbb{P}^x$-almost surely, there is nothing wrong with the equation in your previous comment, but you have to be more careful if you want to play around with $x$). – saz Jul 05 '15 at 05:19
  • Ah ok. I was not aware that $f(X_0) = f(x)$ a.s. That might've been part of my confusion. So I should have that:

    $$LV(x) = \lim_{t \rightarrow 0} \frac{1}{t} \int_0^t E[A(X_s)] ds$$

    My last question is why, as $t \rightarrow 0$, the expression tends to $E[A(X_0)] = A(x)$. I feel like this is an elementary calculus fact (I found it here: http://www.mecca.org/~halfacre/math/lesson29.htm but I don't really understand their reasoning for the last step in the proof, and it's been about 8 years since I took calculus). I appreciate the help a lot.

    – Brenton Jul 05 '15 at 05:34
  • @Brenton By the fundamental theorem of calculus, $$\frac{1}{t} \int_0^t g(s) \, ds \to g(0)$$ as $t \to 0$ for any function $g$ which is continuous at $0$. Use this for $g(s) := \mathbb{E}^x[Af(X_s)]$. (Don't forget the $x$; it really makes a difference whether you write $\mathbb{E}$ or $\mathbb{E}^x$.) – saz Jul 05 '15 at 05:38
  • Ah ok yes I see why now (mainly needed a calculus review haha). So when they write $f(x)$, it can be thought of as $f(X_0)$ where $X_0 = x$ a.s.? – Brenton Jul 05 '15 at 07:15
  • @Brenton $X_0 = x$ $\mathbb{P}^x$-almost surely implies $f(x) = f(X_0)$ $\mathbb{P}^x$-almost surely; yes. – saz Jul 05 '15 at 08:02
  • Thank you very much. I've seen $LV(x)$ used in some papers sometimes to make calculations easier. I'm curious why else the infinitesimal generator is important, like what else it tells you about an SDE. I know for Brownian Motion, the infinitesimal generator is used to solve the heat equation, but I'd imagine it has more uses – Brenton Jul 05 '15 at 15:40
  • @Brenton You might have a look at this question: http://math.stackexchange.com/q/694227/ My answer concentrates on Brownian motion, but actually all these things can be used/extended for a much larger class of processes (including solutions of SDEs). – saz Jul 05 '15 at 15:46
  • Thank you for this. You've really helped me a lot. I'm going to go accept your answer. – Brenton Jul 05 '15 at 16:17
  • Hmm, it's still not totally clear. You said one can apply the fundamental theorem of calculus if $g$ is continuous at $0$, but then you didn't argue why $g(s) = \Bbb E^{x}[Af(X_{s})]$ is continuous at $0$. Why is this $g$ continuous at $0$? – layman Dec 01 '16 at 23:44
  • 2
    @user46944 Note that $(X_t)_{t \geq 0}$ has continuous sample paths and $y \mapsto Af(y)$ is continuous (for any "nice" $f$). Therefore the continuity of $s \mapsto \mathbb{E}^x (Af(X_s))$ is a direct consequence of the dominated convergence theorem. – saz Dec 02 '16 at 06:45
  • @saz How do you know that $\mathbb E\left[\int_0^t \sigma (X_s) f'(X_s) d W_s\right]$ is a martingale? This is not the case in general. https://almostsuremath.com/2010/08/16/failure-of-the-martingale-property/ – Matheus Manzatto Mar 19 '21 at 12:42
  • 1
    @MatheusManzatto By assumption, $f$ (and hence also its derivative $f'$) are compactly supported. This means that square integrability of the integrand is satisfied if, say, $\sigma$ is continuous (because then $\sigma$ is bounded on compact sets). Square integrability then entails that it is a "real" martingale. – saz Mar 20 '21 at 15:31
  • Thank you very much for the explanation!! – Matheus Manzatto Mar 20 '21 at 15:51
  • Dear @saz, what is $\sigma_i$? Isn't $\sigma(X_t)$ a matrix-valued function? – Barreto Oct 11 '22 at 07:45
  • @MAB it should denote the $i$th row of the matrix $\sigma$, so that $\sigma_i(X_s) dW_s$ is the $i$th component of the stochastic part of $dX_s$. – Juno Kim Jun 12 '23 at 15:00
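On the calculus fact discussed in the comments above, the averaging limit $\frac{1}{t}\int_0^t g(s)\,ds \to g(0)$ for $g$ continuous at $0$ is easy to check numerically. A minimal sketch with a hypothetical smooth $g$ (my own choice, purely illustrative):

```python
import numpy as np

# The calculus fact from the comments: if g is continuous at 0, then
#   (1/t) * integral_0^t g(s) ds  ->  g(0)   as t -> 0.
# g below is a hypothetical smooth function chosen for illustration.
def g(s):
    return np.cos(s) + s ** 2  # g(0) = 1.0

def time_average(t, n=10_000):
    # Midpoint rule for (1/t) * integral_0^t g(s) ds
    s = (np.arange(n) + 0.5) * (t / n)
    return g(s).mean()

for t in [1.0, 0.1, 0.01, 0.001]:
    print(t, time_average(t))  # approaches g(0) = 1.0 as t shrinks
```

The deviation from $g(0)$ shrinks roughly linearly in $t$ here, matching the first-order Taylor error of the running average.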