5

I am taking a course on smooth manifolds. We have just defined a tangent space of a manifold $M$ at a point $p\in M$ using paths $\gamma:\mathbb{R}\to M$, $\gamma(0)=p$. So $T_pM:=\{\gamma:\mathbb{R}\to M\;|\;\gamma(0)=p\}/\sim$, where $\sim$ is the equivalence relation of two paths having the same speed. That is, for two paths $\gamma_1,\gamma_2$ and a chart $\phi$, $$\left.\frac{d}{dt}\phi\circ\gamma_1\right|_{t=0}=\left.\frac{d}{dt}\phi\circ\gamma_2\right|_{t=0}\iff\text{$\gamma_1$ and $\gamma_2$ have the same speed.}$$ We defined addition and scalar multiplication of paths as follows: $$[\gamma_1]+\lambda[\gamma_2]:=[\phi^{-1}(\phi\circ\gamma_1+\lambda\phi\circ\gamma_2)],$$ where $\phi$ is a chart.
Now there's an exercise that I really don't know how to do. Given a smooth function $f:\mathbb{R}^m\to\mathbb{R}^n$, compute $f_*:T_p\mathbb{R}^m\to T_{f(p)}\mathbb{R}^n$ in the basis given by the coordinate vectors. Here $f_*$ is the function defined by $f_*:[\gamma]\mapsto[f\circ\gamma]$.
I am not even sure where to start. I have already shown that $f_*$ is linear.

EDIT:
I think this works. Write $f=(f_1,\ldots,f_n)$ where $f_i:\mathbb{R}^m\to\mathbb{R}$ and let $\gamma:\mathbb{R}\to\mathbb{R}^m$ be a path with $\gamma(0)=p=(p_1,\ldots,p_m)\in\mathbb{R}^m$. Let $\tilde{\gamma}_i(t):=(p_1,\ldots,p_{i-1},p_i+t,p_{i+1},\ldots,p_m)$ be the path along the $i$th basis vector in $\mathbb{R}^m$ through $p$. Then we have $\lambda_i\in\mathbb{R}$ such that $\gamma=\sum_{i=1}^m\lambda_i\tilde{\gamma}_i$. We have $$\begin{aligned} f_*[\gamma]&=f_*\left[\sum_{i=1}^m\lambda_i\tilde{\gamma}_i\right]\\ &=\sum_{i=1}^m\lambda_if_*[\tilde{\gamma}_i]\\ &=\sum_{i=1}^m\lambda_i[f\circ\tilde{\gamma}_i]\\ &=\sum_{i=1}^m\lambda_i\left.\frac{d}{dt}f\circ\tilde{\gamma}_i(t)\right|_{t=0}\\ &=\sum_{i=1}^m\lambda_iDf(\tilde{\gamma}_i(0))\left.\frac{d}{dt}\tilde{\gamma}_i(t)\right|_{t=0}\\ &=\sum_{i=1}^m\lambda_iDf(p)\left.\frac{d}{dt}\tilde{\gamma}_i(t)\right|_{t=0}\\ &=Df(p)\sum_{i=1}^m\left.\frac{d}{dt}\lambda_i\tilde{\gamma}_i\right|_{t=0}\\ &=Df(p)[\gamma]. \end{aligned}$$ So this yields that the matrix of $f_*$ is $Df$ at $p$.
Is this correct?

Joffysloffy
  • 1,334
  • Have you already considered what $T_p\mathbb{R}^m$ looks like, given any $p\in\mathbb{R}^m$? – HSN Sep 25 '14 at 09:21

3 Answers

2

First, $\sim$ is the equivalence relation identifying paths that have the same velocity at time $t = 0$, not merely the same speed.

Since $f_*$ is linear, you know it's enough to compute $f_*$ on the given basis of coordinate vectors, which I'll denote $(\partial_{x^i})_p$. Moreover, by composing with translations we may as well align our charts so that $p = 0 \in \mathbb{R}^m$ and $f(0) = 0 \in \mathbb{R}^n$. This isn't strictly necessary, but given our definition of path addition it makes the notation much cleaner.

Now, by definition, the $i$th coordinate vector $(\partial_{x^i})_0 \in T_0 \mathbb{R}^m$ at the point $0 \in \mathbb{R}^m$ is the equivalence class containing the curve $\gamma_i$ that (1) is at $0$ at time zero (i.e., $\gamma_i(0) = 0$) and (2) moves in the $i$th coordinate direction with unit speed, namely, $$\gamma_i(t) = (0, \ldots, 0, t, 0, \ldots, 0),$$ where $t$ is in the $i$th slot.

Now, we can compute the curve $f \circ \gamma_i$ in $\mathbb{R}^n$ that represents the pushforward $f_* [\gamma_i] = [f \circ \gamma_i]$. If we write $f$ in terms of its component functions, so that $$f(x^1, \ldots, x^m) = (f_1(x^1, \ldots, x^m), \ldots, f_n(x^1, \ldots, x^m)),$$ its $j$th component is $$f_j(x^1, \ldots, x^m),$$ and the $j$th component $(f \circ \gamma_i)_j = f_j \circ \gamma_i$ is $$f_j(0, \ldots, 0, t, 0, \ldots, 0).$$

The initial velocity $\left.\frac{d}{dt}\right\vert_0(f \circ \gamma_i)$ has $j$th component (in the coordinates $(y^j)$ on $\mathbb{R}^n$) $$\left.\frac{d}{dt}\right\vert_0(f \circ \gamma_i)_j = \left.\frac{d}{dt}\right\vert_0 f_j(0, \ldots, 0, t, 0, \ldots, 0),$$ which by the chain rule is just $$\frac{\partial f_j}{\partial x^i}(0),$$ and so the initial velocity of $f \circ \gamma_i$ is $$\left(\frac{\partial f_1}{\partial x^i}(0), \ldots, \frac{\partial f_n}{\partial x^i}(0)\right). \qquad (\ast)$$
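For instance, with a map chosen purely for illustration, take $m = n = 2$ and $f(x^1, x^2) = (x^1 x^2, x^1 + x^2)$, which satisfies $f(0) = 0$. Then $f \circ \gamma_1(t) = f(t, 0) = (0, t)$, whose initial velocity is $$(0, 1) = \left(\frac{\partial f_1}{\partial x^1}(0), \frac{\partial f_2}{\partial x^1}(0)\right),$$ in agreement with $(\ast)$.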

Now, to compute the pushforward with respect to the coordinate vector basis $(\partial_{y^j})$, we must find curves that represent those coordinate vectors, and then decompose some representative of $[f \circ \gamma_i]$ as a linear combination of those curves. As before, we can pick as representative of each $\partial_{y^j}$ the curve $$\theta_j(t) := (0, \ldots, 0, t, 0, \ldots, 0),$$ where $t$ is in the $j$th slot.

It's now easy to write down a (unique) linear combination of the $\theta_j$'s in $[f \circ \gamma_i]$, that is, one that has initial velocity $(\ast)$, because we already know the components with respect to the coordinate basis, namely $$\sum_{j = 1}^n \frac{\partial f_j}{\partial x^i}(0) \theta_j,$$ and passing to equivalence classes gives that $$f_* [\gamma_i] = [f \circ \gamma_i] = \left[\sum_{j = 1}^n \frac{\partial f_j}{\partial x^i}(0) \theta_j\right] = \sum_{j = 1}^n \frac{\partial f_j}{\partial x^i}(0) [\theta_j].$$

So, the $(j, i)$ entry of the matrix representation of the pushforward $f_*: T_0 \mathbb{R}^m \to T_0 \mathbb{R}^n$, with respect to the bases $(\partial_{x^i}) = ([\gamma_i])$ of $T_0 \mathbb{R}^m$ and $(\partial_{y^j}) = ([\theta_j])$ of $T_0 \mathbb{R}^n$, is the partial derivative $\frac{\partial f_j}{\partial x^i}(0)$. But by definition this matrix is precisely the Jacobian $(Df)(0)$.
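If you want to sanity-check this conclusion symbolically, here is a minimal sympy sketch; the particular map $f(x, y) = (x + y, xy, \sin x)$ is hypothetical, chosen only because it satisfies $f(0) = 0$:

```python
import sympy as sp

# Hypothetical map f: R^2 -> R^3 with f(0) = 0 (chosen only for illustration).
x, y, t = sp.symbols('x y t')
f = sp.Matrix([x + y, x*y, sp.sin(x)])

# Jacobian of f, evaluated at the origin.
J0 = f.jacobian([x, y]).subs({x: 0, y: 0})

# Coordinate curves gamma_1(t) = (t, 0) and gamma_2(t) = (0, t) through 0 in R^2.
curves = [{x: t, y: 0}, {x: 0, y: t}]

for i, curve in enumerate(curves):
    # Initial velocity of f o gamma_i, i.e. d/dt (f o gamma_i) at t = 0.
    velocity = sp.diff(f.subs(curve), t).subs(t, 0)
    # It matches the i-th column of the Jacobian at 0, as argued above.
    assert velocity == J0[:, i]
    print(f"f_* applied to coordinate vector {i + 1}:", list(velocity))
```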

As an aside, in the setting of differential geometry, I actually prefer the functorial notation $T_p f$ instead of $f_*$ (or, as this exercise justifies, $(Df)(p)$), so that the pushforward of the map $f: M \to N$ at $p \in M$ is $T_p f: T_p M \to T_{f(p)} N$, which in particular emphasizes that the pushforward really is a map at a point.

Travis Willse
  • 99,363
  • Yes, a few times I erased things I realized I didn't want to say (and your answer came out just as well I think). It's good to revisit the definitions of vectors from time to time, I think, since most of them are operationally pretty far from how they are actually used in practice. – Travis Willse Sep 25 '14 at 09:43
  • Thanks for your help! I have edited my question with my solution. Would you please check it for me? – Joffysloffy Sep 26 '14 at 16:02
  • You're welcome, I hope this is useful for you. In the fourth equality, you replace a vector (equivalence class) with the value of the time derivative, and we don't have a reason to equate these quantities (not yet, anyway, though you'll probably very soon learn you can compute them). Using the definitions at hand, you'll want to compute $[f \circ \gamma]$, find a nice path in $\mathbb{R}^n$ that has the same speed, and then write that path as a linear combination of the coordinate vectors at $T_{f(p)} \mathbb{R}^n$. (In the end, the matrix of $f_*$ turns out to be the Jacobian of $f$.) – Travis Willse Sep 26 '14 at 16:32
  • Ah, yes, I was a little wary about that step, but I used Your Ad Here's answer for that. I honestly don't know how I would go about finding a path with the same speed as $[f\circ\gamma]$, let alone compute the matrix of $f_*$ that way. Like, I don't quite see where all the derivatives would come in. – Joffysloffy Sep 26 '14 at 16:47
  • 1
    @Joffysloffy I added the rest of an argument, which in particular is very explicit about decomposing a representative of $[f \circ \gamma_i]$ in terms of a basis of $T_{f(p)} \mathbb{R}^n$. For convenience I've assumed we've translated (which doesn't affect the pushforward computation) our charts so that $p = 0$ and $f(p) = 0$. This isn't strictly necessary, but it makes dealing with writing down linear combinations of curves using our curve addition rules much cleaner. – Travis Willse Sep 27 '14 at 07:39
  • Oh, thank you! I understand everything you did now; this is very helpful! – Joffysloffy Sep 27 '14 at 20:00
  • 1
    @Joffysloffy You're welcome, I'm glad you found it useful. – Travis Willse Sep 28 '14 at 03:35
2

Let $d_{1,p},d_{2,p},\dots,d_{m,p}$ be paths in $\mathbb{R}^m$ such that $d_{i,p}(0)=p$ and $d^\prime_{i,p}(0)=e_i$, where $e_i=(0,\dots,0,1,0,\dots,0)\in\mathbb{R}^m$ is the $i$th unit vector.

Then $([d_{i,p}])_{i=1,\dots,m}$ forms a basis of $T_p\mathbb{R}^m$.

You should think of $[d_{i,p}]\in T_p\mathbb{R}^m$ as the partial derivative in the $i$th coordinate direction based in the point $p$.

Now the job is to compute $f_*$ in terms of this basis. That is, compute the coefficients in the expansion of $f_*([d_{i,p}])$ in terms of the basis $([d_{j,f(p)}])_{j=1,\dots,n}$.

So we start

$$f_*([d_{i,p}])=[f\circ d_{i,p}]\in T_{f(p)}\mathbb{R}^n$$

We now need to compute $\frac{d}{dt} (f\circ d_{i,p})(t)\left.\right|_{t=0}$, since the $j$th component of the velocity vector of the path at $0$ will be exactly the coefficient of $[d_{j,f(p)}]$. Use the chain rule and the definition of $d_{i,p}$!

You will see that the resulting matrix of coefficients representing $f_*$ is simply the Jacobian of $f$.
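Carrying out the hinted computation explicitly: since $d_{i,p}(0)=p$ and $d^\prime_{i,p}(0)=e_i$, the chain rule gives $$\left.\frac{d}{dt}(f\circ d_{i,p})(t)\right|_{t=0}=Df\bigl(d_{i,p}(0)\bigr)\,d^\prime_{i,p}(0)=Df(p)\,e_i=\left(\frac{\partial f_1}{\partial x_i}(p),\ldots,\frac{\partial f_n}{\partial x_i}(p)\right),$$ so the coefficient of $[d_{j,f(p)}]$ in $f_*([d_{i,p}])$ is $\frac{\partial f_j}{\partial x_i}(p)$; that is, the $i$th column of the matrix of $f_*$ is the $i$th column of the Jacobian of $f$ at $p$.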

J.R.
  • 17,904
0

Heh, here you just need to compute.
Let $\gamma$ be a curve in $\mathbb{R}^m$; what does $\frac{d}{dt}f(\gamma)$ look like? Try using the chain rule.
If it helps, try this simple example: $f:\mathbb{R}^2 \to \mathbb{R}^3;(x,y)\mapsto(x,y,x^2+y^2)$. Compute the pushforward of the curves $\gamma_1(t) = (\cos(t),\sin(t))$, $\gamma_2(t) = (1,t)$, $\gamma_3(t) = (1,\cos(t))$ so you can see what's going on.
As a worked example take $\gamma(t) = (t,t)$.
$\frac{d}{dt}f(\gamma(t)) = \frac{d}{dt} (t,t,2t^2) = (1,1,4t)$.

Now, let us look at a slightly more general example, $\gamma$ is just some curve.
$\frac{d}{dt}f(\gamma(t)) = \frac{d}{dt} (\gamma_x,\gamma_y,\gamma_x^2 +\gamma_y^2) =(\gamma'_x,\gamma'_y,2\gamma_x\gamma'_x+2\gamma_y\gamma'_y)$
We can translate this into the coordinate vectors $\frac{\partial}{\partial x},\frac{\partial}{\partial y},\frac{\partial}{\partial z}$ on $\mathbb{R}^3$ as follows (remember that $\gamma_x$ is just the $x$ component of the curve):
$(\gamma'_x,\gamma'_y,2\gamma_x\gamma'_x+2\gamma_y\gamma'_y) = \gamma'_x\frac{\partial}{\partial x} + \gamma'_y\frac{\partial}{\partial y} + (2\gamma_x\gamma'_x+2\gamma_y\gamma'_y)\frac{\partial}{\partial z}$, with everything evaluated at $t=0$.
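As a check against the Jacobian: at the point $\gamma(0) = (\gamma_x(0),\gamma_y(0))$ we have $$Df = \begin{pmatrix}1 & 0\\ 0 & 1\\ 2\gamma_x & 2\gamma_y\end{pmatrix},$$ and applying this matrix to the velocity $(\gamma'_x,\gamma'_y)$ of $\gamma$ at $t=0$ reproduces exactly the vector above, so for this example the matrix of $f_*$ is indeed the Jacobian of $f$.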

Has this cleared things up for you? If you still need help I'll work through the original problem for you but I think it's usually better to struggle with these things when you start off. As they say, you only learn by making mistakes!

If you'd like a supplement I'd recommend An Introduction to Manifolds by Loring Tu or An Introduction to Smooth Manifolds by John Lee.