What you have to prove is that
\begin{align}
\dfrac{\partial J}{\partial t}\bigg|_{(t,x)} &= J_t(x) \cdot (\text{div} f)(\phi_t(x))
\end{align}
Now, by the definition of the partial derivative, we have
\begin{align}
\dfrac{\partial J}{\partial t}\bigg|_{(t,x)} &= \dfrac{d}{ds} \bigg|_{s=0} J(t+s,x).
\end{align}
Now, I leave it to you to plug in the definition of $J(t+s,x)$, use the flow property of $\phi$, namely $\phi_{t+s} = \phi_s \circ \phi_t$, and apply the chain rule once more in the middle to show that
\begin{align}
\dfrac{\partial J}{\partial t}\bigg|_{(t,x)} &= \left( \dfrac{d}{ds} \bigg|_{s=0} \det \left( D(\phi_s)_{\phi_t(x)}\right) \right) \cdot J_t(x) \tag{$\ddot{\smile}$}
\end{align}
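(For completeness, the computation runs as follows, writing $J(t,x) = \det D(\phi_t)_x$ as in the definition of $J$:
\begin{align}
J(t+s,x) &= \det D(\phi_{t+s})_x = \det D(\phi_s \circ \phi_t)_x \\
&= \det\left( D(\phi_s)_{\phi_t(x)} \cdot D(\phi_t)_x \right) \\
&= \det\left( D(\phi_s)_{\phi_t(x)} \right) \cdot J_t(x),
\end{align}
and differentiating at $s = 0$ gives $(\ddot{\smile})$.)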
Now, to clear up all the clutter, observe what kind of object we have in brackets. For each $s \in \Bbb{R}$, $D(\phi_s)_{\phi_t(x)}$ is a linear operator $\Bbb{R}^n \to \Bbb{R}^n$ (and it is also invertible because $\phi_s: \Bbb{R}^n \to \Bbb{R}^n$ is invertible). For ease of notation, let's call this linear operator $A(s)$. So, we set
\begin{align}
A(s) &:= D(\phi_s)_{\phi_t(x)}
\end{align}
Now, we have a map $A: \Bbb{R} \to \mathcal{L}(\Bbb{R}^n)$, such that $A(0) = I$ (actually the domain of $A$ will be the set of times for which the flow is defined, but all that matters for us is that the domain contains an open interval around the origin). Also, one can check that smoothness of the flow $\phi: \Bbb{R} \times \Bbb{R}^n \to \Bbb{R}^n$ implies smoothness of the map $A$. So, our task is now to figure out what the following derivative is
\begin{align}
\dfrac{d}{ds} \bigg|_{s=0} \det(A(s)) = ?
\end{align}
Well, this is actually a simple application of the chain rule. Essentially, we are asking for the derivative $(\det \circ A)'(0)$. Here, we think of $\det$ as a function between vector spaces $\mathcal{L}(\Bbb{R}^n) \to \Bbb{R}$ (as a technical aside: since the determinant of a matrix is a certain polynomial function of its entries, $\det$ is a smooth map). Now,
\begin{align}
(\det \circ A)'(0) &= D(\det)_{A(0)}\left( A'(0) \right) \tag{chain rule} \\
&= \dfrac{d}{ds} \bigg|_{s=0} \det(A(0) + s A'(0)) \tag{chain rule in reverse} \\
&= \dfrac{d}{ds} \bigg|_{s=0} \det(I + s A'(0)) \\
&= \text{Tr}(A'(0)) \tag{$*$}
\end{align}
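If you like, this identity can be sanity-checked numerically. Here is a quick NumPy sketch (not part of the proof); the path $A(s) = I + sB + s^2 C$ is an arbitrary illustrative choice with $A(0) = I$ and $A'(0) = B$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))  # plays the role of A'(0)
C = rng.standard_normal((n, n))  # higher-order term; irrelevant at s = 0

def A(s):
    # a smooth matrix path with A(0) = I and A'(0) = B
    return np.eye(n) + s * B + s**2 * C

# central finite difference for (det o A)'(0)
h = 1e-6
deriv = (np.linalg.det(A(h)) - np.linalg.det(A(-h))) / (2 * h)
print(deriv, np.trace(B))  # the two values agree
```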
Note that $(*)$ is really more of a linear algebra result than a calculus result, so I'll prove it at the end. Let's now use this result in $(\ddot{\smile})$. Doing so, we find that
\begin{align}
\dfrac{\partial J}{\partial t}\bigg|_{(t,x)} &= \text{Tr}\left( \dfrac{d}{ds} \bigg|_{s=0} D(\phi_s)_{\phi_t(x)}\right) \cdot J_t(x) \\
&= \text{Tr} \left( D \left( \dfrac{d}{ds} \bigg|_{s=0} \phi_s\right)_{\phi_t(x)} \right) \cdot J_t(x) \tag{i}\\
&= \text{Tr}(Df_{\phi_t(x)}) \cdot J_t(x) \tag{ii}\\
&= (\text{div} f)(\phi_t(x)) \cdot J_t(x)\tag{iii}
\end{align}
In (i), the exchange of the two derivatives boils down to the equality of mixed partial derivatives $\dfrac{\partial}{\partial s} \dfrac{\partial }{\partial x} = \dfrac{\partial}{\partial x} \dfrac{\partial}{\partial s}$ (i.e.\ we're interchanging the time and space derivatives of the function $\phi: \Bbb{R} \times \Bbb{R}^n \to \Bbb{R}^n$). Next, (ii) is simply by definition of $\phi$ being the flow of the vector field $f$. Finally, (iii) holds because the trace of $Df$ is the sum of its diagonal entries $\partial f_i/\partial x_i$, which is exactly the divergence of $f$.
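As a sanity check on the final identity, one can integrate a concrete planar vector field (an arbitrary choice of mine, not from the question) together with its variational equation $\frac{d}{dt} D(\phi_t)_x = Df_{\phi_t(x)} \cdot D(\phi_t)_x$, and compare a finite-difference $\partial J / \partial t$ against $(\operatorname{div} f)(\phi_t(x)) \cdot J_t(x)$. A NumPy sketch:

```python
import numpy as np

# an arbitrary smooth planar vector field (illustrative choice only)
def f(p):
    x, y = p
    return np.array([np.sin(y) - 0.3 * x, x * y])

def Df(p):  # Jacobian matrix of f
    x, y = p
    return np.array([[-0.3, np.cos(y)],
                     [y,    x]])

def div_f(p):  # divergence = trace of Df
    return np.trace(Df(p))

def rk4_step(p, J, h):
    # RK4 step for the flow coupled with the variational equation J' = Df(p) J
    def rhs(p, J):
        return f(p), Df(p) @ J
    k1p, k1J = rhs(p, J)
    k2p, k2J = rhs(p + h/2 * k1p, J + h/2 * k1J)
    k3p, k3J = rhs(p + h/2 * k2p, J + h/2 * k2J)
    k4p, k4J = rhs(p + h * k3p, J + h * k3J)
    return (p + h/6 * (k1p + 2*k2p + 2*k3p + k4p),
            J + h/6 * (k1J + 2*k2J + 2*k3J + k4J))

h, steps = 1e-3, 1000
p, J = np.array([0.5, 0.2]), np.eye(2)
pts, dets = [], []
for _ in range(steps + 1):
    pts.append(p)
    dets.append(np.linalg.det(J))  # J_t(x) = det D(phi_t)_x
    p, J = rk4_step(p, J, h)

# compare dJ/dt (central difference) with (div f)(phi_t(x)) * J_t(x)
k = steps // 2
lhs = (dets[k + 1] - dets[k - 1]) / (2 * h)
rhs_val = div_f(pts[k]) * dets[k]
print(lhs, rhs_val)  # the two values agree
```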
So, why is $(*)$ true? More generally, we have the following useful lemma (which comes up very often).
Let $B \in M_{n \times n}(\Bbb{R})$ be any matrix. Then,
\begin{align}
\det(I + sB) &= 1 + s \cdot \text{Tr}(B) + \mathcal{O}(s^2) \quad \text{as $s \to 0$}.
\end{align}
So, clearly, if we take the derivative at $s=0$, the result will be $\text{Tr}(B)$ which will prove $(*)$.
You could try to prove this by directly calculating the determinant using cofactor expansion; I've never tried it because it looks disgusting. Instead, here's a simpler proof (in my opinion) which invokes some linear algebra. Regard $B$ as an element of $M_{n \times n}(\Bbb{C})$. Over $\Bbb{C}$, the characteristic polynomial of $B$ splits (by the fundamental theorem of algebra), so the matrix $B$ is similar (over $\Bbb{C}$) to an upper-triangular matrix, call it $U$. For example, if you wish, you may take $U$ to be the Jordan canonical form of $B$.
In any case, we know that there exist matrices $U,P \in M_{n \times n}(\Bbb{C})$, with $U$ upper-triangular and $P$ invertible such that
\begin{align}
B &= PUP^{-1}.
\end{align}
Hence,
\begin{align}
\det(I + sB) &= \det(I + s PUP^{-1}) \\
&= \det(P(I + sU)P^{-1}) \\
&= \det(I + sU).
\end{align}
Since $I + sU$ is also upper-triangular, its determinant is the product of the diagonal entries, say $(1 + s\lambda_1) \cdots (1+ s\lambda_n)$, where the $\lambda_i$ are the diagonal entries of $U$. Hence,
\begin{align}
\det(I + sB) &= 1 + s (\lambda_1 + \dots + \lambda_n) + \mathcal{O}(s^2) \\
&= 1 + s \cdot \text{Tr}(U) + \mathcal{O}(s^2) \\
&=1 + s \cdot \text{Tr}(B) + \mathcal{O}(s^2),
\end{align}
where in the last line, I used the fact that similar matrices have the same trace. This completes the proof.
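Numerically, the lemma is easy to see as well: for a random matrix $B$, the error of the first-order approximation $\det(I + sB) \approx 1 + s \cdot \text{Tr}(B)$ shrinks roughly quadratically in $s$. A NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
B = rng.standard_normal((n, n))

errs = []
for s in [1e-2, 1e-3, 1e-4]:
    # error of the first-order approximation det(I + sB) = 1 + s Tr(B) + O(s^2)
    err = np.linalg.det(np.eye(n) + s * B) - (1 + s * np.trace(B))
    errs.append(abs(err))
    print(f"s = {s:.0e}, error = {err:.3e}")  # error shrinks roughly like s**2
```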