
I've been teaching calculus for several years and have some doubts about whether derivatives (and integration techniques) of common functions are useful and important outside mathematics and physics.

My question is:

Can you give an example of a natural problem outside mathematics and physics that can be solved using derivatives or integration, and that cannot be solved more simply by other means?

My motivation comes from trying to motivate students by good exercises.

The first constraint (naturality) excludes exercises such as "If $q$ units are produced in a factory, then your cost is $-0.3q^3+2q^2-\ldots$" and all exercises of that kind, taught in microeconomics courses. The second constraint (cannot be solved more simply) excludes, for example, everything that leads to a minimum or maximum of a quadratic expression.

I'm aware of a few such instances, for example determining the length of the line segments in the solution of the Steiner tree problem for four points on a square (minimal road or electricity network connecting ABCD).
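For the square example, the calculus really is the point: placing the two interior (Steiner) points symmetrically on the centre line of a unit square and parametrizing by their offset $d$ from the nearer side gives a length function whose minimum requires a derivative. A minimal sketch (the parametrization and the unit square are my own setup for illustration):

```python
import math

def network_length(d):
    """Total length of the symmetric candidate network for the unit square
    with corners (0,0), (1,0), (0,1), (1,1): two interior points at
    (d, 0.5) and (1-d, 0.5), each joined to the two nearest corners
    and to each other."""
    return (1 - 2 * d) + 4 * math.sqrt(d * d + 0.25)

# Setting dL/dd = -2 + 4d / sqrt(d^2 + 1/4) = 0 gives d = 1/(2*sqrt(3)),
# i.e. the three roads at each interior point meet at 120-degree angles.
d_opt = 1 / (2 * math.sqrt(3))
best = network_length(d_opt)
print(best)                 # 1 + sqrt(3) ≈ 2.7320508...
print(2 * math.sqrt(2))    # the two diagonals, ≈ 2.8284271, are longer
```

The closed form $1+\sqrt{3}$ drops out of the first-order condition, which is hard to obtain without differentiating.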

Peter Franek

3 Answers


There are plenty of differential equations in biology; Wikipedia maintains a whole list of examples (population dynamics, predator–prey models, epidemiology, and so on).

And there are also ones coming from pharmacokinetics.

Of course you don't actually have to use the real differential equations coming from these fields, you could choose the mathematics you want to ask about (for example, a PDE) and then just say

The amounts of Drug X and Drug Y in a person's system satisfy the following system of PDE...
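As a minimal sketch of what such an exercise could look like (the rate constants and the interaction term here are made up purely for illustration, and I use a system of ODEs rather than PDEs for simplicity, integrated with forward Euler steps):

```python
def simulate(x0=100.0, y0=50.0, k_x=0.10, k_y=0.05, k_xy=0.001,
             dt=0.01, t_end=24.0):
    """One-compartment elimination of two interacting drugs:
    dx/dt = -k_x * x - k_xy * x * y,  dy/dt = -k_y * y - k_xy * x * y.
    All constants are hypothetical; forward Euler with step dt."""
    x, y = x0, y0
    t = 0.0
    while t < t_end:
        dx = -k_x * x - k_xy * x * y   # drug X: elimination + interaction
        dy = -k_y * y - k_xy * x * y   # drug Y: elimination + interaction
        x += dt * dx
        y += dt * dy
        t += dt
    return x, y

x, y = simulate()
print(x, y)   # both amounts decay toward zero over the 24 "hours"
```

Even this toy version needs the student to read a derivative as a rate of change, which is the point of the exercise.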

curious

Applications of derivatives outside mathematics and physics

Would you consider finance and economics as part of (applied) mathematics? Or did you have only “pure” mathematics in mind when you wrote this? Anyway, the point is that all economic concepts whose names start with the word “marginal” are basically the derivative or variation of the notion in question.

Lucian
  • I would be happy with some reasonable applications in economics and finance. I've seen those "marginal" things, but everything I have seen so far seemed to me to be artificial. The functions were either given without any motivation (just to practice formulas), or -- in more realistic cases -- they were taken from observations, but then there is no need to know that the derivative of $\ln(x)$ is $1/x$. – Peter Franek Nov 08 '14 at 19:36
  • We can match an observation and a function and use those to test counterfactuals though. Then derivatives are necessary. Similarly, things like a firm's marginal cost are often unknown to an economist. But, with a model, calculus, and data, marginal cost can be estimated as the unknown in a first order condition. – Pburg Nov 08 '14 at 19:48
  • @PeterFranek: Would you consider engineering as a sub-domain of physics ? – Lucian Nov 08 '14 at 19:52
  • @PeterFranek: Optimization problems. Their practical necessity should be obvious to anyone. Furthermore, engineering without Fourier transforms, Laplace transforms, and Z-transforms (and their inverses) does not exist. – Lucian Nov 08 '14 at 20:21
  • I guess so. Could you be more specific please? -- some reference to a concrete problem, not too complicated to explain, if possible. – Peter Franek Nov 08 '14 at 20:25
  • @PeterFranek: Engineering without calculus is like mathematics without numbers. The field itself simply would not exist. The correct question would be: which engineering application does not use calculus, either directly or indirectly. Even the most basic approximations are the result of Taylor series, which rely on derivatives. – Lucian Nov 08 '14 at 20:37
  • Yes, as a theoretical basis, I agree. I mainly asked for good examples, closer to "directly" than "indirectly". Most practical computations I have seen are reduced to linear or piecewise linear computations/approximations / finite elements / finite-dimensional vector spaces / discrete Fourier transforms / data mining / searching in big graphs / etc. – Peter Franek Nov 08 '14 at 20:42
  • In order to approximate something, you must first know what that certain something is. You cannot approximate the unknown. Linear approximations are derivatives (the tangent to the graph around a certain point). They did not exactly drop out of the sky. Other polynomial approximations are based on Taylor series, as already mentioned. They did not drop out of the sky either. – Lucian Nov 08 '14 at 21:03
  • Yes, sure. I mean, I'm just looking for nice examples, easy to present, "such as" the Steiner tree problem solution mentioned in the OP. – Peter Franek Nov 08 '14 at 21:04
  • For instance, this nice man had to write a program involving countless computations of trigonometric functions. Computing them would have required a lot of processor time. What ultimately saved him was Bhaskara's sine approximation formula. – Lucian Nov 08 '14 at 21:08
  • Nice link, thanks; but I wouldn't call this an application of calculus to a problem outside of mathematics. He just verified that the old formula for approximating the sine function cannot be improved by changing the constants 4 and 16. – Peter Franek Nov 08 '14 at 21:24
  • My point was that the methods used to deduce this “magic” formula were basically precursors to modern calculus. – Lucian Nov 08 '14 at 21:27
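Bhaskara's approximation mentioned in the comments, $\sin x \approx \frac{16x(\pi-x)}{5\pi^2-4x(\pi-x)}$ on $[0,\pi]$, is easy to check numerically (a quick sketch; the grid size is arbitrary):

```python
import math

def bhaskara_sin(x):
    """Bhaskara I's 7th-century rational approximation to sin(x) on [0, pi]."""
    return 16 * x * (math.pi - x) / (5 * math.pi ** 2 - 4 * x * (math.pi - x))

# Maximum absolute error over a fine grid of [0, pi].
n = 100_000
max_err = max(abs(bhaskara_sin(i * math.pi / n) - math.sin(i * math.pi / n))
              for i in range(n + 1))
print(max_err)   # small; well under 0.01, and exact at x = pi/2
```

One rational expression, no trigonometric calls: that is why it was attractive for slow processors.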

I think that given the way you have set up the problem, the answer could very well be negative. Because we could always talk about the quotient $$\frac{f(x+h)-f(x)}{h}$$ for an $h$ that is small enough, which is a simpler object than the derivative.
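That quotient is indeed directly computable; a small sketch (function and step sizes chosen arbitrarily) shows it approaching the derivative as $h$ shrinks:

```python
import math

def diff_quotient(f, x, h):
    """The quotient (f(x+h) - f(x)) / h from the answer above."""
    return (f(x + h) - f(x)) / h

# For f(x) = e^x at x = 0 the exact derivative is 1.
for h in (0.1, 0.01, 0.001):
    print(h, diff_quotient(math.exp, 0.0, h))
# The error shrinks roughly in proportion to h; making that limiting
# statement precise is exactly what the derivative does.
```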