5

First, I'm a freshman student of physics, not of mathematics, so please excuse my ignorance of mathematics :)

Well, I'm reading the book "Huygens and Barrow, Newton and Hooke" by Vladimir Arnold, and one excerpt (on page 43) caught my attention: "he used Taylor's formula for calculating derivatives rather than using the derivatives for the expansion of functions" ("he" refers to Sir Isaac Newton).

My main question is: how can one obtain the series of the common elementary functions (trigonometric, exponential, ...) without using derivatives? And, for example, could calculus be developed in such a way that we would first get $\sin x = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \frac{x^7}{7!} + \cdots$ and the analogous series for cosine, and then use these series to find the derivative of sine by seeing that it reduces to the series for cosine?
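To make that last idea concrete, the term-by-term computation I have in mind (assuming one is allowed to differentiate the series term by term) is $$\frac{d}{dx}\left(x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots\right) = 1 - \frac{x^2}{2!} + \frac{x^4}{4!} - \cdots = \cos x.$$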

P.S. I don't know English very well; I just hope the question I asked is the one I wanted to ask! Thanks in advance to everyone who answers.

me_ravi_
    For example, consider $e^{ix}$ and assume you know the expansion of $e^y$. Replace $y$ by $ix$; separate the real and imaginary parts and you get Taylor series of $\sin(x)$ and $\cos(x)$. Is this OK ? – Claude Leibovici May 10 '15 at 09:23
  • How to obtain the coefficients without repeated differentiation? The method of undetermined coefficients is your savior?!! Or the grandiose "Newton's forward difference equation"! Or maybe not. :D I read that Lagrange wanted to reduce calculus to algebra, so maybe in his calculus books one can find the answer. – Sudoku Polo May 10 '15 at 10:26
  • See here http://math.stackexchange.com/a/2005935/4414 –  Nov 09 '16 at 23:43

4 Answers

7

It is often not hard to solve an equation directly in the form of a series. For example:

Let $y = \sin x$ be defined as the solution of $y'' + y = 0$, $y(0) = 0$, $y'(0) = 1$. Let's assume that the Taylor expansion for $y(x)$ is $$ y(x) = \sum_{k=0}^\infty c_k x^k $$ and that it converges for all $x$. Collecting the $x^k$ terms in the differential equation, we get $$ (k + 2)(k + 1) c_{k + 2} + c_k = 0,\quad c_{k+2} = -\frac{c_k}{(k+1)(k+2)}. $$ From the initial conditions, $$ c_0 = 0, \quad c_1 = 1. $$ Hence our expansion for $\sin x$ is $$ \sin x = x - \frac{x^3}{6} + \dots $$
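For a quick sanity check, here is a small Python sketch (nothing beyond the recurrence above is assumed) that builds the coefficients $c_k$ from $c_{k+2} = -\frac{c_k}{(k+1)(k+2)}$ and compares the partial sum with the library sine:

```python
import math

def sin_series(x, n_terms=20):
    """Evaluate sin(x) from the recurrence c_{k+2} = -c_k / ((k+1)(k+2)),
    with c_0 = 0, c_1 = 1 coming from the initial conditions."""
    c = [0.0, 1.0]
    for k in range(n_terms - 2):
        c.append(-c[k] / ((k + 1) * (k + 2)))
    return sum(ck * x**k for k, ck in enumerate(c))

print(sin_series(1.0), math.sin(1.0))   # both ~0.8414709848
```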

uranix
4

In general, there is no way to do this. The uniqueness of the expansion means that the coefficient of the $k$th term must be given by $\frac{f^{(k)}\left ( x_{0} \right )}{k!}$. However, in certain cases one can develop the series using shortcuts, like the one in the first answer. As another example, take

$\frac{1}{1-x}=\sum _{k}x^{k}$, and proceed as follows:

$\frac{1}{1-(-x^{2})}=\sum _{k}(-x^{2})^{k}$

$\frac{1}{1+x^{2}}=\sum _{k}\left ( -1 \right )^{k}x^{2k}$

Integrating both sides and evaluating at $x_{0}=0$ (so that the constant of integration is 0), you get

$\tan^{-1}x=\sum _{k}\left ( -1 \right )^{k}\frac{x^{2k+1}}{2k+1}$. So now you have the Maclaurin series for the inverse tangent.

Note that the interval of convergence of the original series is $(-1,1)$, and you have to deal with convergence issues with each substitution. Any book on real analysis will cover this topic.
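As a quick numerical check (a Python sketch, assuming only the partial sums written above), the integrated series does agree with the library arctangent inside $(-1,1)$:

```python
import math

def arctan_series(x, n_terms=200):
    """Partial sum of sum_k (-1)^k x^(2k+1) / (2k+1), for |x| < 1."""
    return sum((-1)**k * x**(2*k + 1) / (2*k + 1) for k in range(n_terms))

print(arctan_series(0.5), math.atan(0.5))   # both ~0.4636476090
```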

Matematleta
4

Defining $e^x = \lim_{n \to \infty}(1 + x/n)^n $, we can use the binomial expansion to get the familiar Taylor series of $e^x$. Then one can use $e^{ix} = \cos(x) + i\sin(x)$ to get the Taylor series for $\cos$ and $\sin$.

This seems like a plausible path for Newton (or Euler) to have used in deriving the Taylor series for a large number of important functions.
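A small Python sketch of the key step (my own illustration of the claim, not a historical reconstruction): the coefficient of $x^k$ in the binomial expansion of $(1+x/n)^n$ is $\binom{n}{k}/n^k$, which tends to $1/k!$ as $n \to \infty$:

```python
import math

n = 10_000
for k in range(6):
    coeff = math.comb(n, k) / n**k     # coefficient of x^k in (1 + x/n)^n
    print(k, coeff, 1 / math.factorial(k))
```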

Asvin
3

I can't verify that Bromwich didn't use calculus to derive the equation below. But I remember doing something like the derivation below in high school back in 1966. So I will put forth that there is at least one series that can be arrived at without recourse to calculus, for which the method shown below works.

Here is one example. Bromwich states that, for odd $n$,

$$ \sin(na) = nx - \frac{n(n^2-1^2)}{3!}x^3 + \frac{n(n^2-1^2)(n^2-3^2)}{5!}x^5 - \cdots $$

where $x = \sin a$. Now fix $\theta$ and take $a = \theta/n$, so that $na = \theta$. Then, as $n\to\infty$, $nx = n \sin a \to na = \theta$.

Hence, as $n \to \infty$, \begin{align} \sin(\theta) &= nx - \frac{n(n^2-1^2)}{3!}x^3 + \frac{n(n^2-1^2)(n^2-3^2)}{5!}x^5 - \cdots\cr &= nx - \frac{nx((nx)^2-x^2)}{3!} + \frac{nx((nx)^2-x^2)((nx)^2-(3x)^2)}{5!} - \cdots\cr &\to \theta - \frac{\theta^3}{3!} + \frac{\theta^5}{5!} - \cdots\cr \end{align}
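Here is a quick Python sketch (the identity is Bromwich's; the numerical check and the term-by-term recurrence for its coefficients are just my illustration) that evaluates the sum for a large odd $n$ with $a = \theta/n$ and compares with $\sin\theta$:

```python
import math

def bromwich_sin(theta, n=1001, n_terms=25):
    """Evaluate Bromwich's identity for sin(n*a) in powers of x = sin(a),
    with a = theta/n so that n*a = theta. n should be a large odd integer."""
    a = theta / n
    x = math.sin(a)
    total = 0.0
    term = n * x                       # k = 0 term: n * x
    for k in range(n_terms):
        total += (-1)**k * term
        # next coefficient: multiply by (n^2 - (2k+1)^2) * x^2 / ((2k+2)(2k+3))
        term *= (n**2 - (2*k + 1)**2) * x**2 / ((2*k + 2) * (2*k + 3))
    return total

print(bromwich_sin(1.0), math.sin(1.0))   # both ~0.8414709848
```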

Addendum

I'm sure that my logic below is not up to snuff by modern standards. Nonetheless, it is still compelling.

Let's assume that $n$ is an odd number. By de Moivre's formula,

$(\cos \theta + i \sin \theta)^{n} = \sum_{k=0}^{n} \binom{n}k (\cos \theta)^{n-k} (i \sin \theta)^{k}$

$\cos(n \theta) + i \sin (n \theta) = \sum_{k=0}^{n} \binom{n}k (\cos \theta)^{n-k} (i \sin \theta)^{k}$

Taking imaginary parts, only the odd powers of $\sin \theta$ survive, and since $i^{2k+1} = i(-1)^k$,

$\displaystyle \sin(n \theta) = \sum_{k=0}^{(n-1)/2} \binom{n}{2k+1} (\sin \theta)^{2k+1} (-1)^k (\cos \theta)^{n-2k-1}$

Let $\theta \to 0$ in such a way that $n \theta \to x$ at the same time. This can be done by letting $\theta = \dfrac{x}{n}$ and then letting $n \to \infty$.

Now we examine $\binom{n}{2k+1} (\sin \theta)^{2k+1}$ as $n \to \infty$.

\begin{align*} \binom{n}{2k+1} (\sin \theta)^{2k+1} &= \dfrac{(n)(n-1)(n-2)\cdots (n-2k)}{(2k+1)(2k)(2k-1) \cdots (1)} (\sin \theta)^{2k+1}\\ &= \dfrac{(n) \sin \theta}{2k+1}\; \dfrac{(n-1) \sin \theta}{2k}\; \dfrac{(n-2) \sin \theta}{2k-1}\;\cdots \dfrac{(n - 2k) \sin \theta}{1}\\ &\to \dfrac{(n) \theta}{2k+1}\; \dfrac{(n-1) \theta}{2k}\; \dfrac{(n-2) \theta}{2k-1}\;\cdots \dfrac{(n - 2k) \theta}{1}\\ &\to \dfrac{x}{2k+1}\; \dfrac{x-\theta}{2k}\; \dfrac{x-2\theta}{2k-1}\;\cdots \dfrac{x - 2k \theta}{1}\\ &\to \dfrac{x^{2k+1}}{(2k+1)!} \end{align*}

Since $\cos \theta \to 1$ as $\theta \to 0$, letting $n \to \infty$, we get

$$\displaystyle \sin(x) = \sum_{k=0}^{\infty} (-1)^k \dfrac{x^{2k+1}}{(2k+1)!}$$
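And a throwaway Python check (just an illustration of the limit, not part of the argument) that a single term $\binom{n}{2k+1} (\sin \theta)^{2k+1} (\cos \theta)^{n-2k-1}$ with $\theta = x/n$ really approaches $\frac{x^{2k+1}}{(2k+1)!}$ as $n$ grows:

```python
import math

x, k = 1.0, 2                          # inspect the x^5 / 5! term
for n in (11, 101, 1001, 10001):       # odd n, with theta = x/n
    theta = x / n
    term = (math.comb(n, 2*k + 1)
            * math.sin(theta)**(2*k + 1)
            * math.cos(theta)**(n - 2*k - 1))
    print(n, term, x**(2*k + 1) / math.factorial(2*k + 1))
```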