14

Prove $\sin(\pi/2)=1$ using the Taylor series definition of $\sin x$, $$\sin x=x-\frac{x^3}{3!}+\frac{x^5}{5!}-\cdots$$

It seems rather messy to substitute in $\pi/2$ for $x$. So we have $$\sin(\pi/2)=\sum_{n=0}^{\infty} \frac{(-1)^n(\pi/2)^{2n+1}}{(2n+1)!}.$$

I'm not too sure where to go from here. Any help would be appreciated! Thanks!
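(As a sanity check, not a proof, one can sum a few terms of the series numerically; the snippet below is purely illustrative:)

```python
import math

# Partial sums of sum_{n>=0} (-1)^n (pi/2)^(2n+1) / (2n+1)!
x = math.pi / 2
total = 0.0
for n in range(10):
    total += (-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
print(total)  # ~1.0 to machine precision
```

The partial sums settle on $1$ almost immediately, which at least confirms the target value.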

Arturo Magidin
  • 398,050
hawaii99
  • 375
  • 8
    @Caligirl11: Please try to learn some mark-up for posting your questions, rather than relying on others to make them readable. Also: you have posted five other questions, and have yet to accept any answer. If a prior question has been answered to your satisfaction, be sure to formally "accept" an answer (the one you find most helpful, best, etc.) by clicking on the checkmark next to the answer. – Arturo Magidin Apr 19 '11 at 00:42
  • 5
    Is this a homework question? It's very strange. – Qiaochu Yuan Apr 19 '11 at 00:45
  • 1
    Ok sure thank you. I'm sorry I'm new to this website. I wasn't aware of the checkmark system. I will do that from now on. – hawaii99 Apr 19 '11 at 00:55
  • 3
    Doesn't seem strange to me. Someone comes up to you with an infinite series. If you don't recognize it as the Taylor series for $\sin x$, evaluated at $x=\pi/2$, can you still contrive to prove it converges to $1$? – Gerry Myerson Apr 19 '11 at 00:59
  • @Gerry: I mean it's strange in the sense that most calculus textbooks I can think of would not teach the kind of thinking necessary to solve a problem like this (unless there's an easy solution I'm missing). – Qiaochu Yuan Apr 19 '11 at 01:27
  • 2
    Mosquito-nuker's solution: let $f(t)=\frac1{1+t^2}$ and $g(t)=\int_{-1}^t f(u)\,\mathrm du$; prove $g(1)=\frac{\pi}{2}$, compose the series for $\sin t$ and $g(t)$ ... you get the drift. – J. M. ain't a mathematician Apr 19 '11 at 01:52
  • @Qiaochu, yes, it would be strange to see this problem in a Calculus textbook (except, maybe, Apostol). I assumed it just came to caligirl whilst she was contemplating the sine function. – Gerry Myerson Apr 19 '11 at 01:59
  • Yes, so basically the problem is to show that the infinite series converges to $1$. It is easy to show that it converges by the ratio test; I'm just not sure how to show it converges to $1$. – hawaii99 Apr 19 '11 at 03:02
  • 21
    How are you defining $\pi$ here? What facts about $\sin(x)$ are you allowed to use? – Andrés E. Caicedo Apr 19 '11 at 05:26
  • @hawaii99 - If you don't mind taking the time to choose an acceptable answer (by clicking the "check mark" below the number of votes for that question), that would be appreciated. – The Chaz 2.0 Nov 09 '11 at 01:16

6 Answers

12

This is not really an answer, or at least not a simple one. In his book Differential and Integral Calculus, Edmund Landau defines $\sin x$ using the standard power series mentioned in the post. Then in the famous Landau style (very very carefully, full detail) he defines the cosine function, and establishes the familiar basic properties of sine and cosine. He shows that $\cos x=0$ has a least positive solution, and calls that number $\pi/2$. After all that, showing that $\sin(\pi/2)=1$ is very easy!

There is an English translation of Landau's book (Chelsea). Enough of the book is viewable on Google Books to allow reconstructing everything. I have not seen a detailed development of sine and cosine through power series anywhere else, though there must be some. However, Landau's work is of unparalleled clarity.

André Nicolas
  • 507,029
  • So does Rudin in the Prologue of his Real and Complex Analysis. – lhf Apr 19 '11 at 09:22
  • 3
    This is something of a cheat. It is easy to prove the Pythagorean theorem from the series definitions. For me the crux of the issue is to show that this definition of $\pi$ agrees with the usual one. – Qiaochu Yuan Apr 19 '11 at 21:24
  • What do you mean by the usual definition of $\pi$? I'm thinking you mean the ratio of circumference to diameter, which in this context can be written as an integral? – Michael Lugo Apr 20 '11 at 00:04
  • An outline of this approach can also found in Spivak's Calculus. – cch Apr 20 '11 at 07:29
  • and also in Whittaker & Watson's classic Modern Analysis text. – Mark Jun 30 '11 at 23:00
  • @Mark Schwarzmann: Thanks, for taking me back to good (but incomplete!) memories. – André Nicolas Jun 30 '11 at 23:19
11

This is an elaboration on Eelvex's answer.

The crux of the issue is that $\sin x$ (as defined by the power series above) is the unique solution of $y'' = -y$ satisfying $y(0) = 0, y'(0) = 1$. Similarly, $\cos x$ (as defined by the derivative of the power series of $\sin x$) is the unique solution of $y'' = -y$ satisfying $y(0) = 1, y'(0) = 0$. Moreover, as Eelvex hints at, for any $c$ the function $\sin (x + c)$ is also a solution to $y'' = -y$, and its initial conditions are $y(0) = \sin c, y'(0) = \cos c$, hence

$$\sin (x + c) = \sin c \cos x + \cos c \sin x.$$

Furthermore, we compute that the derivative of $\cos^2 x + \sin^2 x$ is identically zero, and it is equal to $1$ at $x = 0$, hence is identically equal to $1$ everywhere. It follows that $\sin x$ is bounded. Since $\cos x$ is positive for sufficiently small $x$ by inspection, we have that $\sin x$ is at least initially increasing, and by boundedness it attains a local maximum at some positive real $c_0$. This gives $\cos c_0 = 0$, hence $\sin c_0 = 1$ and

$$\sin (x + c_0) = \cos x.$$

The same argument applies to $\cos x$, giving $\cos (x + c_0) = \sin (x + 2c_0) = - \sin x$, hence $\sin (x + 4c_0) = \sin x$, hence $\sin x$ is a periodic function with period $4c_0$.

The remaining mystery is why $4c_0$ is equal to the circumference of the unit circle. Recall that for a parameterized curve $(x(t), y(t))$ with $0 \le t \le t_0$, the arc-length is

$$\int_0^{t_0} \sqrt{x'(t)^2 + y'(t)^2} \, dt.$$

Letting $x(t) = \cos t, y(t) = \sin t$ we have $x'(t)^2 + y'(t)^2 = 1$. Moreover $(x(t), y(t))$ parameterizes the unit circle, and by looking at what quadrant $(\cos t, \sin t)$ is in for $t$ slightly larger than $0, c_0, 2c_0, 3c_0, 4c_0$ we can conclude that it parameterizes the unit circle exactly once precisely when $t_0 = 4c_0$, from which the identity $4c_0 = 2 \pi$ follows.


Note that direct manipulation of the series was not really used here, although the fact that the series solves $y'' = -y$ was used extensively. This is really the crucial property of the series, so it's a more useful thing to work from anyway. The problem with directly manipulating the series is that at some point you have to get rid of the $\pi$. Doing this using geometry is much easier than using series manipulation (which seems unnecessarily hard to me; it's not even clear to me what definition of $\pi$ you could use in this situation that wouldn't be circular), and is more revealing anyway.
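To see this numerically (an illustrative check, not part of the argument; `cos_series` is just a throwaway helper name): bisecting for the least positive zero of the cosine power series recovers $c_0 = \pi/2$, and hence $4c_0 = 2\pi$.

```python
import math

def cos_series(x, terms=30):
    # cos x = sum_{n>=0} (-1)^n x^(2n) / (2n)!
    return sum((-1) ** n * x ** (2 * n) / math.factorial(2 * n) for n in range(terms))

# cos_series(1) > 0 and cos_series(2) < 0, so bisect on [1, 2]
lo, hi = 1.0, 2.0
for _ in range(60):
    mid = (lo + hi) / 2
    if cos_series(mid) > 0:
        lo = mid
    else:
        hi = mid
c0 = (lo + hi) / 2
print(c0, 4 * c0)  # c0 ~ 1.5707963..., 4*c0 ~ 6.2831853...
```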

Qiaochu Yuan
  • 419,620
7

Using the fact that the given series is alternating for small $x$, you have $\sin(0.7)<0.7<{1\over\sqrt{2}}$ and $\sin(0.8)>0.8-0.512/6>{1\over\sqrt{2}}$. (By estimating the fractions you can verify this without a pocket calculator.) It follows that there is an $\alpha\in\ ]0.7,0.8[\ $ with $\sin\alpha=\cos\alpha={1\over\sqrt{2}}$, whence $\sin(2\alpha)=1$. Now give $2\alpha$ the name ${\pi\over2}$.
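These two hand estimates are easy to double-check numerically (purely illustrative; the variable names are ad hoc):

```python
import math

# The alternating series gives sin(0.7) < 0.7 (first term alone)
# and sin(0.8) > 0.8 - 0.8**3/6 (first two terms).
upper_07 = 0.7
lower_08 = 0.8 - 0.8 ** 3 / 6   # = 0.8 - 0.512/6, about 0.71467
inv_sqrt2 = 1 / math.sqrt(2)    # about 0.70711
print(upper_07 < inv_sqrt2, lower_08 > inv_sqrt2)  # True True
```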

4

[ A sketch: ]

Let $f(x) = x - \frac{x^3}{3!} + \frac{x^5}{5!} + \cdots$. You can easily show that

$$\begin{eqnarray} f(x + c) & = & f(c) + f'(c)x + f''(c)\frac{x^2}{2!} + \cdots \quad(1) \\ f''(x) & = & -f(x) \quad(2)\\ f'(x) & = & 1 - \frac{x^2}{2!} + \frac{x^4}{4!} + \cdots \quad(3) \end{eqnarray}$$

Then it is straightforward to prove that for any $c$: $$f(x+c) = f'(c)f(x) + f(c)f'(x) \quad(4)$$

We can also prove that there is a $b$ such that $f'(b) = 0$. Then $(1),(4) \Rightarrow f(b) = 1$ and so $$f(x+b) = f'(x) \quad(5)$$

Going a little further, $(2),(5) \Rightarrow f(x+4b) = f(x)$, so $b$ is a quarter of the period.

But $f^2(x) + f'^2(x) = 1$ (all terms of the expansion except the first term of $f'(x)$ cancel out), so the period of $f$ is $2\pi$ and $$b = \frac{\pi}{2}$$
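As an illustrative numerical check of the identity $f^2(x)+f'^2(x)=1$ using truncated series (the helpers `f` and `fp` below are throwaway names):

```python
import math

def f(x, terms=25):
    # the sine series: x - x^3/3! + x^5/5! - ...
    return sum((-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
               for n in range(terms))

def fp(x, terms=25):
    # its term-by-term derivative: 1 - x^2/2! + x^4/4! - ...
    return sum((-1) ** n * x ** (2 * n) / math.factorial(2 * n)
               for n in range(terms))

for x in (0.0, 0.5, 1.3, 3.0):
    print(round(f(x) ** 2 + fp(x) ** 2, 12))  # 1.0 every time
```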

Eelvex
  • 1,414
  • 2
    Why does $f(x)^2 + f'(x)^2 = 1$ imply that the period of $f$ is $2\pi$? – Qiaochu Yuan Apr 19 '11 at 21:29
  • @Qiaochu: $f(x)^2 + f'(x)^2 = \mathrm{const}$ implies that $f(x), f'(x)$ are orthogonal and since $f''(x) = -f(x)$, $f$ has to have a period of $2\pi$ – Eelvex Apr 19 '11 at 23:36
  • I still do not see how this follows without an argument similar to the argument in my answer. You seem to be implicitly using some statement and I am not sure what that statement is. – Qiaochu Yuan Apr 20 '11 at 04:08
  • @Qiaochu: I'm not saying that a rigorous proof is not required but $f^2(x) + f'^2(x) = c$ is a sufficient condition for $f(x + T) = f(x), \min(T) = 2\pi$. (Well, at least for continuous etc $f$) – Eelvex Apr 20 '11 at 21:21
1

Use the identities for $\cos(a+b)$ and $\sin(a+b)$, which can be derived from the series (see Landau, as mentioned above). Let $a=\pi/2$ and $b=-x$.

Then apply $\cos^2 x + \sin^2 x = 1$ (which can be proved by differentiation) twice to get $\sin(\pi/2)=1$.

This is somewhat off-topic, but may be OK as a proof.

0

[edit: This is a poor answer - it takes for granted that we know $\sin(\pi/2) = 1$]

Instead of considering the series about 0, consider it about $ \dfrac{\pi}{2}$. Thus we get the following:

$$ \sin(x) = 1 - \dfrac{1}{2} \left(x - \dfrac{\pi}{2}\right)^2 + \cdots$$

But now when you plug in $ \dfrac{\pi}{2}$...

How's that?