7

I've seen Sine and Cosine defined as the unique solution to:

$$\begin{align} \frac d{dx} \sin(x) &= \cos(x)\\ \frac d{dx} \cos(x) &= -\sin(x) \end{align}$$

with $\sin(0) = 0$ and $\cos(0) = 1$.

Is there really only one solution to these equations? How can these functions be defined more formally?

Aiden Chow
    "Is there really only one solution to these functions?" YES, they are. "How can these functions be defined more formally?" For alternative equivalent definitions, see the wikipedia page https://en.wikipedia.org/wiki/Trigonometric_functions – Crostul Jul 10 '21 at 07:09
    Per "Calculus, Vol 1, 2nd Ed." (Tom Apostol): Axioms [1] Sine and Cosine functions are defined for all of $\Bbb{R}$. [2] $\cos(0) = \sin(\pi/2) = 1, \cos(\pi) = -1$. [3] $\cos(y-x) = \cos(y)\cos(x) + \sin(y)\sin(x)$. [4] For $\displaystyle 0 < x < \frac{\pi}{2}, ~0 < \cos(x) < \frac{\sin(x)}{x} < \frac{1}{\cos(x)}.$ ...see next comment – user2661923 Jul 10 '21 at 07:13
    Apostol then observes: [A] The traditional (i.e. Analytical Geometry) definitions for Sine and Cosine satisfy the premises if you re-define the functions so that the domain of the two functions are Real Numbers rather than angles. [B] All of the normal identities, including derivatives, are derivable from the premises. [C] There is an alternate ground up derivation via the Taylor Series (i.e. make the corresponding Sine and Cosine Taylor Series the definition(s), rather than a derivable result). – user2661923 Jul 10 '21 at 07:17
  • Given that there indeed is a unique solution, this definition is as formal as it can be. – lisyarus Jul 10 '21 at 07:27
    Spivak's Calculus contains definitions of $\cos$ and $\sin$ that do not rely on Taylor Series expansion, and that connect nicely with the unit circle definitions but are rigorous without relying on geometric arguments. These come in chapter 15 of his book (3rd ed.). I'd rather not try to reproduce them here in the comments. This is the gist, though the reasoning and motivation behind it may escape you if you haven't read the preceding material upon which it's based: https://www.quora.com/How-does-one-formally-define-sin-x-without-geometric-cheating-or-using-a-series – Ben Jul 10 '21 at 15:43
  • Note also, Spivak notes that $\sin$ and $\cos$ fulfill the differential equations you ask about, but he doesn't use those equations to define $\sin$ and $\cos$. – Ben Jul 10 '21 at 15:50
  • @Ben: On the other hand, if you want to obtain the elementary functions (i.e. exp,cos,sin) on $ℂ$ it seems to me most sensible to start with the Taylor series motivated by the differential equations but proven to actually satisfy them, as explained here. – user21820 Jul 10 '21 at 19:34
  • @user21820 Absolutely. The Spivak definition has fewer prerequisites, and is limited to $\mathbb{R}$. – Ben Jul 10 '21 at 20:33
  • @Ben: It's been a while since I looked inside Spivak, so I can't recall what exactly he used. But I just want to note that we do not need the general result of termwise differentiation of power series in order to prove the properties of $\exp$. (See the linked post from the above linked post.) So I would say that going straight via the series definition does not actually require much. =) – user21820 Jul 10 '21 at 20:46

3 Answers

10

Let's decouple the sine and cosine functions by using second derivatives: $\sin x$ is the unique solution of $y''(x) = -y(x)$ where $y(0) = 0$ and $y'(0) = 1$, and $\cos x$ is the unique solution of $y''(x) = -y(x)$ where $y(0) = 1$ and $y'(0) = 0$. I'll take for granted that they fit these conditions and focus on why such equations have just these solutions. In particular, I will not make any use of power series or any kind of infinite series representation at all to get the uniqueness result.
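
(For completeness, the check that they do fit these conditions is immediate from the first-order system in the question: $$ \frac{d^2}{dx^2}\sin x = \frac{d}{dx}\cos x = -\sin x, \qquad \sin 0 = 0, \qquad \frac{d}{dx}\sin x\Big|_{x=0} = \cos 0 = 1, $$ and similarly $\frac{d^2}{dx^2}\cos x = -\frac{d}{dx}\sin x = -\cos x$ with $\cos 0 = 1$ and $\frac{d}{dx}\cos x\big|_{x=0} = -\sin 0 = 0$.)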

Here is a proof for the sine function. Suppose $y''(x) = -y(x)$ where $y(0) = 0$ and $y'(0) = 1$. To prove $y(x) = \sin x$, let $f(x) = y(x) - \sin x$. We want to show $f(x) = 0$ for all $x$. Differentiating $f$ twice, we get $f''(x) = -f(x)$, $f(0) = 0$, and $f'(0) = 0$.

Let's consider the expression $F(x) = f(x)^2 + f'(x)^2$. This turns out to be constant since its derivative is $0$ (this is analogous to $\sin^2x + \cos^2 x$ being constant, namely $1$): $$ F'(x) = 2f(x)f'(x) + 2f'(x)f''(x) = 2f(x)f'(x) - 2f'(x)f(x) = 0. $$ Functions on $\mathbf R$ with derivative $0$ are constant, so $F(x) = c$ for some number $c$ (and all $x$). We have $F(0) = f(0)^2 + f'(0)^2 = 0^2 + 0^2 = 0$, so $c = 0$. Thus $F(x) = 0$ for all $x$, so $$ f(x)^2 + f'(x)^2 = 0. $$ These are real numbers, so $f(x) = 0$ for all $x$. Thus $y(x) = \sin x$.

A similar argument proves a more general result mentioned in the answer by xyz: if $y''(x) = -y(x)$ with $y(0) = a$ and $y'(0) = b$, then $y(x) = a\cos x + b\sin x$. The converse is easy (granting we know what $\sin x$ and $\cos x$ are, of course).
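
One way to spell out that "similar argument" for the general statement: set $f(x) = y(x) - a\cos x - b\sin x$, so that $$ f''(x) = -f(x), \qquad f(0) = y(0) - a = 0, \qquad f'(0) = y'(0) - b = 0. $$ The same quantity $F(x) = f(x)^2 + f'(x)^2$ then has $F'(x) = 0$ everywhere and $F(0) = 0$, so $f(x) = 0$ for all $x$, i.e. $y(x) = a\cos x + b\sin x$.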

I learned this slick argument from Spivak's Calculus. See Theorem 4 of Chapter 15 ("The Trigonometric Functions").

KCd
    I love that you included the book as well! – thegrinderguy Jul 10 '21 at 07:26
  • Are you sure that Spivak is the book that "develops all the basic properties of the sine and cosine functions from the viewpoint of differential equations". If I remember correctly, he introduces sine and cosine with a geometric (area-based) approach. But perhaps I haven't read that chapter in enough detail. – Joe Jul 10 '21 at 07:33
  • @Joe you are correct! I removed that comment from my answer. – KCd Jul 10 '21 at 07:37
  • +1. We also find the derivative of $\sin^2x+\cos^2x$ is $0$ so $\forall x,(\sin^2x+\cos^2x=\sin^20+\cos^20=1).$ From this, further consideration shows $\exists K>0,(\sin K=0\land \cos K=1),$ and hence $\sin (x+K),\cos (x+K)$ are solutions to the ODE. From the uniqueness of the solutions, therefore $\sin$ and $\cos$ are periodic with period $K.$ The least such $K>0$ is now called $2\pi.$ – DanielWainfleet Jul 10 '21 at 08:06
    @DanielWainfleet a bit more work would be needed to prove there is a least positive $K$ fitting those conditions. In any case, this is also an approach to proving the addition formulas for sine and cosine, e.g., for each number $c$, $\sin(x + c)$ and $(\sin x)(\cos c) + (\cos x)(\sin c)$ as functions of $x$ both satisfy $f'' = -f$ with equal values at $f(0)$ and $f'(0)$, so they are equal at all $x$. That is how Spivak proves the addition formulas for sine and cosine. – KCd Jul 10 '21 at 08:10
    @J.W.Tanner yes. I have made that correction. – KCd Mar 04 '24 at 19:54
2

I would guess that it is the unique solution to $x' = Ax$ where $x(0) =I$ and $A= \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}$.
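
One way to make the connection with the question explicit (this is just the standard computation of the matrix exponential, using $A^2 = -I$): $$ x(t) = e^{tA} = (\cos t)\,I + (\sin t)\,A = \begin{bmatrix} \cos t & -\sin t \\ \sin t & \cos t \end{bmatrix}, $$ and the first column $(\cos t, \sin t)$ satisfies exactly the first-order system in the question.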

copper.hat
2

If you really want a formal definition, you can take their Taylor expansions as the definition.
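
Concretely, the series in question are $$ \sin x = \sum_{n=0}^{\infty} \frac{(-1)^n x^{2n+1}}{(2n+1)!}, \qquad \cos x = \sum_{n=0}^{\infty} \frac{(-1)^n x^{2n}}{(2n)!}, $$ both of which converge for every real $x$, and differentiating term by term gives back the system in the question.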

Moreover, there is only one pair of such functions satisfying your requirements. Suppose $f'=-g$, $g'=f$, $f(0)=1$, and $g(0)=0$. Then we have $f''+f=0$. You can show that every solution of this equation has the form $f(x)=A\cos(x)+B\sin(x)$. Similarly, $g''+g=0$, so $g(x)=A'\cos(x)+B'\sin(x)$.

By your additional requirements, we have $g(0)=0$, i.e. $A'=0$ and $g(x)=B'\sin(x)$; and $f(0)=1$, i.e. $A=1$ and $f(x)=\cos(x)+B\sin(x)$. Also, since $f'=-g$ and $g'=f$, we have $f'(0)=-g(0)=0$ and $g'(0)=f(0)=1$. Thus $B=0$ and $B'=1$. Done.

Hint for showing that every solution of $f''+f=0$ has the form $A\cos(x)+B\sin(x)$: show that the derivative of $f(x)\cos(x)-f'(x)\sin(x)$ and the derivative of $f(x)\sin(x)+f'(x)\cos(x)$ are both identically $0$.
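
Spelling the hint out, using only the product rule and $f''=-f$: $$ \frac{d}{dx}\bigl(f\cos x - f'\sin x\bigr) = f'\cos x - f\sin x - f''\sin x - f'\cos x = -(f+f'')\sin x = 0, $$ $$ \frac{d}{dx}\bigl(f\sin x + f'\cos x\bigr) = f'\sin x + f\cos x + f''\cos x - f'\sin x = (f+f'')\cos x = 0. $$ So $f\cos x - f'\sin x = A$ and $f\sin x + f'\cos x = B$ for constants $A$ and $B$; multiplying the first identity by $\cos x$, the second by $\sin x$, and adding gives $f(x)=A\cos(x)+B\sin(x)$.
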

xyz