In the vector space of functions $f:\mathbb R \to \mathbb R$, how do I prove that the functions $\sin(x)$ and $\cos(x)$ are linearly independent? By definition, two elements of a vector space are linearly independent if $a\cos(x) + b\sin(x) = 0$ (for all $x$) implies that $a=b=0$, but how can I formalize that? By giving $x$ different values? Thanks in advance.
11 Answers
Hint: If $a\cos(x)+b\sin(x)=0$ for all $x\in\mathbb{R}$, then it is in particular true for $x=0$ and $x=\frac{\pi}{2}$.
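To make the hint concrete, here is a minimal SymPy sketch that plugs in both values at once and solves the resulting $2\times 2$ system (the symbols `a`, `b` are just the coefficients above):

```python
import sympy as sp

a, b = sp.symbols('a b')

# Evaluate a*cos(x) + b*sin(x) = 0 at x = 0 and x = pi/2 simultaneously.
eq_at_0 = sp.Eq(a * sp.cos(0) + b * sp.sin(0), 0)                     # reduces to a = 0
eq_at_pi_2 = sp.Eq(a * sp.cos(sp.pi / 2) + b * sp.sin(sp.pi / 2), 0)  # reduces to b = 0

print(sp.solve([eq_at_0, eq_at_pi_2], [a, b]))  # {a: 0, b: 0}
```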

But linear independence can't be deduced from $x=0$ alone, since $b$ could be any value; the same goes for $x=\frac{\pi}{2}$ and $a$. What am I missing? – V. Galerkin Jan 03 '13 at 12:20
@V.Galerkin: Why are you taking them one at a time?? $a\cos0+b\sin0=0$ and $a\cos\frac\pi2+b\sin\frac\pi2=0$ simultaneously. – Jan 03 '13 at 12:24
There are easier ways (e.g. take special values for $x$). The following technique is a sledgehammer in this case, but a useful one to have around.
Suppose you have $a$ and $b$ as required, and let $r=\sqrt{a^2+b^2}$. If $r\neq 0$, choose $\phi$ with $\sin\phi = \frac{a}{r}$ and $\cos\phi = \frac{b}{r}$ (for instance $\phi=\arctan\frac{a}{b}$ when $b>0$). Then we have: $$a \cos (x)+b\sin(x)=r\sin(\phi)\cos(x)+r\cos(\phi)\sin(x)=r\sin(x+\phi)$$
The last form cannot be identically zero when $r\neq 0$, so we must have $r=0$, which immediately implies $a=b=0$ from the definition of $r$.
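A quick numerical sanity check of this identity (a NumPy sketch; `arctan2(a, b)` picks the angle $\phi$ with $\sin\phi=a/r$ and $\cos\phi=b/r$):

```python
import numpy as np

a, b = 2.0, -3.0                    # any coefficients, chosen arbitrarily here
r = np.hypot(a, b)                  # r = sqrt(a^2 + b^2)
phi = np.arctan2(a, b)              # sin(phi) = a/r, cos(phi) = b/r

x = np.linspace(-10, 10, 1001)
lhs = a * np.cos(x) + b * np.sin(x)
rhs = r * np.sin(x + phi)
print(np.max(np.abs(lhs - rhs)))    # ~1e-15, i.e. the two sides agree
```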

Very nice! This is how you switch between trigonometric polynomials and phasors, right? – Giuseppe Negro Jan 03 '13 at 13:44
One way to show linear independence is to use the Wronskian of $f$ and $g$, denoted by $$W(f,g)(x):=\begin{vmatrix}f(x) & g(x)\\ f'(x) & g'(x)\end{vmatrix}=f(x)g'(x)-g(x)f'(x), \quad x\in I.$$
There is a classic theorem which says that if $f,g$ are differentiable on $I$ and $W(f,g)(x_0)\not=0$ for some $x_0\in I$, then $f$ and $g$ are linearly independent on $I$.
So, in your case, $$W(\sin x,\cos x)=\sin x\cdot (-\sin x)-\cos x\cdot \cos x=-\sin^2 x-\cos^2 x=-1\not=0 \text{ for every }x,$$ so $\sin x$ and $\cos x$ are linearly independent on any interval $I$.
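For what it's worth, a minimal SymPy sketch of the same computation, writing the Wronskian as a $2\times 2$ determinant:

```python
import sympy as sp

x = sp.symbols('x')
f, g = sp.sin(x), sp.cos(x)

# Wronskian W(f, g) = f*g' - g*f', computed as a 2x2 determinant.
W = sp.Matrix([[f, g], [sp.diff(f, x), sp.diff(g, x)]]).det()
print(sp.simplify(W))  # -1, nonzero everywhere
```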

I don't think the converse you state is a converse; is it not the contrapositive? – Jan 03 '13 at 12:01
This is an infinite-dimensional linear problem, because the space $\mathbb{R}^\mathbb{R}$ of real functions of a real variable is not finite-dimensional. Therefore you cannot directly use the typical tools of linear algebra: matrices, determinants and the like. To overcome this difficulty you can try sampling the two functions at some points (nodes, to use the technical jargon) to be determined. Indeed we can prove the following proposition.
Proposition. Let $f_1,\ldots, f_n \colon \mathbb{R}\to \mathbb{R}$. Suppose that there exist points $x_1,\ldots, x_n$ such that \begin{align} &(f_1(x_1), f_2(x_1), \ldots, f_n(x_1)) \\ &(f_1(x_2), f_2(x_2), \ldots, f_n(x_2)) \\ &\qquad\qquad\vdots\\ &(f_1(x_n), f_2(x_n), \ldots, f_n(x_n)) \end{align} are linearly independent vectors in $\mathbb{R}^n$. Then the functions $f_1,\ldots, f_n$ are linearly independent.
Proof. If the coefficients $a_1,\ldots, a_n\in \mathbb{R}$ are such that $$\tag{1}a_1 f_1+\ldots +a_nf_n=0, $$ then evaluating (1) at $x_1,\ldots, x_n$ we have \begin{align} a_1f_1(x_1) +\ldots +a_nf_n(x_1)&=0 \\ &\vdots\\ a_1f_1(x_n)+\ldots +a_nf_n(x_n)&=0 \end{align} which can be rewritten in matrix form \begin{equation} \begin{bmatrix} f_1(x_1) & \ldots & f_n(x_1) \\ \vdots& \ddots &\vdots \\ f_1(x_n) & \ldots & f_n(x_n) \end{bmatrix} \begin{bmatrix} a_1\\ \vdots \\ a_n \end{bmatrix} =0. \end{equation} By assumption the coefficient matrix is nonsingular, so this equation has only the trivial solution $(a_1,\ldots, a_n)=(0,\ldots, 0)$. This proves that $f_1,\ldots, f_n$ are linearly independent. $\square$
For the case at hand, we have two functions so $n=2$. Choosing nodes $x_1=0, x_2=\pi/2$ we get the sampled vectors $(1, 0), (0, 1)$, which clearly are linearly independent.
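For a concrete illustration, here is a small NumPy sketch of this sampling test for $n=2$: build the matrix $[f_j(x_i)]$ at the nodes $x_1=0$, $x_2=\pi/2$ and check that it has full rank.

```python
import numpy as np

funcs = [np.cos, np.sin]           # f_1, f_2
nodes = [0.0, np.pi / 2]           # x_1, x_2

# Sampling matrix M[i, j] = f_j(x_i).
M = np.array([[f(x) for f in funcs] for x in nodes])
print(M)                           # [[1, 0], [0, 1]] up to rounding error
print(np.linalg.matrix_rank(M))    # 2, so the sampled vectors are independent
```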

Is the converse also true? I mean, if $f_1,...f_n$ are linearly independent, then there exist points $x_1,...x_n$ such that the matrix you gave is nonsingular? – learner Feb 17 '17 at 15:38
@D... : it is false. Take for example the two linearly independent functions $f_1(x)= x$ and $f_2(x)=|x|$ – Giuseppe Negro Feb 19 '17 at 07:28
why do you say so? Take, for instance, $x_1 = -1$ and $x_2 = 1$. Then the matrix becomes $\begin{bmatrix} f_1(x_1) & f_2(x_1) \\ f_1(x_2) & f_2(x_2) \end{bmatrix} = \begin{bmatrix} -1 & 1 \\ 1 & 1 \end{bmatrix}$, which clearly has rank $2$. – learner Feb 20 '17 at 10:09
@D...: There is only a misinterpretation of the statement on my part. What I meant is that, even if $f_1\ldots f_n$ are linearly independent, there might exist nodes $x_1\ldots x_n$ such that the matrix $\begin{bmatrix} f_i(x_j)\end{bmatrix}_{1\le i,j\le n}$ is singular. Sorry about that. Concerning the statement of your previous comment, it should be true but frankly I cannot think of a proof off the top of my head. (It is indeed a kind of [determinantal rank](https://en.wikipedia.org/wiki/Rank_(linear_algebra)#Alternative_definitions), to quote Wikipedia.) – Giuseppe Negro Feb 20 '17 at 12:24
No problem and thank you for the update! I have actually created a separate topic with this question (link below) and someone else has come up with a proof that seems right to me. I'd be glad if you take a look and share your thoughts.
http://math.stackexchange.com/questions/2148972/matrix-rank-and-linear-independence-of-functions/2149041?noredirect=1#comment4420753_2149041
– learner Feb 21 '17 at 00:33
$$b\sin(x)=-a\cos(x)$$ The left side is odd and the right side is even. If the two sides agree for all $x$, then comparing the values at $x$ and $-x$ shows each side equals its own negative, so both sides are identically zero; this gives $b\sin(x)\equiv 0$ and $a\cos(x)\equiv 0$, hence $a=b=0$. (In general, an even function and an odd function can be equal everywhere only if both are identically zero.)
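The same even/odd splitting, as a small SymPy sketch: evaluating the combination at $x$ and $-x$ and averaging isolates the even part $a\cos(x)$ and the odd part $b\sin(x)$, each of which must vanish on its own.

```python
import sympy as sp

a, b, x = sp.symbols('a b x')
expr = a * sp.cos(x) + b * sp.sin(x)

# Even and odd parts of expr; if expr vanishes for all x, so do both parts.
even_part = sp.simplify((expr + expr.subs(x, -x)) / 2)  # a*cos(x)
odd_part = sp.simplify((expr - expr.subs(x, -x)) / 2)   # b*sin(x)
print(even_part, odd_part)
```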

The vector space you mention is $V = \{f : \mathbb R \to \mathbb R\}$. The elements $\sin x, \cos x$ also belong to the subspace $U = \{f : \mathbb R \to \mathbb R \;|\; \text{$f$ continuous, $2\pi$-periodic}\} \subset V$, which we give the usual inner-product $\langle f, g \rangle = \int_0^{2\pi} f g$. Using a double-angle identity and the fact that $\sin x$ is an odd function, $$ \langle \sin x, \cos x \rangle = \int_0^{2\pi} \sin(x) \cos(x) \, dx = \frac{1}{2} \int_0^{2\pi} \sin(2x) \, dx = 0, $$ so $\sin x$ and $\cos x$ are orthogonal, therefore linearly independent, in $U$ and hence also in $V$.
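A one-line SymPy check of this integral:

```python
import sympy as sp

x = sp.symbols('x')
print(sp.integrate(sp.sin(x) * sp.cos(x), (x, 0, 2 * sp.pi)))  # 0: sin and cos are orthogonal
```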

Does this work?
Assume that $\sin(x)$ and $\cos(x)$ are linearly dependent. Since neither is the zero function, each must be a scalar multiple of the other; in particular
$$\sin(x) = a \cos(x)$$
for some scalar $a$. But for $x=\pi/2$, we have
$$1 = a \cdot 0,$$
which is impossible. Thus $\sin(x)$ and $\cos(x)$ are linearly independent.
I have a feeling that there is some error in my reasoning here, please point it out if you see one :)

You are correct. You might also want to check this question https://math.stackexchange.com/questions/269668/linear-independence-of-sinx-and-cosx – popoolmica Nov 08 '20 at 21:26
Although I'm not confident about this, maybe you can use the power series for $\sin x$ and $\cos x$? I'm working on a similar exercise, but mine restricts both functions to the interval $[0,1]$.
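That route does work: the Taylor expansion of $a\cos x + b\sin x$ at $0$ starts with $a$ (constant term) and $b$ (linear term), so if the combination vanishes identically both coefficients must be $0$. A minimal SymPy sketch:

```python
import sympy as sp

a, b, x = sp.symbols('a b x')
expansion = sp.series(a * sp.cos(x) + b * sp.sin(x), x, 0, 4)
print(expansion)  # a + b*x - a*x**2/2 - b*x**3/6 + O(x**4)
```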

@wordsthatendinGRY Isn't it giving a hint for the answer? And the user can't comment yet. – M. Vinay Jun 07 '14 at 01:57
I think that you should verify that $a\cos(x)+b\sin(x)=0$ for every $x$ implies $a=b=0$.

$\cos(x)$ and $\sin(x)$ cannot be both equal to zero at the same time, so it implies that $a=b=0$ – Alan Simonin Jan 03 '13 at 11:37
@V. Galerkin: FYI, a neat way to see why Alan Simonin's comment is true ($\cos x$ and $\sin x$ cannot both be zero for the same value of $x$) is that if it wasn't true, then the identity ${\cos}^{2}x + {\sin}^{2}x = 1$ would not hold for that value of $x.$ – Dave L. Renfro Jan 03 '13 at 16:03
@Alan Simonin: It seems to me more is needed to justify $a=b=0$ than the fact that $\cos x$ and $\sin x$ cannot both be zero for the same value of $x.$ For example, consider $(a \cdot 3)+(b \cdot 6) = 0,$ where $3$ and $6$ are considered as constant functions of $x.$ In this example, $3$ and $6$ cannot both be zero for the same value of $x$ (since neither can be zero for any value of $x$), but $a=6$ and $b = -3$ shows that $a = b = 0$ does not automatically follow. – Dave L. Renfro Jan 03 '13 at 16:18
@DaveL.Renfro Yes, but in this case, 3 and 6 are not linearly independent – Alan Simonin Jan 03 '13 at 17:24
@Alan Simonin: I realize $3$ and $6$ are not linearly independent, but I don't see the relevance here. To prove the linear independence of $\cos x$ and $\sin x,$ we don't want to use the fact that they're linearly independent to prove they're linearly independent. – Dave L. Renfro Jan 03 '13 at 17:51
If $\cos(x)$ and $\sin(x)$ are linearly independent, then this implies that
$a\cos(x)+b\sin(x)=0$ (for all values of $x$) if and only if $a=b=0$.
But!! For $x=\pi/4$ we have $\cos(\pi/4)=\sin(\pi/4)=\frac{\sqrt{2}}{2}$, and for $a=1$ and $b=-1$ we get $(1)\cdot\frac{\sqrt{2}}{2}+(-1)\cdot\frac{\sqrt{2}}{2}=\frac{\sqrt{2}}{2}-\frac{\sqrt{2}}{2}=0$. Therefore: $\cos(x)$ and $\sin(x)$ cannot be linearly independent.
I know what I said is false; $\cos(x)$ and $\sin(x)$ are actually linearly independent. But using the classical method gives me this seeming anomaly. I don't have problems with polynomials. Hmmm – Geoff Ross Feb 27 '15 at 04:00
When working with a vector space of functions, such as $\mathcal{F}(\mathbb{R},\mathbb{R})$, you must show that $a\cos(x) + b\sin(x) = 0$ for all $x$ only if $a=b=0$. It doesn't matter if there's an $x$ where your idea fits; you must show that it works for all $x$ only when the scalars are all $0$. – Sep 28 '17 at 02:33