2

I have a question regarding limits.

Recently in a math class, my teacher stated that $\frac{\sin x}{x}$ goes to $1$ as $x \to 0$, and hence that $\lim_{x\to 0} \frac{x}{\sin x}$ is also $1$.

Why is that so? Shouldn't the answer be $0$ in this case?

weejing
  • 21
  • 1
  • Two important points: $\qquad(1)$ The most important reason for introducing limits in calculus is to deal with cases where the numerator and denominator both approach $0$. If the limit were always $0$ in such cases, then derivatives would always be $0$. $\qquad(2)$ This result and some others like it are correct ONLY if radians are used, rather than degrees or some other units, and that's the major reason why radian measure is important in calculus. The derivative of sine is a constant times cosine, and the constant is $1$ only if radians are used. (If degrees are used, then it's $\pi/180$.) – Michael Hardy Jan 25 '16 at 03:36
  • It might be useful to explain why you think the answer should be 0. – ThisIsNotAnId Jan 25 '16 at 06:24
  • You might draw a chart of $x/\sin x$ near zero just using data from a pocket calculator, and then explain what you think the limit is. – gnasher729 Jan 25 '16 at 07:20
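Gnasher729's calculator experiment can be carried out with a minimal Python sketch (an illustration, not part of any answer) tabulating the ratio near $0$:

```python
import math

# Tabulate x / sin(x) as x approaches 0 (x in radians).
# The values visibly approach 1, matching the claimed limit.
# With x measured in degrees, the ratio would instead approach
# 180/pi -- Michael Hardy's point about why radians matter here.
for x in [0.5, 0.1, 0.01, 0.001, 0.0001]:
    print(f"x = {x:<7} x/sin(x) = {x / math.sin(x):.10f}")
```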

4 Answers

4

hint: $\dfrac{x}{\sin x} = \dfrac{1}{\dfrac{\sin x}{x}}$
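Spelling the hint out: since $\lim_{x\to 0} \frac{\sin x}{x} = 1 \neq 0$, the limit passes to the reciprocal,
$$\lim_{x\to 0} \frac{x}{\sin x} = \frac{1}{\lim_{x\to 0} \frac{\sin x}{x}} = \frac{1}{1} = 1.$$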

DeepSea
  • 77,651
0

Hint: Look up the squeeze theorem or L'Hospital's rule.
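For the squeeze-theorem route, a sketch using the standard unit-circle bounds: for $0 < |x| < \frac{\pi}{2}$,
$$\cos x \;\le\; \frac{\sin x}{x} \;\le\; 1,$$
and since $\cos x \to 1$ as $x \to 0$, the middle term is squeezed to $1$; taking reciprocals then gives $\lim_{x\to 0} \frac{x}{\sin x} = 1$.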

clocktower
  • 1,447
  • 3
    L'Hopital's rule gives quick answers but not insight, and what's worse: this limit is used in proving that $(d/dx)\sin x=\cos x$, which is then used when L'Hopital's rule is applied to this problem, so there is a logically fallacious circular argument. $\qquad$ – Michael Hardy Jan 25 '16 at 03:32
  • You're right--I had not considered that here. – clocktower Jan 25 '16 at 03:33
0

There are many ways to go about this. One which may help shed light on "why" the limit tends to $1$ is that for $x \approx 0$, $$\sin(x) \approx x.$$ You can see this with the Taylor series expansion of sine.
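Concretely, using the standard Maclaurin series for sine,
$$\sin x = x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots,$$
so for $x \neq 0$,
$$\frac{x}{\sin x} = \frac{1}{1 - \frac{x^2}{6} + \frac{x^4}{120} - \cdots} \;\longrightarrow\; 1 \quad \text{as } x \to 0.$$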

EDIT: Thanks to JLA for pointing out the circularity concern in the comments below.

You may also verify this by graphing the sine function for inputs close to 0 and observing that the graph looks linear with slope 1.

  • 3
    How does one derive the Taylor series for sine without already knowing this fact? – JLA Jan 25 '16 at 04:54
  • @JLA To derive this Taylor series you need to know $\sin(0) = 0$ and $\cos(0) = 1$ and the $n^{\text{th}}$ derivatives of sine (which the OP admittedly probably doesn't know at this point). However, this was mostly meant to give a reasonably accessible theoretical basis for the claim. – ThisIsNotAnId Jan 25 '16 at 06:01
  • @JLA That said, I still agree with your concern, and I've edited the reply to reflect it. Thank you for pointing it out. – ThisIsNotAnId Jan 25 '16 at 06:18
0

Given what you stated you know, namely that $\lim_{x \to 0} \frac{\sin(x)}{x} = 1$, I think what you are missing is the following property of limits.

If $f$ and $g$ are functions such that $\lim_{x \to a} f(x)$ and $\lim_{x \to a} g(x)$ exist, and moreover $\lim_{x \to a} g(x) \neq 0$ then $$\lim_{x \to a} \frac{f(x)}{g(x)} = \frac{ \lim_{x \to a} f(x)}{\lim_{x \to a} g(x)}.$$

In your case, $g(x) = \frac{\sin(x)}{x}$, $f(x) = 1$, and $a=0$ (as the hint in DeepSea's answer points out).
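Both limits on the right exist here ($\lim_{x \to 0} 1 = 1$ and $\lim_{x \to 0} \frac{\sin x}{x} = 1 \neq 0$), so the property applies and gives
$$\lim_{x \to 0} \frac{x}{\sin x} = \frac{1}{1} = 1.$$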

mlg4080
  • 1,805