I've been reviewing some simple calculus-style material, but I think my knowledge is incomplete.
Suppose we are trying to calculate $\displaystyle\lim_{x \rightarrow 0} \frac{f(x)}{g(x)}$, where for our purposes $f$ and $g$ are smooth.
A useful approach is to Taylor expand the numerator and denominator in the hope of removing any interfering terms that prevent evaluating the limit directly; see, for example: Using Taylor Expansion to evaluate limits
What I don't understand about this method is to what order we should expand. In the linked example, a third-order expansion of $\sinh(x)$ is taken, giving an $O(x^{13})$ term, and a second-order expansion of $\sin(x)$ is taken to get another $O(x^{13})$ term (after some truncation of higher-order terms from the binomial expansion, I think?).
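To check that I have the basic mechanics right, here is a simpler example I worked through myself (so it is not from the linked post, and my choice of orders may not be the "canonical" one):
$$\lim_{x \rightarrow 0} \frac{x - \sin(x)}{x^3} = \lim_{x \rightarrow 0} \frac{x - \left(x - \frac{x^3}{6} + O(x^5)\right)}{x^3} = \lim_{x \rightarrow 0} \left(\frac{1}{6} + O(x^2)\right) = \frac{1}{6}.$$
Here, expanding $\sin(x)$ only to first order would have left $\frac{O(x^3)}{x^3}$, which doesn't determine the limit, so apparently I need to expand far enough that the leading surviving term appears explicitly.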
Does this mean I am aiming for something like $\displaystyle\lim_{x \rightarrow 0} \frac{r(x) + O(x^p)}{q(x) + O(x^p)}$ for polynomials $r, q$, where $p \in \mathbb{N}$ is allowed to be anything I want, provided I can "match" the asymptotic terms?
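Concretely, my guess at the rule (which may well be wrong) is something like: if, say, $r(x) = q(x) = \frac{x^3}{6}$, then
$$\frac{\frac{x^3}{6} + O(x^5)}{\frac{x^3}{6} + O(x^5)} = \frac{1 + O(x^2)}{1 + O(x^2)} \rightarrow 1 \quad \text{as } x \rightarrow 0,$$
so any $p$ large enough that the error terms are dominated by the leading surviving powers of $r$ and $q$ should do.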
In the case of $\displaystyle\lim_{x \rightarrow c}$, is the procedure the same, but with Taylor expansions centered around $c$? Out of curiosity, can this be extended somehow to $\displaystyle\lim_{x \rightarrow \pm \infty}$?
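For the $x \rightarrow \pm \infty$ case, my own (possibly naive) attempt would be to substitute $t = 1/x$ and expand around $t = 0$, for instance:
$$\lim_{x \rightarrow \infty} x\left(e^{1/x} - 1\right) = \lim_{t \rightarrow 0^+} \frac{e^{t} - 1}{t} = \lim_{t \rightarrow 0^+} \frac{t + O(t^2)}{t} = 1.$$
Is that the right idea, or is there a more direct way to use asymptotic expansions at infinity?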