
Is there a way to skip every second term in the power series representation of $\sinh{x}$ and $\cosh{x}$ and adjust the other terms accordingly (approx.)?

So, instead of

$$\sinh{x} \approx x + \frac{x^3}{3!} + \frac{x^5}{5!} + \frac{x^7}{7!} + \frac{x^9}{9!} + \dots + \frac{x^{2n+1}}{(2n+1)!}$$

the following:

$$\sinh{x} \approx a_1 x + a_5\frac{x^5}{5!} + a_9 \frac{x^9}{9!} + \dots + a_{4n+1}\frac{x^{4n+1}}{(4n+1)!}$$

and accordingly, instead of

$$\cosh{x} \approx 1 + \frac{x^2}{2!} + \frac{x^4}{4!} + \frac{x^6}{6!} + \frac{x^8}{8!} + \dots + \frac{x^{2n}}{(2n)!}$$

the following:

$$\cosh{x} \approx a_0 + a_4\frac{x^4}{4!} + a_8 \frac{x^8}{8!} + \dots + a_{4n}\frac{x^{4n}}{(4n)!}$$

The approximations do not need to be exact at $x = 0$.

  • $(\sinh x + \sin x)/2$ – Will Jagy May 15 '14 at 20:28
  • @Will -- one of us seems to be interpreting the question wrong, and now you have me wondering whether it's me or you. Yeah, if you use a different function, you can get that power series, but if you are looking for a different way to represent $\sinh x$, no, you can't skip terms without messing up derivatives. – user128390 May 15 '14 at 20:37
  • @user128390, right. I have no idea what the question is about, so I just put this. My dim memory is that you can do any approximation you want on a bounded closed interval and optimize your constants by, say, least squared error or some integral. – Will Jagy May 15 '14 at 20:42
  • @WillJagy I'm looking for a way to do this, say for x = 100...3000, but not numerically. I know how to fit a polynomial to the functions by least squares, but I'm trying to avoid this. – Robert Seifert May 15 '14 at 20:46
  • So you are thinking that the question is about polynomial interpolation or approximation using least squares, not power series and Taylor expansions? – user128390 May 15 '14 at 20:46
  • @user128390, it seems to be about avoiding the standard methods as well. I do not think there is a task in this that can actually be accomplished. – Will Jagy May 15 '14 at 20:51
  • @WillJagy The problem is that I'm not very familiar with what you call "standard methods" - if it's just not possible what I want, it's also a conclusion ;) – Robert Seifert May 15 '14 at 20:53

3 Answers


Series expansions around any particular $x = a$ are unique. Your two examples would represent different expansions around $x = 0$, so this would not work.

If you'd like a bit more of a hands-on explanation, consider that the difference between two different representations of the same function must be the zero function (f(x) = 0 for all x), yet the difference between these two series will be a series with non-zero coefficients, which (pardon my lack of rigour in this statement) will itself be a non-zero function.
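A quick numerical sketch of this point (Python, with all the $a_k$ set to $1$ for concreteness; any fixed choice behaves similarly): the difference between the full series and the skipped one is itself a series with non-zero coefficients, so it is visibly non-zero away from $x = 0$.

```python
import math

def full(x, N=6):
    # Standard Maclaurin partial sum of sinh: x + x^3/3! + ... + x^11/11!
    return sum(x**(2*k + 1) / math.factorial(2*k + 1) for k in range(N))

def skipped(x, N=3):
    # The proposed series with every second term dropped: x + x^5/5! + x^9/9!
    return sum(x**(4*k + 1) / math.factorial(4*k + 1) for k in range(N))

x = 1.0
diff = full(x) - skipped(x)  # = x^3/3! + x^7/7! + x^11/11!, non-zero for x != 0
print(diff)
```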


In your representation, how would you represent the third derivative of $\sinh x$ at $x=0$? The value of this derivative at $x=0$ needs to be 1, because $\frac{d^3}{dx^3}\sinh x = \cosh x$ and $\cosh 0 = 1$, but your representation gives the value 0. The Maclaurin expansion is the way it is because it has to be that way in order to get the right derivatives -- you can't just pick and choose terms to include in the power series.
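A small sketch of this point (Python; the finite-difference step size is an illustrative choice): numerically estimating the third derivative at $0$ shows the skipped series (here with all coefficients set to $1$) cannot reproduce $\sinh'''(0) = 1$.

```python
import math

def third_derivative(f, x, h=1e-2):
    # Central-difference approximation of f'''(x)
    return (f(x + 2*h) - 2*f(x + h) + 2*f(x - h) - f(x - 2*h)) / (2 * h**3)

def skipped(x):
    # The proposed series, coefficients all 1: x + x^5/5! + x^9/9!
    return x + x**5 / math.factorial(5) + x**9 / math.factorial(9)

d3_sinh = third_derivative(math.sinh, 0.0)  # close to 1, since sinh'''(0) = cosh(0) = 1
d3_skip = third_derivative(skipped, 0.0)    # close to 0: no x^3 term, so f'''(0) = 0
print(d3_sinh, d3_skip)
```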

  • you're the one who interpreted the question right. I get your point. Assuming x > 100 and the derivative at x = 0 is not important, are there any approximations? – Robert Seifert May 15 '14 at 20:42
  • @thewaywewalk, if you get the book by Abramowitz and Stegun, and their references, there are plenty of approximations over bounded intervals...but $0$ to $100$ with exponential growth seems really optimistic. – Will Jagy May 15 '14 at 20:45
  • The farther $x$ is away from 0 (for Maclaurin, or the center point for a more general Taylor series), the more terms you will need to get anywhere near the right answer. With Maclaurin (center at 0) and looking for values when $x>100$, you will need a HUGE number of terms for any reasonable approximation. – user128390 May 15 '14 at 20:51
  • Well, I'm finally looking for a way to approximate $\frac{\tanh\sqrt[4]{s}}{\sqrt[4]{s}}$ with a rational function $\frac{a_0 + a_1 s + a_2 s^2 + \dots + a_n s^n}{b_0 + b_1 s + b_2 s^2 + \dots + b_m s^m}$ - there are polynomials which can fit this function with R = 99.9 for x = 0 ... >3000. I'm just trying to collect ideas on how I could find the coefficients of the polynomials without numerical error minimization. – Robert Seifert May 15 '14 at 20:51
  • @thewaywewalk, why not consider coefficients obtained through numerical minimization? For example, $$\frac{\tanh s^{1/4}}{s^{1/4}} \approx \frac{0.997273+14.6292 s+3.99571 s^2+0.0408273 s^3}{1+17.0104 s+7.52183 s^2+0.183415 s^3}$$ agrees pretty well for $0 < x < 100$. – Antonio Vargas May 16 '14 at 22:54
  • @AntonioVargas Yes, I know. I just hoped to find something similar to this answer, which works for $\tanh{\sqrt{z}}$ but not for $\tanh{\sqrt[4]{z}}$. – Robert Seifert May 17 '14 at 09:01
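A quick check of the rational approximation quoted in the comment above (Python sketch; the sample points and tolerance are illustrative choices):

```python
import math

def f(s):
    # Target function tanh(s^(1/4)) / s^(1/4)
    r = s ** 0.25
    return math.tanh(r) / r

def rational(s):
    # Coefficients quoted from the comment above (numerical fit for 0 < s < 100)
    num = 0.997273 + 14.6292*s + 3.99571*s**2 + 0.0408273*s**3
    den = 1 + 17.0104*s + 7.52183*s**2 + 0.183415*s**3
    return num / den

# Relative error at a few sample points stays below about 1%
errs = [abs(rational(s) - f(s)) / f(s) for s in (0.5, 1.0, 10.0, 100.0)]
print(max(errs))
```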

At a particular value of $x$, yes you can. For instance, to approximate $\sinh{(\frac12)}$ by the series $\sum_{n=0}^{N}a_{4n+1}\frac{x^{4n+1}}{(4n+1)!}$, choose $a_1=1$, $a_5=1+\frac{4\cdot 5}{(1/2)^2}$, and in general $a_{4n+1}=1+\frac{(4n+1)!}{(4n-1)!\,(1/2)^2}$, so that the resulting sum equals the Maclaurin series! The problem here is that every time you change the value of $x$ where you are approximating the function, you're going to need to calculate new values for the coefficients $a_k$. The marvelous thing about approximating a function by its Maclaurin/Taylor series, though, is that you only have to find one set of values for the coefficients and you can reuse them for any value of $x$ at all (so long as it's within the radius of convergence).
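A small sketch of this trick (Python; it uses the coefficient formula $a_{4n+1} = 1 + \frac{4n(4n+1)}{x^2}$, where the leading $1$ keeps the term $\frac{x^{4n+1}}{(4n+1)!}$ itself and the second part absorbs the skipped term $\frac{x^{4n-1}}{(4n-1)!}$), verifying that these choices reproduce the Maclaurin partial sum at $x = \frac12$:

```python
import math

x = 0.5

# Standard Maclaurin partial sum of sinh through x^9/9!
maclaurin = sum(x**(2*k + 1) / math.factorial(2*k + 1) for k in range(5))

# Skipped series with coefficients absorbing the omitted terms;
# these values are only valid at this particular x
a = {1: 1.0}
for n in (1, 2):
    a[4*n + 1] = 1 + (4*n) * (4*n + 1) / x**2

skipped = sum(a[k] * x**k / math.factorial(k) for k in (1, 5, 9))

print(abs(skipped - maclaurin))  # essentially zero (floating-point round-off)
```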

David H