Given $f:\mathbb{R}^{> 0} \rightarrow \mathbb{R}$ such that:
a) $f$ is $C^2$ in its domain.
b) $f$, $f'$, and $f''$ have finite limits as $x \rightarrow 0^+$.
I want to extend $f$ to nonpositive values of $x$ (really just to a neighborhood of the origin) so that it remains $C^2$. Does the following construction work?
Let $\lim_{x\rightarrow 0^+} f = a$, $\lim_{x\rightarrow 0^+} f' = b$, $\lim_{x\rightarrow 0^+} f'' = c$, and define $g(x)$ to be equal to $f(x)$ if $x>0$ and to $a+bx+cx^2/2$ if $x \leq 0$.
In particular, how can I prove (if it is true) that $g$ as defined is twice continuously differentiable at $x=0$?
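This is not a proof, but the construction can at least be sanity-checked numerically. The sketch below takes the concrete (hypothetical) example $f(x)=e^x$ on $x>0$, so $a=b=c=1$, builds $g$ as defined above, and compares one-sided difference quotients at the origin against $b$ and $c$:

```python
import math

# Example choice (an assumption, not from the question): f(x) = exp(x),
# whose value and first two derivatives all tend to 1 as x -> 0+.
a, b, c = 1.0, 1.0, 1.0

def g(x):
    # The proposed extension: f itself on x > 0, its degree-2 Taylor
    # polynomial a + b*x + c*x**2/2 on x <= 0.
    return math.exp(x) if x > 0 else a + b * x + c * x**2 / 2

h = 1e-6

# One-sided difference quotients for g' at 0; both should approach b.
right = (g(h) - g(0)) / h
left = (g(0) - g(-h)) / h

# Symmetric second difference quotient straddling 0; should approach c.
second = (g(h) - 2 * g(0) + g(-h)) / h**2

print(right, left, second)
```

Agreement of the quotients from both sides is of course only evidence, not the mean-value-theorem argument a proof would need.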
My motivation (you can ignore this part, but comments on it are welcome)
It seems to me that given a series $\sum_{n=1}^\infty a_n$ whose terms $a_n$ (so far a function of the positive integers) can be extended to a $C^2$ function $f(x)$ on $\mathbb{R}^{>0}$, at least for sufficiently large $x$, it should be possible to establish the convergence of the series from the behavior of $f$ and its first two derivatives "at infinity." More precisely, defining $f^*(x)=f(1/x)$, I think that if $f^*$ and its first derivative tend to $0$ as $x \rightarrow 0^+$, and its second derivative has a finite limit, then the series converges, and I am trying to prove this.
My idea for a proof is to show that $|f^*|$ is bounded by something of the form $kx^2$ for sufficiently small positive $x$. In this case, $\sum_{n=1}^\infty a_n = \sum_{n=1}^\infty f^*(1/n)$ would, for sufficiently large $n$, be bounded term-by-term by $\sum k/n^2$, which converges. To prove this bound I would like access to Taylor's theorem at the origin, but I don't have that unless $f^*$ is actually twice differentiable at the origin. This is why I want to create the extension I asked about above. If you have another, simpler route to my desired result (or a counterexample!) I'd be interested too.
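As an illustration of the intended bound (a hypothetical example, not part of the question): take $a_n = 1-\cos(1/n)$, so $f^*(x)=1-\cos x$ with $f^*(0)=0$, $(f^*)'(0)=0$, and $(f^*)''(0)=1$. Taylor's theorem gives $1-\cos x \le x^2/2$, so the series should converge by comparison with $\sum 1/(2n^2)$. A quick numerical check:

```python
import math

# f*(x) = 1 - cos(x): value and first derivative vanish at 0,
# second derivative tends to 1, matching the hypotheses above.
def fstar(x):
    return 1 - math.cos(x)

# Check the Taylor bound f*(x) <= x**2 / 2 on a range of small x
# (tiny epsilon allows for floating-point cancellation in 1 - cos).
ok = all(fstar(x) <= x**2 / 2 + 1e-15 for x in (10**-k for k in range(1, 8)))

# Partial sum of the series vs. the comparison series sum 1/(2 n^2).
N = 100_000
partial = sum(fstar(1 / n) for n in range(1, N + 1))
bound = sum(1 / (2 * n * n) for n in range(1, N + 1))

print(ok, partial, bound)
```

Here the term-by-term comparison is exact, so the partial sums stay below those of $\sum 1/(2n^2)$, consistent with the proposed proof strategy.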