I am working on a practice problem and there is a step in the solution that deals with the application of the mean value theorem (MVT) in a Taylor series. The problem asks for a condition on $f''(x)$ s.t. $\{(x,y)\in\mathbb{R}^2:y\ge f(x)\}$ is convex, where $f:\mathbb{R}\rightarrow\mathbb{R}$ is twice differentiable.
Truncating the Taylor series after the second-order term and applying $y\ge f(x)$, as given in the problem statement, gives $$y\ge f(x)\approx f(a)+f'(a)(x-a)+f''(a)\frac{(x-a)^2}{2!},$$ but the solution instead proceeds to the following step, instructing the reader to use the MVT: $$y\ge f(x)=f(a)+f'(a)(x-a)+f''(a^*)\frac{(x-a)^2}{2!},\quad a^*\in(a,x).$$ This seems to indicate that $f''(a)=f''(a^*)$, but since $a^*\in(a,x)$ rather than $a^*\in[a,x]$, I don't see how this can be true. My understanding of the MVT simply says that if $f$ is differentiable on $(a,b)$ and continuous on $[a,b]$, then there exists $c\in(a,b)$ such that $f'(c)=\frac{f(b)-f(a)}{b-a}$.
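To make my confusion concrete, I tried a quick numerical check (the choice $f(x)=x^3$ and the points $a=0$, $x=1$ are my own example, not from the problem): the exact-remainder form $f(x)=f(a)+f'(a)(x-a)+f''(a^*)\frac{(x-a)^2}{2}$ can be solved for $a^*$, and the resulting $a^*$ lies strictly inside $(a,x)$ while $f''(a^*)\ne f''(a)$:

```python
# Check Taylor's theorem with the Lagrange (MVT) form of the remainder,
# using f(x) = x**3 as a concrete example (my choice, not from the solution):
#   f(x) = f(a) + f'(a)(x - a) + f''(a*) (x - a)**2 / 2,   a* in (a, x)
def f(x):   return x**3
def fp(x):  return 3 * x**2   # f'
def fpp(x): return 6 * x      # f''

a, x = 0.0, 1.0

# The remainder the second-order term must account for exactly:
remainder = f(x) - f(a) - fp(a) * (x - a)

# Solve f''(a*) (x - a)**2 / 2 = remainder for a*; here f''(t) = 6t, so
# a* = 2 * remainder / ((x - a)**2 * 6)
a_star = 2 * remainder / ((x - a) ** 2 * 6)

print(a_star)                 # strictly between a and x
print(fpp(a), fpp(a_star))    # f''(a*) differs from f''(a)
```

So the expansion with $f''(a^*)$ holds exactly for some interior $a^*$, even though $f''(a)\ne f''(a^*)$ here.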
Am I incorrect in my understanding of this application of the MVT?