I'm really struggling with this question. I've thought about it for a long time but I can't seem to figure it out. I know I am supposed to apply the Mean Value Theorem on $[0,h]$ for $h>0$ and on $[h,0]$ for $h<0$.
Attempt: I applied the Mean Value Theorem on both of these intervals and got:
There exists $c$ such that $f'(c)=\frac{f(h)-f(0)}{h}$, and there exists $d$ such that $f'(d)=\frac{f(0)-f(h)}{-h}$. $f'(c)$ clearly equals $f'(d)$, which implies $c=d=0$. Thus $f'(0)$ exists. I'm not sure if this is correct, but how would I conclude that it equals $L$?
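To write the two applications of the Mean Value Theorem out more explicitly (this is my reading of them; presumably $c$ and $d$ depend on $h$):
$$\text{for } h>0:\quad \exists\, c=c(h)\in(0,h) \text{ such that } f'(c(h))=\frac{f(h)-f(0)}{h},$$
$$\text{for } h<0:\quad \exists\, d=d(h)\in(h,0) \text{ such that } f'(d(h))=\frac{f(0)-f(h)}{0-h}=\frac{f(h)-f(0)}{h}.$$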
I know that there is a theorem saying: if $f\in C[a,b]$, $f$ is differentiable on $(a,b)$, and $\lim\limits_{x \to a^+} f'(x)$ exists in $\mathbb{R}$, then $f'(a)$ exists and equals that limit.
I think the fact given in the question, that $\lim\limits_{x \to 0} f'(x) = L$, would then imply by this theorem that $f'(0) = L$, and hence we are finished.
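To make sure I am reading that theorem correctly: taking $a=0$ (together with its mirror version for an interval to the left of $0$) it would give
$$f'_+(0)=\lim_{x\to 0^+}f'(x)=L \qquad\text{and}\qquad f'_-(0)=\lim_{x\to 0^-}f'(x)=L,$$
where $f'_{\pm}(0)$ denote the one-sided derivatives of $f$ at $0$, so both one-sided derivatives would equal $L$ and hence $f'(0)=L$.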
Is this correct? I'm not sure whether I did it correctly, but I have given my best attempt. Any help would be much appreciated.