Let $f(a)=c$ and $f(b)=d$, and start at $x=a$. If $f$ increased at the constant rate $\frac{d-c}{b-a}$ over the whole interval of length $b-a$, we would arrive at exactly $f(b)=d$ at $x=b$.
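In symbols, the constant-rate comparison I have in mind (the name $\ell$ is just a label I'm introducing here) is
$$\ell(x) = c + \frac{d-c}{b-a}\,(x-a), \qquad \ell(a)=c, \quad \ell(b)=d.$$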
Now suppose the derivative is greater than $\frac{d-c}{b-a}$ at $x=a$. If it stayed greater for the whole interval, we'd find that $f(b)>d$. So if we want $f(b)=d$, the derivative must take a value less than or equal to $\frac{d-c}{b-a}$ at some point in the interval. Since the derivative is continuous, it can't jump across that value, so it has to equal $\frac{d-c}{b-a}$ at least at one point.
Similarly, if $f'(a)<\frac{d-c}{b-a}$, then for $f(b)$ to equal $d$ the derivative must take a value greater than or equal to $\frac{d-c}{b-a}$ at some point in the interval. So again, the derivative has to equal $\frac{d-c}{b-a}$ at least at one point in the interval. (And if $f'(a)=\frac{d-c}{b-a}$ exactly, we're done right away.)
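To spell out the step I'm relying on: write $m=\frac{d-c}{b-a}$ for the average slope, and assume (as in my argument above) that $f'$ is continuous on $[a,b]$. In both cases $f'$ takes a value on each side of $m$, so
$$f'(x_1)\ge m \ \text{ and } \ f'(x_2)\le m \ \text{ for some } x_1,x_2\in[a,b] \;\Longrightarrow\; f'(\xi)=m=\frac{f(b)-f(a)}{b-a} \ \text{ for some } \xi\in[a,b],$$
by the Intermediate Value Theorem applied to $f'$.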
The Mean Value Theorem for integrals is similar:
Let $A=c(b-a)$, where $c$, $b$ and $a$ are constants, and let $A$ also be the area under a continuous function $f$ over the interval $[a,b]$. If $f(a)>c$, then $f$ can't remain greater than $c$ for the whole interval, because then we'd end up with an area greater than $A$. So $f$ has to take a value less than or equal to $c$ at some point. Since $f$ is continuous, it must equal $c$ at some point. (The case $f(a)<c$ is symmetric, and if $f(a)=c$ we're done immediately.)
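Restating that step in symbols, with $A=\int_a^b f(x)\,dx$ and $c=\frac{A}{b-a}$ (the average value of $f$): if $f(x)>c$ for every $x$ in $[a,b]$, then
$$\int_a^b f(x)\,dx > c\,(b-a) = A,$$
a contradiction. So $f$ takes a value $\le c$ somewhere, and by the symmetric case a value $\ge c$ somewhere, hence by the Intermediate Value Theorem $f(\xi)=c$ for some $\xi\in[a,b]$.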
Is my proof correct?