I was wondering if anyone can help me with this. If $f(x)$ is a periodic function with period $T$, then it satisfies $$\int_{0}^{T}f(x)\,dx=\int_{a}^{a+T}f(x)\,dx$$ for all $a \in \Bbb R$. It is clear that this must be true, but if you differentiate both sides with respect to $T$, do you not get $$f(T)=f(T+a),$$ and so, because $$f(T+a)=f(a),$$ this implies $$f(T)=f(a)$$ for all $a \in \Bbb R$? But does this not imply that $f$ is constant? I am struggling to understand what is going wrong.
- The relation holds only if $T$ is a period of $f$, not for arbitrary $T$. Therefore you cannot differentiate with respect to $T$. – Martin R May 05 '15 at 09:54
- Related – Caleb Stanford Jul 17 '16 at 16:38
1 Answer
The equality $\int_{0}^{T}f(x)\,dx=\int_{a}^{a+T}f(x)\,dx$ holds for all $a \in \Bbb R$, but only for those values of $T$ that are a period of $f$.
A derivative with respect to $T$ is only defined if the identity holds for all $T$ in an open interval, not just at the isolated values that happen to be periods of $f$. So what your argument actually proves is: if the set of periods of $f$ contains an open interval, then $f$ is constant.
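To see concretely that the identity only holds when $T$ is a period, here is a small numerical sketch (my own illustration, using $f = \sin$ with the true period $2\pi$ and a simple trapezoidal rule):

```python
import math

def integral(f, lo, hi, n=10000):
    # composite trapezoidal rule approximation of the integral of f over [lo, hi]
    h = (hi - lo) / n
    s = 0.5 * (f(lo) + f(hi))
    for i in range(1, n):
        s += f(lo + i * h)
    return s * h

f = math.sin
T = 2 * math.pi  # a genuine period of sin

# the identity holds for every shift a when T is a period of f
for a in (0.0, 0.7, 3.1, -2.5):
    lhs = integral(f, 0, T)
    rhs = integral(f, a, a + T)
    assert abs(lhs - rhs) < 1e-6

# but for an arbitrary T that is not a period, it fails
T_bad = 1.0
lhs = integral(f, 0, T_bad)          # 1 - cos(1)  ≈ 0.460
rhs = integral(f, 0.7, 0.7 + T_bad)  # cos(0.7) - cos(1.7) ≈ 0.894
assert abs(lhs - rhs) > 1e-3
```

Since the identity fails as soon as $T$ moves off the discrete set of periods, there is no open interval of valid $T$ values, and so no derivative with respect to $T$ to take.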

Martin R