Fourier says that any periodic function can be represented as an infinite sum of sine functions with appropriate periods, amplitudes, and phases. My question is: is it possible to represent a periodic function as a sum of other periodic functions, such as square waves or some other shape, or must they be sine functions?
-
Well, you could decompose a function down to infinitesimally thin bars, similar to a Riemann sum. However, if your basis is not orthogonal (and sinusoids are orthogonal), it might be computationally difficult to transform them. – Gappy Hilmore Jun 05 '15 at 10:25
-
You may want to look at this list on Wikipedia or into harmonic analysis. – Dan Robertson Jun 05 '15 at 10:26
1 Answer
For representing: yes. And it is actually pretty easy.
Let $(\psi_j(x))$ be any sequence of (not necessarily continuous!) bounded periodic functions that converges uniformly to $\sin(x)$, and let $\phi_j = \psi_{j+1} - \psi_j$. Then it is easy to see that we can decompose functions relative to the $\phi_j$ and their higher-frequency rescalings.
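As a concrete (hypothetical) instance of this construction, take $\psi_j$ to be a step-function quantization of $\sin$, which converges uniformly with error at most $2^{-j}$; then $\sin$ is recovered as the telescoping sum $\psi_1 + \sum_j \phi_j$:

```python
import numpy as np

# A step-function (piecewise-constant) approximation psi_j of sin,
# and the differences phi_j = psi_{j+1} - psi_j from the answer above.
x = np.linspace(0, 2 * np.pi, 1000)

def psi(j):
    # Quantize sin to steps of height 2**-j: uniform error < 2**-j.
    return np.floor(2**j * np.sin(x)) / 2**j

def phi(j):
    return psi(j + 1) - psi(j)

# Telescoping: psi(1) + phi(1) + ... + phi(19) == psi(20),
# so the partial sum is within 2**-20 of sin everywhere.
partial = psi(1) + sum(phi(j) for j in range(1, 20))
assert np.max(np.abs(partial - np.sin(x))) <= 2**-20
```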
In fact, when it comes to periodic square integrable functions, a Fourier transform is nothing more than a representation of the function in a predetermined frame.
You should regard this from the point of view of linear algebra: the Fourier transform is obtained by first finding a basis (whose elements are themselves functions) of the vector space of square integrable functions, and then decomposing vectors (each of which is, in this case, itself a function) relative to that basis.
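To make the "decomposing relative to a basis" step concrete, here is a small numerical sketch: the sine-series coefficients of a square wave are just inner products against the sine basis, and they match the known expansion $\frac{4}{\pi}\sum_{n \text{ odd}} \frac{\sin(nx)}{n}$:

```python
import numpy as np

# Sample a square wave on one period and compute its sine-series
# coefficients b_n = (1/pi) * integral of f(x) sin(n x) dx,
# i.e. inner products against the (orthogonal) sine basis.
x = np.linspace(0, 2 * np.pi, 100_001)
dx = x[1] - x[0]
f = np.sign(np.sin(x))          # square wave

def sine_coeff(n):
    # Riemann-sum approximation of the L^2 inner product.
    return np.sum(f * np.sin(n * x)) * dx / np.pi

# Odd harmonics come out close to 4/(pi*n); even harmonics vanish.
for n in range(1, 6):
    print(n, sine_coeff(n))
```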
So why do we do Fourier analysis with trigonometric functions? (Rather than $\sin$ and $\cos$, you should think $\exp (in x)$.)
They happen to enjoy some nice properties:
- This basis is orthonormal. Given an orthonormal basis, to find out the coefficient of a vector in that basis we just need to take an inner product. For arbitrary bases or frames, this step becomes much more complicated (you need to formally "invert" the change of basis matrix and apply the inverse matrix to the vector; this inverse matrix can be really difficult to compute).
- These basis functions are eigenfunctions of the derivative operator (with eigenvalues $in$). This makes the representation very useful for solving differential equations.
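The first point, about orthonormality, is easy to see already in finite dimensions. In an orthonormal basis a coefficient is a single inner product, whereas for a non-orthogonal basis you must solve a linear system with the Gram matrix (the "inverted change of basis" mentioned above). A minimal sketch in $\mathbb{R}^2$, with made-up vectors:

```python
import numpy as np

# Decompose v in a NON-orthogonal basis {b1, b2}: inner products
# alone give the wrong answer; you must solve with the Gram matrix
# G[i, j] = <b_i, b_j>.
b1 = np.array([1.0, 0.0])
b2 = np.array([1.0, 1.0])        # not orthogonal to b1
v = np.array([3.0, 2.0])

G = np.array([[b1 @ b1, b1 @ b2],
              [b2 @ b1, b2 @ b2]])
rhs = np.array([b1 @ v, b2 @ v])  # raw inner products: [3, 5]
coeffs = np.linalg.solve(G, rhs)  # true coefficients: [1, 2]

# The solved coefficients (not the raw inner products) reconstruct v:
assert np.allclose(coeffs[0] * b1 + coeffs[1] * b2, v)
```

For an orthonormal basis, $G$ is the identity, and the inner products alone already give the coefficients.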
It turns out that if you are faced with a mathematical problem where the operator involved is not the usual derivative operator, but something else, you can check to see if some version of the spectral theorem applies. Just like the finite dimensional spectral theorem you may have seen in a course in linear algebra, with the help of the spectral theorem you can pick out a suitable orthonormal basis (which is different from just $\exp (inx)$) which you can use to represent your function.
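The finite-dimensional case mentioned above can be sketched in a few lines: for a symmetric operator, the spectral theorem gives an orthonormal eigenbasis, and decomposing a vector in it works exactly like taking Fourier coefficients (the matrix here is an arbitrary example):

```python
import numpy as np

# Finite-dimensional spectral theorem: a symmetric matrix has an
# orthonormal eigenbasis, so a vector decomposes via inner products
# with the eigenvectors -- its "Fourier coefficients" relative to A.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # symmetric example operator
eigvals, Q = np.linalg.eigh(A)   # columns of Q: orthonormal eigenvectors

v = np.array([1.0, 3.0])
coeffs = Q.T @ v                 # inner products with the eigenbasis

# Reconstruct v from its spectral coefficients:
assert np.allclose(Q @ coeffs, v)
# Applying A just scales each coefficient by its eigenvalue,
# the analogue of d/dx acting as multiplication by in:
assert np.allclose(A @ v, Q @ (eigvals * coeffs))
```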
In short, which basis (or frame) to use for representing functions should be determined by the problem you are trying to solve.
Incidentally, this point of view is very useful in, among other things, developments in physics. Quantum states are essentially these types of basis vectors.
