I think the most important role of decomposing a function into even and odd parts is to simplify expressions and reduce calculations.
Integrals:
For example, if $f(x)$ is an odd function and you have to integrate
$$I = \int_{-a}^{a} f(x) \ dx$$
you know at once that it equals zero, because $f$ is odd:
$$I = \int_{-a}^{0} f(x) \ dx + \int_{0}^{a} f(x) \ dx = -\int_{0}^{a}f(y) \ dy + \int_{0}^{a} f(x) \ dx = 0$$
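A quick numerical sanity check of this (a sketch using a simple midpoint rule; `integrate` is my own helper and $f(x) = x^3$ is just an example odd function):

```python
def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: x**3           # an example odd function
I = integrate(f, -2.0, 2.0)  # symmetric interval [-a, a]
print(abs(I) < 1e-9)         # the integral vanishes up to rounding
```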
You can do something similar for even functions: instead of computing the integral over two intervals, you compute twice the integral over one of them:
$$\int_{-a}^{a} g(x) \ dx = 2 \cdot \int_{0}^{a} g(x) \ dx$$
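The half-interval formula halves the work. With the same midpoint-rule sketch as above ($g(x) = \cos x$ is just an example even function), the half-interval version uses the same step size but half the function evaluations:

```python
import math

def integrate(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

g = math.cos                                    # an example even function
full = integrate(g, -1.0, 1.0)                  # n evaluations
half = 2.0 * integrate(g, 0.0, 1.0, n=50_000)   # same step size, n/2 evaluations
print(abs(full - half) < 1e-6)                  # same answer, half the cost
```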
Finding roots, minimum and maximum:
Suppose you are dealing with an even function $g(x)$ and want to find the roots of $g$ on the interval $\left[a, \ b\right]$ with $a < 0 < b$.
Instead of searching the whole domain, you can search in $\left[0, \ \max(-a, \ b)\right]$: once you find a root $r$, then $(-r)$ will also be a root.
The same idea applies if you are searching for the minimum or the maximum of a function, which reduces to finding the roots of its derivative (and the derivative of an even function is odd).
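As a sketch of searching only the non-negative half: scan $[0, \max(-a, b)]$ for sign changes, refine each by bisection, and mirror every root found. The function $g(x) = x^2 - 1$ on $[-2, 3]$ is a made-up example, and the helper names are mine:

```python
def bisect(g, lo, hi, tol=1e-12):
    """Bisection refinement: assumes g(lo) and g(hi) have opposite signs."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0

def roots_of_even(g, a, b, n=1000):
    """Roots of an even g on [a, b] (a < 0 < b), searching only x >= 0."""
    m = max(-a, b)
    xs = [m * i / n for i in range(n + 1)]
    found = []
    for x0, x1 in zip(xs, xs[1:]):
        if g(x0) * g(x1) < 0:        # sign change brackets a root
            r = bisect(g, x0, x1)
            if r <= b:               # keep it if it lies in [a, b]
                found.append(r)
            if -r >= a:              # its mirror image is also a root
                found.append(-r)
    return sorted(found)

print(roots_of_even(lambda x: x * x - 1.0, -2.0, 3.0))  # roots near -1 and 1
```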
Computer graphics:
In computer graphics, if you know there is symmetry with respect to some axis (which corresponds to an even function), instead of computing the color of every pixel on both sides, you can compute one side and mirror it, reducing the computational cost.
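A toy sketch of that mirroring (the shading function, its cost, and the image size are made up; only the right half of each row is actually computed):

```python
def expensive_shade(x, y):
    """Stand-in for a costly per-pixel computation that is even in x."""
    return (x * x + y) % 7

def render(width, height):
    """Render a width x height image symmetric about the central column."""
    c = width // 2                          # axis of symmetry (odd width)
    image = [[0] * width for _ in range(height)]
    for row in range(height):
        for col in range(c, width):         # shade the right half only...
            value = expensive_shade(col - c, row)
            image[row][col] = value
            image[row][2 * c - col] = value  # ...and copy to the mirror pixel
    return image

img = render(5, 3)
# every row reads the same left-to-right as right-to-left
print(all(row == row[::-1] for row in img))
```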
Linear algebra:
The same idea appears beyond functions. For example, any matrix $A$ can be decomposed into a symmetric matrix $D$ (related to even) and an antisymmetric matrix $W$ (related to odd):
$$A = D + W$$
$$D = \dfrac{1}{2}\left(A + A^{T}\right)$$
$$W = \dfrac{1}{2}\left(A - A^{T}\right)$$
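A minimal sketch of this decomposition with plain Python lists (the function names are mine; the symbols follow the formulas above):

```python
def transpose(A):
    """A^T for a square matrix given as a list of rows."""
    return [list(row) for row in zip(*A)]

def symmetric_part(A):
    """D = (A + A^T) / 2"""
    At = transpose(A)
    n = len(A)
    return [[(A[i][j] + At[i][j]) / 2 for j in range(n)] for i in range(n)]

def antisymmetric_part(A):
    """W = (A - A^T) / 2"""
    At = transpose(A)
    n = len(A)
    return [[(A[i][j] - At[i][j]) / 2 for j in range(n)] for i in range(n)]

A = [[1.0, 2.0],
     [4.0, 3.0]]
D = symmetric_part(A)       # D equals its own transpose
W = antisymmetric_part(A)   # W equals minus its transpose
# check A = D + W entrywise
print(all(A[i][j] == D[i][j] + W[i][j] for i in range(2) for j in range(2)))
```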
There are some applications of this decomposition, as shown here.