
Consider the integral: $$\int_0^1 \frac{\sin(\pi x)}{1-x} dx$$ I want to do this via power series and obtain an exact solution.

In power series, I have $$\int_0^1 \left( \sum_{n=0}^{\infty} (-1)^n \frac{(\pi x)^{2n+1}}{(2n+1)!} \cdot \sum_{m=0}^{\infty} x^m \right)\,dx$$ My question is: how do I multiply these summations together? I have searched online, but in every case I found, the series were simply truncated to get an approximation.
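As a reference point for whatever exact expression comes out, here is a quick numerical evaluation of the integral (a rough Python sketch; the use of `scipy.integrate.quad` and the guard at $x=1$, where the singularity is removable with limit $\pi$, are my own choices, not part of the problem):

```python
import math
from scipy.integrate import quad

# The integrand sin(pi*x)/(1-x) has a removable singularity at x = 1
# (its limit there is pi), so guard that point before handing it to quad.
def integrand(t):
    return math.pi if t == 1.0 else math.sin(math.pi * t) / (1 - t)

value, abs_error = quad(integrand, 0, 1)
print(value)  # roughly 1.85194
```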

Many thanks

CAF
  • Would you settle for expanding $1/(1-x)$ in a power series, and working out $\int_0^1 x^k\sin(\pi x)\,dx$? – Gerry Myerson Jun 05 '13 at 13:46
  • $$\left(\sum_{k=0}^\infty a_k \right) \left(\sum_{k=0}^\infty b_k \right) = \sum_{n=0}^\infty \sum_{k=0}^n a_k b_{n-k}$$ – gt6989b Jun 05 '13 at 13:46
  • @GerryMyerson +1, an interesting idea :) – gt6989b Jun 05 '13 at 13:47
  • @gt6989b How did you obtain your result? – CAF Jun 05 '13 at 13:49
  • @GerryMyerson That is the first thing I tried, but I kept having to do multiple integrations by parts - I'll try again, though. Should that be $$\sum_{k=0}^{\infty} \int_0^1 x^k \sin(\pi x)\, dx?$$ – CAF Jun 05 '13 at 13:52
  • You should be able to find a formula for $\int_0^1 x^k\sin(\pi x)\,dx$ by induction (a recurrence along these lines is sketched after these comments). – Gerry Myerson Jun 05 '13 at 13:59
  • @CAF Please see the answer I wrote up... – gt6989b Jun 05 '13 at 14:04
  • Let $$I = \int_0^1 x^n \sin(\pi x)\, dx$$ By using integration by parts a first time, I obtain: $$I = \left[\sin(\pi x)\, \frac{x^{n+1}}{n+1}\right]_0^1 - \frac{\pi}{n+1} \int_0^1 \cos(\pi x)\, x^{n+1}\, dx$$ Again, I get $$\left[ \frac{x^{n+1}}{\pi} \sin(\pi x) \right]_0^1 - \frac{n+1}{\pi} \int_0^1 x^n \sin(\pi x)\, dx$$ The part in brackets is zero and the integral is $I$, so then: $$I = -\frac{n+1}{\pi}I,$$ which doesn't make sense. – CAF Jun 05 '13 at 14:09
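The recurrence hinted at in these comments can be obtained by integrating by parts twice, lowering the power of $x$ each time (rather than undoing the first step): with $I_k = \int_0^1 x^k \sin(\pi x)\,dx$, one gets $I_k = \frac{1}{\pi} - \frac{k(k-1)}{\pi^2} I_{k-2}$ for $k \ge 2$, with $I_0 = 2/\pi$ and $I_1 = 1/\pi$. That derivation is mine rather than anything stated above, so here is a small sympy sketch that checks it against direct symbolic integration:

```python
import sympy as sp

x = sp.symbols('x')
pi = sp.pi

def I_direct(k):
    # Exact value of int_0^1 x^k sin(pi x) dx via symbolic integration.
    return sp.integrate(x**k * sp.sin(pi * x), (x, 0, 1))

def I_rec(k):
    # Candidate recurrence from two integrations by parts (my derivation,
    # not taken from the comments): I_k = 1/pi - k(k-1)/pi^2 * I_{k-2}.
    if k == 0:
        return 2 / pi          # int_0^1 sin(pi x) dx
    if k == 1:
        return 1 / pi          # int_0^1 x sin(pi x) dx
    return 1 / pi - k * (k - 1) / pi**2 * I_rec(k - 2)

for k in range(8):
    assert sp.simplify(I_direct(k) - I_rec(k)) == 0
print("recurrence agrees with direct integration for k = 0..7")
```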

1 Answer


Let's take a more abstract case: multiplying $\sum_{n=0}^\infty a_n$ and $\sum_{n=0}^\infty b_n$. Note that in the resulting product, we will have a term $a_i b_j$ for every possible pair $i,j \in \mathbb{N}$.

One way to make this compact is to sum along diagonals. Think of the integer lattice in the first quadrant of $\mathbb{R}^2$. Drawing the diagonals (the origin, then the line $x+y=1$, then $x+y=2$, etc.), note that the one along the line $x+y=n$ contains $n+1$ lattice points, and at each of them the two coordinates sum to $n$: they are $(n,0),(n-1,1),\ldots,(k,n-k),\ldots,(0,n)$. So we can renumber the summation along these diagonals, getting

$$ \left(\sum_{k=0}^\infty a_k\right) \left(\sum_{k=0}^\infty b_k \right) = \sum_{n=0}^\infty \sum_{j,k\text{ along } x+y=n} a_k b_j = \sum_{n=0}^\infty \sum_{k=0}^n a_k b_{n-k}. $$
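Applied to the question's integral, this rearrangement is easy to sanity-check numerically. The sketch below (floating point only; the truncation order `M` is an arbitrary illustrative choice) convolves the coefficients of the sine series with those of the geometric series exactly as in the formula above, then integrates term by term over $[0,1]$; the result matches the quadrature value of roughly $1.85194$.

```python
import math

M = 60  # truncation order -- large enough that the neglected tail is negligible

# a_m: coefficients of sin(pi*x) as a plain power series (zero for even m).
a = [0.0] * (M + 1)
for n in range(M + 1):
    m = 2 * n + 1
    if m > M:
        break
    a[m] = (-1) ** n * math.pi ** m / math.factorial(m)

# b_k: coefficients of 1/(1-x) = sum x^k, i.e. all ones.
b = [1.0] * (M + 1)

# Cauchy product: c_n = sum_{k=0}^n a_k * b_{n-k}.
c = [sum(a[k] * b[n - k] for k in range(n + 1)) for n in range(M + 1)]

# Term-by-term integration over [0, 1]: each c_n x^n contributes c_n / (n + 1).
series_value = sum(cn / (n + 1) for n, cn in enumerate(c))
print(series_value)  # roughly 1.85194, matching direct quadrature of the integral
```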

gt6989b
  • That's nice - but how do you prove it algebraically? – CAF Jun 05 '13 at 14:19
  • @CAF I don't understand. Each $a_ib_j$ term is counted in both sides exactly once, and there are no other terms present. – gt6989b Jun 05 '13 at 15:16
  • Sorry, I meant how do you prove it without resorting to the above geometric reasoning with the equidistant straight lines. – CAF Jun 05 '13 at 15:32
  • @CAF Algebraically, you count the terms on both sides: the LHS contains $a_i b_j$ exactly once for each distinct pair $(i,j)$, and no other terms. Similarly, the RHS includes exactly one $a_i b_j$ term for each pair $(i,j)$: if $i+j = N$, such a term can only occur in the outer-sum term with $n = N$, and in that inner summation it is used exactly once (a brute-force check of this counting is sketched below). – gt6989b Jun 05 '13 at 18:42
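For completeness, here is a tiny brute-force check of that counting over a finite range of indices (the bound `N` below is an arbitrary choice):

```python
from collections import Counter

# Enumerate the index pairs visited by the diagonal summation, i.e. (k, n - k)
# for n = 0..N and k = 0..n, and confirm every pair (i, j) with i + j <= N
# is visited exactly once.
N = 50
visits = Counter((k, n - k) for n in range(N + 1) for k in range(n + 1))

assert all(count == 1 for count in visits.values())
assert len(visits) == (N + 1) * (N + 2) // 2  # lattice points with i + j <= N
print(f"each pair (i, j) with i + j <= {N} appears exactly once")
```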