So, I'm not an expert in this subject, but I know that $1+2+3+\dots=-\dfrac{1}{12}$ isn't a statement about 'real' summation; it comes from the Riemann zeta function. However, I was watching a maths video and noticed that the expression:
$$ \frac{x(x+1)}{2} $$
... is exactly the formula for the partial sums of the series $1+2+3+\dots$, where $x$ plays the role of $n$ and the value of the expression is the sum of the series up to $n$. So you can conclude that:
$$ \sum_{i=1}^{n} i = 1+2+3+\dots+n = \frac{n(n+1)}{2} $$
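(As a quick sanity check, taking $n=4$:

$$ 1+2+3+4 = 10 = \frac{4\cdot 5}{2} $$

so the closed form really does match the partial sums.)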
However, this is where it gets weird. As you have probably guessed, the roots of the polynomial are $x=0$ and $x=-1$, and between them the graph dips below the $x$-axis. If I integrate between the roots, from $-1$ to $0$, I get the following:
$$ \int_{-1}^{0} \frac{x(x+1)}{2}\:dx=-\frac{1}{12} $$
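Just to spell out the computation (nothing fancier than the standard antiderivative):

$$ \int_{-1}^{0} \frac{x^2+x}{2}\,dx = \left[\frac{x^3}{6}+\frac{x^2}{4}\right]_{-1}^{0} = 0 - \left(-\frac{1}{6}+\frac{1}{4}\right) = -\frac{1}{12} $$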
So, my question is: why is this the case? What connection is there between the value of the integral of this graph below the $x$-axis and the summation of the series?