Since the density function $f(x)$ is nonnegative, the integral formula for the expectation is really the difference of two integrals with nonnegative integrands (and hence
nonnegative value):
$$E[X] = \int_{-\infty}^{\infty} xf(x)\mathrm dx
= \int_0^{\infty} xf(x)\mathrm dx - \int_{-\infty}^0 \vert x\vert f(x)\mathrm dx.
$$
When both integrals are finite, their difference is finite too. If one of the
integrals diverges while the other is finite, then some people say
$E[X]$ exists but is infinite ($+\infty$ or $-\infty$, as the case may be) while others deny the existence
of $E[X]$ and say that $E[X]$ is undefined. (Perhaps this is
why many theorems in probability avoid ambiguity by
restricting themselves to random variables
with finite means instead of random variables whose means
exist.) If both integrals
diverge, then the integral formula for $E[X]$ gives a result
of the form $\infty - \infty$ and
everybody agrees that $E[X]$ is undefined.
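For a concrete illustration of the second case, take the density $f(x) = x^{-2}$ for $x \geq 1$ and $f(x) = 0$ otherwise. Then
$$\int_0^{\infty} xf(x)\mathrm dx = \int_1^{\infty} \frac{1}{x}\mathrm dx = \infty
\quad\text{while}\quad \int_{-\infty}^0 \vert x\vert f(x)\mathrm dx = 0,
$$
so some would write $E[X] = +\infty$ while others would say that $E[X]$ does not exist.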
In summary, if $\int \vert x \vert f(x) dx$ is finite, then
$\int x f(x) dx$ is also finite, and the value of the latter
integral is called the expectation or
expected value or mean of the random variable $X$ and denoted as
$E[X]$, that is,
$$E[X] = \int_{-\infty}^{\infty} x f(x) dx.$$
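For instance, with the standard exponential density $f(x) = e^{-x}$ for $x \geq 0$ (and $0$ otherwise), the check succeeds:
$$\int_{-\infty}^{\infty} \vert x \vert f(x) dx = \int_0^{\infty} x e^{-x} dx = 1 < \infty,$$
and so $E[X] = 1$.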
Added Note: To my mind, the difference between saying that
"$E[X] = \int xf(x) dx$ if the integral is finite" (as
Sami wants to) and
"$E[X] = \int xf(x) dx$ if
$\int |x|f(x)\mathrm dx$ is finite"
is that the second statement reminds the
casual reader to check something instead of jumping to
unwarranted conclusions. Many students have mistakenly
calculated that a
Cauchy random variable with density $[\pi(1+x^2)]^{-1}$
has expected value $0$ on the grounds that the integrand $x\cdot[\pi(1+x^2)]^{-1}$
in the integral for $E[X]$ is an odd function, and the integral
is over an interval symmetric about the origin. But they
would have discovered the error of their ways if they had carefully
checked whether
$$\int_{-\infty}^{\infty} \vert x \vert \frac{1}{\pi(1+x^2)} dx
= 2 \int_0^{\infty} x\frac{1}{\pi(1+x^2)} dx
$$
is finite.
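Carrying out that check, an antiderivative of $x/(1+x^2)$ is $\tfrac12\ln(1+x^2)$, and so
$$2 \int_0^{\infty} x\frac{1}{\pi(1+x^2)} dx
= \frac{1}{\pi}\ln(1+x^2)\Big\vert_0^{\infty} = \infty.
$$
Thus both the positive and negative parts of the integral for $E[X]$ diverge, and the Cauchy distribution has no mean, not even in the generous $\pm\infty$ sense.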