The Fourier series only exists for periodic functions which are integrable over a period. You can choose an interval and consider the periodic extension of $\frac{1}{x}$ over that interval, but if that interval contains $0$ (even as an endpoint), the function will not be integrable over it.
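To make this concrete: near $0$, the integral of $\left|\frac{1}{x}\right|$ diverges, since for any $\varepsilon > 0$,
$$\int_\delta^\varepsilon \frac{dx}{x} = \ln\varepsilon - \ln\delta \to \infty \quad \text{as } \delta \to 0^+,$$
so no interval touching $0$ can give a finite integral.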
At risk of cluttering what was a simple and to-the-point answer, I'd like to address some of the comments this answer has gotten over the years.
It's important to note that, when I say Fourier series, I mean the full Fourier series (not a sine or cosine series), and without making use of the Cauchy principal value. For such a series to exist, all of its coefficients must exist, and since $a_0 \propto \int_L f$, integrability of the function over the chosen period is a necessary condition for the series to exist.
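For reference, with period $2L$ the standard coefficients are
$$a_n = \frac{1}{L}\int_{-L}^{L} f(x)\cos\frac{n\pi x}{L}\,dx, \qquad b_n = \frac{1}{L}\int_{-L}^{L} f(x)\sin\frac{n\pi x}{L}\,dx,$$
so in particular $a_0$ requires $\int_{-L}^{L} f$ to exist as an ordinary (Lebesgue or improper Riemann) integral.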
This can be circumvented by using the above modifications. The Cauchy principal value allows odd functions with a singularity at $0$ to retain the property that integrals over symmetric intervals vanish, so all the $a_n$ coefficients vanish as a result. The sine series, on the other hand, simply assumes from the outset that the $a_n$ vanish, with the trade-off of being a series for the odd periodic extension of $f$.
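A quick check for $f(x) = \frac{1}{x}$ on $(-L, L)$ shows why this works: the cosine coefficients are killed by oddness,
$$\mathrm{p.v.}\int_{-L}^{L}\frac{\cos(n\pi x/L)}{x}\,dx = 0,$$
while the sine coefficients need no principal value at all, since $\frac{\sin(n\pi x/L)}{x} \to \frac{n\pi}{L}$ as $x \to 0$, so the integrand is bounded and $\int_{-L}^{L}\frac{\sin(n\pi x/L)}{x}\,dx$ converges as an ordinary integral.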
(Although a sine series is a Fourier series, I maintain the distinction in this case because it does not arise from the standard coefficient-based definition of the Fourier series without invoking the CPV. You are welcome to disagree, as this is purely a semantic distinction.)
As for square-integrability, that is a sufficient but not a necessary condition for the convergence of a Fourier series, unless you limit "convergence" to mean "convergence in the $L^2$ norm". Different modes of convergence have different conditions, and square-integrability is not necessary for all of them; for example, one can construct a non-square-integrable function whose Fourier series converges pointwise almost everywhere.
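As I recall, a standard example of this is
$$f(x) = |x|^{-1/2} \quad \text{on } (-\pi, \pi),$$
which lies in $L^p$ for $1 \le p < 2$ but not in $L^2$; Hunt's extension of Carleson's theorem (a.e. convergence for $f \in L^p$, $p > 1$) then gives pointwise convergence almost everywhere.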