I recently learned that an infinite series is defined as the limit of its sequence of partial sums. This idea is expressed as:
$$\sum_{k=0}^\infty a_k \stackrel{\text{def}}{=} \lim_{n \to \infty} s_n, \qquad \text{where } s_n = \sum_{k=0}^n a_k.$$
However, I intuitively feel that there are times when an infinite summation appears more "naturally" in math. For instance, by long division one can observe that $.\overline{3} = \frac13$. Wouldn't this show (or prove) that $$\frac{3}{10}+\frac{3}{100}+\frac{3}{1000}+\cdots=\frac13$$ regardless of the limit definition of an infinite summation? Or, as another example, isn't it true that $$3+\frac{1}{10}+\frac{4}{100}+\frac{1}{1000}+\cdots=\pi\,?$$ To me, the definition of a series seems to define something that already exists in math. I suppose my question is: do you need an infinite series/sequence to define a non-terminating decimal? Are we defining something that is already defined? I'm confused by the train of logic used here.
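(For what it's worth, here is my own check of the first example, assuming the standard limit definition: the partial sums have a closed form via the finite geometric-sum formula, and they do tend to $\frac13$.)

```latex
% Partial sums of 3/10 + 3/100 + 3/1000 + ...,
% computed with the finite geometric-sum formula:
s_n = \sum_{k=1}^{n} \frac{3}{10^k}
    = \frac{3}{10}\cdot\frac{1-(1/10)^n}{1-1/10}
    = \frac{1-10^{-n}}{3}
\;\xrightarrow[n\to\infty]{}\; \frac13 .
```

So under the limit definition the equation certainly holds; my question is whether long division alone already establishes it without that definition.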
The only thing I can think of is that long division resulting in a non-terminating decimal isn't actually defined until you define sequences and series. But from what I can gather, it seems like decimal expansions can be defined without series, and that confuses me. Any clarity would be appreciated!