WARNING: This is long and written at a layman's level; if you consider yourself a seasoned mathematician, you may find it a slog to read through.
At one point I came across Liouville's theorem of differential algebra, but I don't understand the theorem's proof, its basis, or really the theorem itself. I've contacted dozens of people who specialize in abstract algebra and who've written about it, and to my surprise, they don't understand it either! They literally don't know, what a shame. So, I defer this challenge to the Stack Exchange network.
So, I'm trying to break it down in terms of what I understand, which is not an advanced background in abstract algebra, but some experience in real and complex analysis.
$...$
I understand what a field is: you have elements that are closed under $+,\ -,\ \cdot,\ /$. You can add, subtract, multiply, and divide those elements, and the result is still an element of the field. The rational numbers are an example: if you perform basic operations on rational numbers, the result is another rational number.
Okay, so far so good.
Now we talk about rational functions. Rational functions apparently form a field, because you can add them, subtract them, multiply them, and divide them, and the result is still a rational function! Pretty nifty, eh? Well, not only that, but when you differentiate and compose rational functions, the result is still a rational function!
These are pretty nice properties, stronger than the bare concept of a field since they include composition and differentiation, but the antiderivative of a rational function is not always, and usually isn't, another rational function.
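For instance, the quotient rule makes the closure under differentiation explicit, while the simplest possible counterexample shows that antidifferentiation escapes the field: $$\frac{d}{dx}\,\frac{P(x)}{Q(x)} = \frac{P'(x)\,Q(x) - P(x)\,Q'(x)}{Q(x)^2}, \qquad \text{but} \qquad \int \frac{1}{x}\,dx = \ln|x| + C,$$ where the left result is again a ratio of polynomials but $\ln|x|$ is not.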
$...$
1.) Now, elementary functions include "algebraic" functions, so I don't understand how Liouville's theorem makes the leap from rational functions to general algebraic functions.
There doesn't seem to be a differential field extension that jumps from rational to algebraic. I don't exactly understand what a field extension is, but if there's one for logarithms, there ought to be one for algebraic functions. Although, maybe as part of some weirdly left-out explanation that probably should have been included, one could write some type of lemma about how any algebraic function $y(x)$ is a solution of a polynomial equation $P(x, y) = 0$, and then implicit differentiation gives $\frac{dy}{dx} = -\frac{\partial P/\partial x}{\partial P/\partial y},$ a rational function of $x$ and $y$, and maybe somehow that explains something.
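As a sanity check on that idea, take $y = \sqrt{x}$, which satisfies $y^2 - x = 0$: $$2y\,y' - 1 = 0 \quad\Longrightarrow\quad y' = \frac{1}{2y} = \frac{1}{2\sqrt{x}},$$ so the derivative of $\sqrt{x}$ is again rational in $x$ and $\sqrt{x}$, i.e. differentiation doesn't take you outside $C(x, \sqrt{x})$.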
2.) So, that's okay; I move on from not understanding what's going on to different articles that try to explain this concept of a "field extension." It's almost a straightforward concept just from its name, but the technical nuances require some deconstructing.
If a field exists over a particular set of elements, say the set $\mathbb{E}$, then a field extension is some kind of modification where you add new elements to form a larger field $\mathbb{F}$, so that $\mathbb{E} \subset \mathbb{F}$ and $\mathbb{F}$ still satisfies the same fundamental field properties. What I think this is saying is: if you add new elements, then any combination of the original elements and the new ones is still closed under $+,\ -,\ \cdot,\ /$. So if, for instance, you extend to logarithms, then if you add, subtract, multiply, and divide a combination of rational functions and logarithms, the result is still a combination of rational functions and logarithms.
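A non-differential example that makes this concrete for me is $\mathbb{Q}(\sqrt{2})$, the rationals with $\sqrt{2}$ adjoined: $$\mathbb{Q}(\sqrt{2}) = \{\, a + b\sqrt{2} \;:\; a, b \in \mathbb{Q} \,\}, \qquad \frac{1}{a + b\sqrt{2}} = \frac{a - b\sqrt{2}}{a^2 - 2b^2},$$ so even division keeps you inside the set, which is what makes it a field and not just a ring.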
3.) The way this is notated is: given your ground field, or starting field, of rational functions $C(x)$, which I suppose represents functions rational in the variable $x$, you extend to $C(x, \ln(x))$. I still have a little bit of confusion here, because if $C(x)$ is the field of rational functions, is this saying the new field extension is all functions rational in logarithms, i.e. $P(\ln(x))/Q(\ln(x))$? I would think, similar to notation I might see with multivariate polynomials, that $C(x, \ln(x))$ is a field of functions that are rational in the two variables $x$ and $\ln(x)$. An example would be $\frac{\ln(x)^2 + x - 1}{x^3 - \ln(x)}$.
Okay, I can see that: you can add, subtract, multiply, and divide any two functions $a(x, \ln(x))$ and $b(x, \ln(x))$ that are rational in combinations of $x$ and $\ln(x)$, and the result will be another function $c(x, \ln(x)) \in C(x, \ln(x))$, i.e. another function that is rational in the variables $x$ and $\ln(x)$.
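A quick arithmetic check of that closure: $$\frac{1}{x} + \frac{1}{\ln(x)} = \frac{\ln(x) + x}{x\,\ln(x)},$$ which is again a ratio of polynomials in $x$ and $\ln(x)$.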
Then, usually without specifying all of the field extensions being made, articles on the subject go on to state Liouville's theorem, leaving me with many more questions than I originally started with.
$...$
4.) In looking at the structure the theorem revolves around, that for some element $a$ in the field extension $\mathbb{F}$, $a = v' + \sum_{i=1}^{n} c_i\,\frac{u_i'}{u_i}$ for constants $c_i$, I don't know what each individual $\frac{u_i'}{u_i}$ represents, or even $v$; it just seems like a random, haphazard statement at this point. I'm not saying it is, and I already know it isn't, I'm just saying that from a fresh outside perspective, that's how it comes across to me. Are the $u_i$ and $v$ just any random elements in the field?
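Here is the kind of identification I think the theorem is asking for, if I'm reading it right. Take $a = 2x + \frac{2x}{x^2 + 1}$, and choose $v = x^2$, $u_1 = x^2 + 1$, $c_1 = 1$: $$a = v' + c_1\,\frac{u_1'}{u_1} \quad\Longrightarrow\quad \int a\,dx = x^2 + \ln(x^2 + 1) + C,$$ so $v$ carries the part of the antiderivative that stays inside the field, and each $\frac{u_i'}{u_i}$ accounts for one logarithm that has to be adjoined.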
This hints at what seems like a limitation of the field-extension idea: you can only check whether a result can be written in terms of whatever you've extended to, so if you forget to extend your field to include exponentials, you can't check whether the solution to a differential equation involves exponentials.
5.) But there's hope. Many articles on the subject don't break down the extra details of this structure, not even the Wikipedia article for some reason, so I'll try to muster up what I can myself.
$e^x$ satisfies the differential equation $y' = y$. Then $\ln(x)$ satisfies $y' = 1/x$, or equivalently $y' = e^{-y}$.
If $y = \ln(f(x))$, then $y$ satisfies the differential equation $y' = \frac{f'(x)}{f(x)}$; if $y = e^{f(x)}$, then $y$ satisfies $y' = f'(x)\,y$.
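Both relations follow from the chain rule, and the point (I think) is that the right-hand sides never mention $\ln$ or $e^x$ themselves: $$y = e^{f(x)} \;\Rightarrow\; y' = e^{f(x)}\,f'(x) = f'(x)\,y, \qquad y = \ln(f(x)) \;\Rightarrow\; y' = \frac{f'(x)}{f(x)},$$ so each relation is expressible using only $y$, $f$, and $f'$, which seems to be what lets the algebra forget that the functions involved are transcendental.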
Well, I don't really see much on the differential relations of general algebraic functions, but maybe somehow, in all the composing and inverting that goes on, different logs and exponentials cancel out to yield various algebraic functions.
So, I look back at the statement of Liouville's theorem and start noticing a pattern: structures resembling the differential relations for $e^x$ and $\ln(x)$ come up, though not any general differential relation for algebraic functions.
I think maybe there's another haphazard connection: behind the scenes, when you're actually working with this theorem, there might be a bit of partial fraction decomposition going on that can separate a function into the structure in the theorem, splitting a given expression into a $v'$ part and various $c_i\,u_i'/u_i$ components.
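That guess at least checks out for simple rational functions: $$\frac{1}{x^2 - 1} = \frac{1/2}{x - 1} - \frac{1/2}{x + 1} = \frac{1}{2}\,\frac{(x-1)'}{x-1} - \frac{1}{2}\,\frac{(x+1)'}{x+1},$$ which is exactly the $\sum c_i\,\frac{u_i'}{u_i}$ shape (with $v = 0$), and integrating gives $\frac{1}{2}\ln\left|\frac{x-1}{x+1}\right| + C$.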
6.) Another point is that, instead of saying "derivative," as I've often heard my professors say for however many years, articles sometimes specify this "derivation" operation.
What I understand this to be now is, straightforwardly, an abstraction of the differential operator that distills it down to a couple of its fundamental properties in order to study it: linearity, $\partial(c_1 u + c_2 v) = c_1 \partial u + c_2 \partial v$ for constants $c_1, c_2 \in \mathbb{R}$, and the product rule, $\partial(u \cdot v) = v\,\partial u + u\,\partial v$. Why does this not also include the chain rule? I don't know; I don't make a fuss about it.
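As a sanity check that the ordinary derivative really does satisfy these two axioms, here's a small SymPy snippet (just a sketch; `u` and `v` are arbitrary symbolic functions I made up for the test):

```python
import sympy as sp

x, c1, c2 = sp.symbols('x c1 c2')
u = sp.Function('u')(x)   # arbitrary differentiable function of x
v = sp.Function('v')(x)
D = lambda f: sp.diff(f, x)

# Linearity: D(c1*u + c2*v) - (c1*D(u) + c2*D(v)) should vanish
print(sp.simplify(D(c1*u + c2*v) - (c1*D(u) + c2*D(v))))  # prints 0

# Leibniz rule: D(u*v) - (v*D(u) + u*D(v)) should vanish
print(sp.simplify(D(u*v) - (v*D(u) + u*D(v))))            # prints 0
```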
I move on to making the connection that, somewhere along the way, it's important that elementary functions are closed under differentiation, and that a differential field is just like a field, except with the added condition that its elements are also closed under $\partial$, for a grand total of closure under $+,\ -,\ \cdot,\ /,\ \partial$.
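For the field $C(x, \ln(x))$ from before, this closure under $\partial$ is easy to see in an example: $$\frac{d}{dx}\,\frac{\ln(x)}{x} = \frac{\frac{1}{x}\cdot x - \ln(x)}{x^2} = \frac{1 - \ln(x)}{x^2},$$ still rational in $x$ and $\ln(x)$; the key ingredient is that $(\ln(x))' = \frac{1}{x}$ already lives in the field.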
For whatever reason, however, even though $+$ has an inverse operation in a field and so does $\cdot$, $\partial$ does not always have an inverse operation that maps back into the field, presumably because a field wasn't originally defined with differentiation in mind, so there are bound to be such instances. I'm sure you could define a new field-like mathematical object whose elements are closed under $+,\ -,\ \cdot,\ /,\ \partial, \int$.
I don't see a proof that elementary functions are closed under differentiation. I don't think it would be that hard to prove, but a lemma would require defining what an elementary function even is in abstract terms, and it's possible that some technical definition of an elementary function is precisely that it is a solution to the differential structure stated in Liouville's theorem, but I don't know for sure.
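For what it's worth, the definition I keep running into in papers on the Risch algorithm goes roughly like this: $g$ is elementary over a differential field $F$ if it sits at the top of a tower of extensions $$F = F_0 \subset F_1 \subset \cdots \subset F_n, \qquad F_{i+1} = F_i(t_i),$$ where each adjoined $t_i$ is either algebraic over $F_i$, or logarithmic over $F_i$ (meaning $t_i' = \frac{s'}{s}$ for some $s \in F_i$), or exponential over $F_i$ (meaning $t_i' = s'\,t_i$ for some $s \in F_i$). Notice these are exactly the differential relations from point 5, stated without ever mentioning $\ln$ or $e^x$ as actual functions.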
$...$
Can anyone explain how to understand this theorem and provide a couple of examples of actually working with it: starting with a function, creating field extensions, and then concluding that its integral is or isn't elementary?
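In case it helps anyone answering: I did find that SymPy implements part of the Risch algorithm (the purely transcendental case), and it can actually certify non-elementarity, so there's at least a way to experiment. A minimal sketch, assuming a recent SymPy where `risch_integrate` lives in `sympy.integrals.risch`:

```python
import sympy as sp
from sympy.integrals.risch import risch_integrate, NonElementaryIntegral

x = sp.symbols('x')

# Elementary case: 1/x lives in C(x); the antiderivative needs the
# logarithmic extension C(x, log(x)), and the algorithm constructs it.
print(risch_integrate(1/x, x))                 # log(x)

# Non-elementary case: the result comes back wrapped as a
# NonElementaryIntegral, which per the docs is a proof of
# non-elementarity, not merely a failure to find an answer.
res = risch_integrate(sp.exp(x**2), x)
print(res)                                     # Integral(exp(x**2), x)
print(isinstance(res, NonElementaryIntegral))  # True
```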
EDIT 1: Now, I don't understand why Liouville's theorem itself is as general as it is, but in looking through Stack Exchange and other papers, I think that in practice, when employing the theorem, one creates specific field extensions for specific algebraic operations, such as $C(x, \sqrt{x}, \sqrt[3]{x}, \ldots)$ and so on. But I don't know for sure; there might be a shortcut where you can implicitly extend to all algebraic solutions using combinations of polynomials.
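If it helps, the one concrete pattern I've seen (and this is just my reading, so take it with a grain of salt) is that a single radical gets adjoined via the polynomial equation it satisfies, rather than as a list of functions: $$C(x, \sqrt{x}) = C(x)(t), \qquad t^2 - x = 0, \qquad t' = \frac{1}{2t},$$ where the derivation on $t$ is forced by implicitly differentiating $t^2 = x$, so the extension is algebraic in exactly the sense of point 1.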