9

As a math major about to go into grad school, I find the algebra side of mathematics beautiful and inspiring -- I like to explore the hidden structure of things. I also find geometry/topology interesting, as they bring intuition to something we can visualize. Yet I can't feel the beauty of analysis -- only its difficulty to visualize and its complicated, ad-hoc techniques for manipulating epsilons.

I couldn't find satisfactory answers online. Most math educators seem to like algebra (or easier-to-explain things in general), and it seems hard to convey the big picture of analysis. On other sites such as Reddit and Quora, I have mostly seen evidence for why people love algebra, but little for why people love analysis. (Possible exceptions may be Riemannian geometry/complex manifolds, but I know very little about the details.)

For what it's worth, I am also physics/string theory inclined and know that much of theoretical physics that's hard to experimentally verify is driven by "mathematical appeal". I have not studied much analysis beyond measure theory. I would like to invite experts in analysis or people who have had any inspiring analysis courses to share their excitement. (You are also welcome to share why you hated analysis if you really want to. I just watched 3B1B's monster group video and felt more excited about algebra)

More specifically, can you share results from analysis that provide a deeper understanding of the underlying structures analysts work with? I am trying to get a sense of the beauty of analysis, and struggling to do so because it all seems very ad hoc.

Sorry if this question seems too opinion-based, but I believe the answer would be illuminating to many rising math students. Technically, this post belongs under "Constructive subjective questions", so it should be reopened. If you also think so, you can vote to reopen below.

XYSquared
    All the development of convex functions, the Gamma function (with the Bohr-Mollerup theorem included), the Stone-Weierstrass theorem about algebras of functions that separate points, Fourier analysis, (partial) differential equations, etc., etc., etc. are very beautiful, impressive and useful results. Your mentioning of "epsilons" seems to imply you're focusing on elementary calculus. – DonAntonio Sep 05 '20 at 21:21
  • 3
    Since you asked for experts, I cannot answer your question though I find it amusing that you find analysis difficult to visualize but not algebra. To me, algebra is much harder to visualize; while I can count the Sylow subgroups, and look at center/normalizer stuff, I cannot visualize what that looks like, whereas for many pathological functions in R, you can actually draw out the graph like the Weierstrass function, or the Devil's staircase. Also, would you also say probability is hard to visualize? (since probability and analysis definitely share similar concepts) – E-A Sep 05 '20 at 21:24
  • @DonAntonio I indeed only studied the required elementary-level analysis in US undergrad schools (I am not an analysis person, at least for now, as you can tell). The concepts you mentioned are all very interesting! Do you mind sharing some stories on how you explored them/found them beautiful? – XYSquared Sep 05 '20 at 21:42
  • @E-A haha feel free to share any excitement you have about analysis! I did say geometry/topology is a bit easier to visualize. As for algebra, it is more aesthetically appealing to me instead of easier to visualize. I can't visualize center or inverse etc. but the theory just makes things unified and organized, while examples like the Weierstrass function are more ad-hoc and disordered. – XYSquared Sep 05 '20 at 21:48
  • I admit the concept of "mathematical beauty" is rather controversial, yet I believe it comes from our relentless human desire to ask more and more questions and seek satisfactory answers. "Why was the Weierstrass function discovered so late?" "Could there be something more bizarre than nowhere differentiable but everywhere continuous?" and, more generally, "why does a certain analysis technique just work for some particular proof?" I feel like analysis leaves many of these types of problems unanswered. – XYSquared Sep 05 '20 at 21:51
  • New to this site. Is there a way to make the question less opinion based? Like maybe ask for a specific story on how you found analysis beautiful/interesting? – XYSquared Sep 05 '20 at 22:08
  • 2
    @XYSquared Unfortunately, this isn't really the kind of question which is well-matched for this site. Questions here are meant to be the kind which have authoritative and/or objectively correct answers. Questions which are meant to provoke discussion are explicitly off-topic. Please see the help center for details. – Xander Henderson Sep 05 '20 at 22:21
  • @XanderHenderson I added an objective question at the end of the post (OP should probably confirm that is what they intend though); with that addendum, do you think it could be reopened? I personally enjoyed reading peoples' answers to this question, and I think one can still appreciate the answers (since they are not just "I find functions exciting") – E-A Sep 05 '20 at 22:25
  • 1
    @E-A Again, please refer to the help center (linked above). Quoting that page:

    "To prevent your question from being flagged and possibly removed, avoid asking subjective questions where … every answer is equally valid: 'What’s your favorite ___?'"

    – Xander Henderson Sep 05 '20 at 22:29
  • 1
    @XanderHenderson from your help center link, I would argue that this post fits the description of "Constructive subjective questions" – XYSquared Sep 05 '20 at 22:43
  • @XanderHenderson I made a Meta.SE post primarily based on this question: https://math.meta.stackexchange.com/questions/32497/what-determines-whether-a-question-is-a-soft-question-or-should-be-closed; it is quite long, but if you can contribute your comments here as an answer there when you get the chance, I (and potentially others in the community) would appreciate it. – E-A Sep 05 '20 at 23:22
  • @XYSquared Ultimately, Math SE is moderated by the community. The line between what is appropriate for this site and what is not appropriate for this site is fuzzy and inexact. In this case, five users (including myself) felt that this question fell on the wrong side of the line. On the other hand, this question already has two reopen votes, so perhaps some other part of the community disagrees. My comments were not meant to "set down the law" regarding site policy, but rather to send a signal to you as to why your question was closed (or, at least, why I voted to close it). – Xander Henderson Sep 05 '20 at 23:38
  • Comparing analysis and algebra is rather subjective. Analysis looks far more natural and even the classical algebra and linear algebra is manageable for me. The modern algebra stuff is rather Greek for me. Results in both analysis and algebra are powerful, deeply interesting and beautiful but the road to algebra seems a lot more bumpy. – Paramanand Singh Sep 06 '20 at 14:32

3 Answers

6

Of course, it is a matter of taste. However I think it is helpful to understand the history of analysis. I myself really enjoyed the book Mathematics: The Loss of Certainty, by Morris Kline.

In the good old days (pre-19th century), people did calculus willy-nilly. But it became clear that rigor was required, because contradictory results kept coming up (for example, rearranging the terms of $\sum_{n=1}^\infty \frac{(-1)^{n+1}}{n}$ produces different values). So analysis was made rigorous with $\epsilon$-$\delta$ proofs, and later with a rigorous notion of integration. These kinds of proofs are now a rite of passage for anyone who wants to do analysis, before they do something that is applicable (like differential equations, probability theory, and harmonic analysis).
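A quick numeric sketch (my own Python illustration, not part of the original answer) of the kind of contradiction that forced the rigor: the alternating harmonic series converges to $\ln 2$, but rearranging its terms changes the sum. Taking one positive term followed by two negative terms yields $\frac{1}{2}\ln 2$ instead:

```python
import math

def alt_harmonic(n):
    # partial sum of 1 - 1/2 + 1/3 - 1/4 + ... (n terms)
    return sum((-1) ** (k + 1) / k for k in range(1, n + 1))

def rearranged(n_blocks):
    # same terms, rearranged: 1 - 1/2 - 1/4 + 1/3 - 1/6 - 1/8 + ...
    # (one positive term, then two negative terms per block)
    s = 0.0
    for k in range(1, n_blocks + 1):
        s += 1 / (2 * k - 1) - 1 / (4 * k - 2) - 1 / (4 * k)
    return s

print(alt_harmonic(100000), math.log(2))       # close to ln 2
print(rearranged(100000), math.log(2) / 2)     # close to (ln 2)/2
```

The same infinite collection of terms, summed in a different order, heads for a different limit; Riemann's rearrangement theorem says any target value is achievable this way.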

Now some people fall in love with these $\epsilon$-$\delta$ proofs, and they can go on to study abstract Banach space theory. But other people, like Norbert Wiener, used analysis to develop more interesting and applicable things, such as mathematical Brownian motion (that is, the Wiener process). Indeed, I remember a quote in which Norbert Wiener compared himself to Stefan Banach, but I am unable to locate it.

So there is a side to analysis that does have more structure, and in this sense, it does have more of the flavor of algebra.

Stephen Montgomery-Smith
4

If your definition of "beautiful" is restricted to "structured", I'm afraid you may never find much beauty in analysis. In another post on either MSE or MO, someone cleverly summarized algebra as the art of exploring structure in complicated objects, while analysis is the art of coming up with a very limited set of rules and definitions and seeing what sort of properties emerge.

As a beginning graduate student in analysis, I think these emergent properties can be quite beautiful, even if they are not inherently "structural" facts. I'd like to provide two examples, hopefully accessible at the undergraduate/beginning graduate level:

Partial differential equations: consider the family of harmonic functions: functions $u:U\subset\mathbb{R}^n\to\mathbb{R}$ with the property that $\Delta u := \sum_{i=1}^n u_{x_ix_i}= 0$. It turns out that such functions satisfy the mean value property: if $B(x,r)\subset U$ is the ball of radius $r$ centered at $x$, then $u(x)$ must equal both the volume average and the surface average of $u$ over any such ball: $$u(x) = \frac{1}{|B(x,r)|}\int_{B(x,r)} u(y)\,dy = \frac{1}{|\partial B(x,r)|}\int_{\partial B(x,r)} u(y)\,dS(y).$$ This mean value property seems, superficially, quite disconnected from the fact that the second-order partials of $u$ sum to zero, and yet harmonic functions satisfy it, among many other properties. What is perhaps even more fascinating is that the converse holds: any locally $L^1$ function satisfying the mean value property is automatically smooth and harmonic!
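The mean value property is easy to check numerically. Here is a small Python sketch (my own illustration, not from the answer) using the harmonic function $u(x,y)=x^2-y^2$ in the plane: its average over any circle equals its value at the center.

```python
import math

def u(x, y):
    # u(x, y) = x^2 - y^2 is harmonic: u_xx + u_yy = 2 - 2 = 0
    return x * x - y * y

def circle_average(f, cx, cy, r, n=1000):
    # average of f over n equally spaced points on the circle
    # of radius r centered at (cx, cy)
    total = 0.0
    for k in range(n):
        theta = 2 * math.pi * k / n
        total += f(cx + r * math.cos(theta), cy + r * math.sin(theta))
    return total / n

cx, cy = 1.3, -0.7
for r in (0.5, 1.0, 2.0):
    # the average matches u(center) regardless of the radius
    print(r, circle_average(u, cx, cy, r), u(cx, cy))
```

For a non-harmonic function like $x^2+y^2$, the same average picks up an $r$-dependent term, so the property really does single out harmonic functions.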

Functional analysis: Perhaps a more "structural" example. The Riesz representation theorem is never taught with as much excitement as it should be. Let $H$ be a Hilbert space, endowed with an inner product $\langle \cdot,\cdot\rangle$. Let $f$ be any bounded linear functional on $H$; that is, a continuous linear map from $H$ to $\mathbb{R}$.

If $H$ is just $\mathbb{R}^n$ endowed with the Euclidean norm, then the inner product is just the dot product. What's the only linear way to map a vector $v$ to a real number? At least to me, it is clear that the only way is to take linear combinations of its components, so that any linear functional $f(v)$ takes the form $$f(v) = \sum_{i=1}^n c_iv_i.$$ But this is just the dot product, and so we can say that any bounded linear functional on $\mathbb{R}^n$ takes the form $f(v) = c\cdot v$ for some $c\in H$. This gives us a nice way to, erm, represent the family of all bounded linear functionals on $\mathbb{R}^n$.

But wait, the Riesz Representation Theorem says that this is true for any Hilbert space! That is, any bounded linear functional $f$ on $H$ can be written in the form $f(v) = \langle \phi, v\rangle$ for some $\phi\in H$. The fact that this rather obvious property of $\mathbb{R}^n$ translates to an arbitrary Hilbert space is, at least to myself, quite mysterious and remarkable!
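In the $\mathbb{R}^n$ case the representing vector can be recovered explicitly by evaluating $f$ on the standard basis. A small Python sketch (my own toy example, with a made-up functional):

```python
import random

def representer(f, n):
    # Riesz representer of a linear functional f on R^n with the dot product:
    # phi_i = f(e_i), since f(v) = sum_i v_i * f(e_i) = <phi, v>
    return [f([1.0 if j == i else 0.0 for j in range(n)]) for i in range(n)]

# a "black box" bounded linear functional on R^3 (hypothetical example)
f = lambda v: 2.0 * v[0] - v[1] + 0.5 * v[2]

phi = representer(f, 3)          # recovers [2.0, -1.0, 0.5]
v = [random.uniform(-1, 1) for _ in range(3)]
dot = sum(p, q := 0) if False else sum(p * x for p, x in zip(phi, v))
print(phi, abs(f(v) - dot) < 1e-12)
```

The theorem's content is that this identification survives the jump to infinite dimensions, where "evaluate on a basis" is no longer available and completeness of $H$ does the real work.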

More generally, some analysts (at least myself) find beauty in the connection between abstract, theoretical results in analysis and their ability to represent the real world; people much more intelligent and articulate than myself have written about this at length, notably Eugene Wigner.

Phil
1

When we start learning analysis, it looks like a mess. Epsilons here, deltas there, monstrous counterexamples, so many different concepts of derivatives, rules for differentiation and integration, one will quickly feel lost. But that's only at the beginning. Once you dive deeper, analysis reveals a lot of structure.

Many of the seemingly distinct concepts can be consolidated

I will give a relatively simple example where the confusing multitude of concepts can be elegantly consolidated into a single one: the total derivative. Let $V,W$ be Banach spaces. A function $f:V\to W$ is called differentiable at $x_0\in V$ if there exist a continuous linear map $L:V\to W$, called the total differential of $f$ at $x_0$, and a remainder map $R:V\to W$ such that

  1. $f(x)=f(x_0)+L(x-x_0)+R(x)$
  2. $\lim\limits_{x\to x_0}\frac{\Vert R(x)\Vert}{\Vert x-x_0\Vert}=0$

Continuity of $L$ guarantees that the expression in 1. is continuous. The first condition says that $f$ can be approximated by a linear function $L$, while the second condition specifies how good this approximation is: the difference between $f$ and its linear approximation $f(x_0)+L(x-x_0)$ goes to $0$ faster than $\Vert x-x_0\Vert$. If these conditions are fulfilled, then $f$ is totally differentiable at $x_0$, and almost all other concepts of derivatives reduce to this definition:

  1. In single variable analysis, the linear map $L$ is just $L(v)=f'(x_0)\cdot v$.
  2. For scalar fields, $L(v)=\nabla f(x_0)\cdot v$, where $\nabla f(x_0)$ is the gradient of $f$ at $x_0$.
  3. For vector fields, the matrix representation of $L$ is the Jacobian of $f$.
  4. The directional derivative at $x_0$ in the direction $v$ is the total differential of $g(t):=f(x_0+vt)$ at $0$.
  5. The partial derivatives are just the directional derivatives in the direction of the coordinate axes.
  6. The functional derivative of a functional $\mathcal F:F\to\mathbb R$ (where $F$ is a complete function space) is the total differential of $\mathcal F$.

Suddenly all these disparate concepts are just one, and I think that's beautiful.
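The defining limit can also be checked numerically. A small Python sketch (my own toy example, not from the answer) for $f(x,y)=(xy,\ x+y^2)$ at $(x_0,y_0)=(1,2)$, whose Jacobian there is $\begin{pmatrix}2&1\\1&4\end{pmatrix}$:

```python
import math

def f(x, y):
    # a smooth map f: R^2 -> R^2
    return (x * y, x + y * y)

def L(v1, v2):
    # total differential of f at (x0, y0) = (1, 2):
    # the Jacobian [[2, 1], [1, 4]] applied to (v1, v2)
    return (2.0 * v1 + 1.0 * v2, 1.0 * v1 + 4.0 * v2)

def remainder_ratio(h):
    # ||R(x)|| / ||x - x0|| for x = x0 + (h, h),
    # where R(x) = f(x) - f(x0) - L(x - x0)
    x0, y0 = 1.0, 2.0
    fx0 = f(x0, y0)
    fx = f(x0 + h, y0 + h)
    Lv = L(h, h)
    R = (fx[0] - fx0[0] - Lv[0], fx[1] - fx0[1] - Lv[1])
    return math.hypot(*R) / math.hypot(h, h)

for h in (1e-1, 1e-2, 1e-3, 1e-4):
    print(h, remainder_ratio(h))  # the ratio shrinks to 0 with h
```

For this quadratic example the ratio shrinks exactly linearly in $h$, which is the "faster than $\Vert x-x_0\Vert$" condition made visible.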

Analysis is the continuation of algebra

Many algebraic fields study simple geometry. By simple I certainly don't mean easy, but they often take the most ideal scenario possible to study geometry. Linear geometry, for instance, or the geometry of polynomial functions. Analysis takes the ideas of algebra and asks: how flexible can we be if we still want to apply all this idealized algebra? The definition of the total differential above is a prime example. We know a lot about linear geometry, that is, the geometry of vector spaces and their (affine) subspaces. Affine subspaces can be defined in terms of affine maps, and the map $x\mapsto f(x_0)+L(x-x_0)$ is an affine map. So the definition above essentially says that a function is differentiable if it is reasonably close to an affine map, about which we know a lot. This is what leads to some of the biggest theorems in a first analysis course, like the inverse function theorem: a linear equation $L(x)=y$ has a unique solution for every $y$ if $L$ is a bijective linear map. Well, a nonlinear equation $f(x)=y$ has a unique solution for every $y$ in a neighborhood of $f(x_0)$ if the total differential of $f$ at $x_0$ is a bijective linear map. So analysis essentially takes the concepts of algebra (especially linear algebra) one step further.
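This local-inversion idea is exactly what Newton's method exploits: repeatedly invert the linear approximation. A Python sketch (my own example, not from the answer) for $f(x)=x+x^3$, whose differential at $0$ is invertible, so $f(x)=y$ is locally solvable:

```python
def f(x):
    # f'(0) = 1 is invertible, so f is locally invertible near 0
    return x + x ** 3

def df(x):
    return 1 + 3 * x ** 2

def solve(y, x=0.0, tol=1e-12):
    # Newton's iteration: at each step, solve the *linear* equation
    # df(x) * step = f(x) - y and update x
    for _ in range(50):
        x -= (f(x) - y) / df(x)
        if abs(f(x) - y) < tol:
            return x
    raise RuntimeError("no convergence")

y = 0.3
x = solve(y)
print(x, f(x))  # f(x) is (numerically) 0.3
```

Each iteration only ever solves a linear problem; the inverse function theorem is what guarantees a genuine nonlinear solution is there to be found.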

Algebra and analysis aren't actually that distinct

I mean, they're certainly different, but they have a lot of intersections. For instance, the fundamental theorem of algebra is one of the standard results of a complex analysis course. And even the proof shown in a standard abstract algebra course relies on analysis, specifically on the fact that real polynomials of odd degree greater than $1$ are reducible because they have a root, which is quickly shown using the intermediate value theorem. On the other hand, algebra comes up in complex analysis as well: the total differential of an injective holomorphic function is a member of the conformal group $\operatorname{CO}(\mathbb R^2)$, and the fact that conformal functions are holomorphic can be shown using linear algebra, see here.
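The intermediate value theorem argument can be made concrete by bisection. A small Python sketch (my own illustration) finding a root of an odd-degree polynomial:

```python
def p(x):
    # an odd-degree real polynomial; the leading term dominates for large |x|,
    # forcing a sign change, so the IVT guarantees a real root
    return x ** 5 - 3 * x + 1

# p(-2) = -25 < 0 < 27 = p(2), so there is a root in [-2, 2]
a, b = -2.0, 2.0
for _ in range(100):
    m = (a + b) / 2
    if p(a) * p(m) <= 0:
        b = m          # root lies in [a, m]
    else:
        a = m          # root lies in [m, b]
root = (a + b) / 2
print(root, abs(p(root)) < 1e-10)
```

Bisection is the IVT run in reverse: each halving preserves the sign change, so the shrinking interval traps a root.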

Vercassivelaunos