The proofs of simple versions of the central limit theorem (for instance, for a sample drawn iid from some distribution) use characteristic functions or moment generating functions, and the necessary facts can be established with undergraduate real analysis. Such proofs can be found in most textbooks (for instance, Statistical Inference by Casella and Berger) and on Wikipedia. I'll do my best to adapt an argument below. Consider the following central limit theorem, which can be proved using facts from undergraduate real analysis:
Theorem (CLT): Let $X_1, X_2,\ldots$ be a sequence of iid random variables whose moment generating functions exist in a neighborhood of $0$. Let $E(X_i)=\mu$ and $Var(X_i) =\sigma^2$. Define $\bar{X}_n=(1/n) \sum_{i=1}^n X_i$. Then the random variables defined by $\sqrt{n}(\bar{X}_n-\mu)/\sigma$ converge in distribution to $N(0,1)$.
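Before the proof, a quick simulation (not part of the argument, and my own choice of example) illustrates the statement: standardized means of iid Exponential(1) draws, for which $\mu = \sigma = 1$, behave approximately like $N(0,1)$.

```python
import numpy as np

# Illustration of the theorem: standardized means of iid Exponential(1)
# draws (so mu = sigma = 1) should be approximately N(0, 1).
rng = np.random.default_rng(0)
n, reps = 500, 20_000
x = rng.exponential(1.0, size=(reps, n))
y = np.sqrt(n) * (x.mean(axis=1) - 1.0) / 1.0  # sqrt(n) * (X_bar - mu) / sigma

# The empirical mean and standard deviation should be near 0 and 1.
print(float(y.mean()), float(y.std()))
```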
The proof relies on properties of the moment generating function, $M_X(t)=E(e^{tX})$. Three of its properties are important:
First:
$$\left.\frac{d^n}{dt^n}M_X(t)\right|_{t=0}=E(X^n),$$
which can be seen by differentiating the Taylor expansion of $M_X$ term by term and evaluating at $t=0$:
$$M_X(t)=1+tE(X)+\frac{t^2 E(X^2)}{2!}+\cdots.$$
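This can be checked numerically for a known MGF (a small sketch of my own, using the example of $X\sim\text{Exponential}(1)$, for which $M_X(t)=1/(1-t)$ and $E(X^n)=n!$):

```python
# Numerical check that derivatives of the MGF at 0 recover the moments,
# for X ~ Exponential(1): M_X(t) = 1 / (1 - t), E[X] = 1, E[X^2] = 2.
def mgf(t):
    return 1.0 / (1.0 - t)

h = 1e-3
# Central finite differences approximate the first two derivatives at t = 0.
d1 = (mgf(h) - mgf(-h)) / (2 * h)            # should be close to E[X] = 1
d2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2  # should be close to E[X^2] = 2
print(d1, d2)
```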
Second:
If $M_{X_n}(t) \to M_X(t)$ for all $t$ in a neighborhood of $0$, then $X_n$ converges in distribution to $X$. Proving this is the most technical part of the argument for the above theorem; it can be done with facts from undergraduate real analysis, but the details are involved.
Third: If two random variables $X_1$ and $X_2$ are independent, then the moment generating function of the sum $X_1+X_2$ is the product of the moment generating functions of $X_1$ and $X_2$. This follows directly from independence and the definition of the MGF: $E(e^{t(X_1+X_2)})=E(e^{tX_1}e^{tX_2})=E(e^{tX_1})E(e^{tX_2})$.
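A Monte Carlo check of this product property (the two distributions below are arbitrary choices of mine, just for illustration):

```python
import numpy as np

# Monte Carlo check that M_{X1+X2}(t) = M_{X1}(t) * M_{X2}(t) for
# independent X1 ~ N(0, 1) and X2 ~ Uniform(0, 1) (arbitrary choices).
rng = np.random.default_rng(0)
n = 200_000
x1 = rng.standard_normal(n)
x2 = rng.uniform(0.0, 1.0, n)
t = 0.5

lhs = np.mean(np.exp(t * (x1 + x2)))                     # MGF of the sum
rhs = np.mean(np.exp(t * x1)) * np.mean(np.exp(t * x2))  # product of MGFs
print(lhs, rhs)  # the two estimates should nearly agree
```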
Once we have these facts, let $Y_i=(X_i-\mu)/\sigma$, so that $\sqrt{n}(\bar{X}_n-\mu)/\sigma=\sum_{i=1}^n Y_i/\sqrt{n}$. We just need to show that the moment generating functions of these standardized sums converge to the moment generating function of a $N(0,1)$ random variable. Using the scaling identity $M_{aX}(t)=M_X(at)$ and the third property,
$$\begin{align*} &M_{\sum_{i=1}^n Y_i/\sqrt{n}} (t)\\
&=M_{\sum Y_i} (t/\sqrt{n})\\
&=M_{Y_1}(t/\sqrt{n})^n\end{align*}$$
Since $E(Y_1)=0$ and $E(Y_1^2)=1$, the Taylor expansion of $M_{Y_1}$ gives $M_{Y_1}(s)=1+\frac{s^2}{2}+o(s^2)$ as $s\to 0$. Substituting $s=t/\sqrt{n}$ and letting $n\to \infty$ gives us
$$\lim_{n\to \infty} M_{Y_1}(t/\sqrt{n})^n=\lim_{n \to \infty} \left[1+\frac{t^2}{2n}+o\!\left(\frac{t^2}{n}\right)\right]^n=e^{t^2/2},$$
since $(1+a_n/n)^n\to e^{a}$ whenever $a_n\to a$. The right-hand side is the moment generating function of a $N(0,1)$ random variable, so the second property completes the proof.
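The limit can also be seen numerically (my own sketch, again using Exponential(1), where $\mu=\sigma=1$): there $Y_1 = X_1 - 1$ has MGF $M_{Y_1}(s)=e^{-s}/(1-s)$ for $s<1$, and $M_{Y_1}(t/\sqrt{n})^n$ drifts toward $e^{t^2/2}$ as $n$ grows.

```python
import numpy as np

# Numerical check of the limit for Exponential(1), where mu = sigma = 1:
# Y_1 = X_1 - 1 has MGF M(s) = e^{-s} / (1 - s) for s < 1.
def mgf_y(s):
    return np.exp(-s) / (1.0 - s)

t = 1.5
for n in (10, 1_000, 100_000):
    print(n, mgf_y(t / np.sqrt(n)) ** n)  # approaches e^{t^2/2} as n grows
print("e^{t^2/2} =", np.exp(t ** 2 / 2))
```

The convergence is slow here because the third-moment error term decays only like $1/\sqrt{n}$.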
With a small amount of complex analysis, we can drop the requirement that moment generating functions exist by using characteristic functions instead, which exist for every random variable and so avoid this issue. There are many other central limit theorems that apply in more general settings; they require more sophisticated techniques, probably a bit beyond undergraduate analysis.