130

Quite often, mathematics students are surprised by the fact that for a mathematician, the term “logarithm” and the expression $\log$ nearly always mean natural logarithm instead of the common logarithm. Because of that, I have been gathering examples of problems whose statements have nothing to do with logarithms (or the exponential function), but whose solutions do involve natural logarithms. The goal is, of course, to make the students see how natural the natural logarithms really are. Here are some of these problems:

  1. The sum of the series $1-\frac12+\frac13-\frac14+\cdots$ is $\log2$.
  2. If $x\in(0,+\infty)$, then $\lim_{n\to\infty}n\bigl(\sqrt[n]x-1\bigr)=\log x$.
  3. What's the average distance from a point of a square with side length $1$ to the center of the square? The question is ambiguous: is the square its boundary (a curve) or the two-dimensional region that it encloses? In the first case, the answer is $\frac14\bigl(\sqrt2+\log\bigl(1+\sqrt2\bigr)\bigr)$; in the second case, the answer is smaller (of course): $\frac16\bigl(\sqrt2+\log\bigl(1+\sqrt2\bigr)\bigr)$.
  4. The length of an arc of a parabola can be expressed using logarithms.
  5. The area below an arc of the hyperbola $y=\frac1x$ (and above the $x$-axis) can be expressed using natural logarithms.
  6. Suppose that there is an urn with $n$ different coupons, from which coupons are collected uniformly at random, with replacement. How many draws do you expect to need before having drawn each coupon at least once? The answer is about $n\log(n)+\gamma n+\frac12$, where $\gamma$ is the Euler–Mascheroni constant. (A quick simulation sketch follows this list.)
  7. For each $n\in\mathbb N$, let $P_p(n)$ be the number of primitive Pythagorean triples whose perimeter is smaller than $n$. Then $\displaystyle P_p(n)\sim\frac{n\log2}{\pi^2}$. (By the way, this is also an unexpected use of $\pi$.)
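For what it's worth, item 6 is easy to test empirically. Here is a minimal Python sketch (the value of $n$ and the number of trials are arbitrary choices of mine) comparing a Monte Carlo estimate with $n\log n+\gamma n+\frac12$:

```python
import math
import random

def coupons_needed(n):
    """Draw coupons uniformly with replacement until all n have appeared."""
    seen = set()
    draws = 0
    while len(seen) < n:
        seen.add(random.randrange(n))
        draws += 1
    return draws

gamma = 0.5772156649015329  # Euler-Mascheroni constant
n, trials = 50, 20000
avg = sum(coupons_needed(n) for _ in range(trials)) / trials
print(avg, n * math.log(n) + gamma * n + 0.5)  # both ~225 for n = 50
```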

Could you please suggest some more?

YuiTo Cheng
  • 4,705
    I always thought $\log_{10}$ was written $\operatorname{lg}$ ... – Hagen von Eitzen Jul 03 '17 at 13:20
  • 13
    For me, $\log = \ln$ seems to be common for people working in analysis. If you are looking, for example, at computer science, you would almost always have $\log = \operatorname{ld}$ - and I assume there are also examples of fields where $\log = \lg$ is the most common or where $\log$ is not specified at all (e.g. "logarithmic scale", "logarithmic running time"). Mathematics is a very wide area, stretching out into many other fields; and I think you can't simply state that $\log = \ln$ is most common for all these fields. – Dirk Jul 03 '17 at 13:32
  • 1
    Somehow related: https://math.stackexchange.com/questions/1646042 – Watson Jul 03 '17 at 13:56
  • 14
    The usage in the English world is really surprising. Why don't you just write $\ln$ for the natural logarithm? It's even shorter. I would use $\log$ only if I want to specify another base or if the base doesn't matter at all. – Džuris Jul 03 '17 at 16:51
  • 34
    As an aside, in computer science, $\log_2$ is the most prevalent logarithm. –  Jul 03 '17 at 17:07
  • 2
    @Džuris Since English is not my first language (or even my second one), it would be uncomfortable for me to discuss its subtleties. However, I am still able to easily read English textbooks. And I can see that Spivak (in his Calculus), Rudin (in his Principles of Mathematical Analysis, aka Baby Rudin), and Apostol (in his Calculus) also use $\log$ and not $\ln$. So, use $\log$ and $\ln$ in any way you want, but I shall use $\log$, just as those three mathematicians do. – José Carlos Santos Jul 03 '17 at 17:14
  • 19
    @HagenvonEitzen You're introducing more confusion :-) $\log_{10}$ is never written $\lg$ AFAIK; the notation $\lg$ is used for $\log_2$. – ShreevatsaR Jul 04 '17 at 18:40
  • 4
    In all the math, chemistry and physics classes that I remember $\text{log}$ meant the common logarithm (to base 10). For the natural logarithm (base e) $\text{ln}$ was used. No doubt there may be other standards... – MaxW Jul 04 '17 at 19:48
  • 1
    From a mathematician's perspective, and given the large number of good answers below, one could pose the opposite question: why would a logarithm in any other base be useful? $\log_2$ is used in complexity and information theory, mostly because halving is a useful operation, and $\log_{10}$ is used in Benford's law, and in doing multiplication by hand if you want the digits to be nice in base 10. Of course the advent of pocket calculators has rendered the latter irrelevant. But is that it? I suspect it might be. – Chappers Jul 04 '17 at 20:43
  • 2
    This is also akin to the question of why mathematicians use radians: they happen to be the natural unit for measuring angles that gives you nice calculus properties, as opposed to the arbitrary sexagesimal convention of degrees. Equally, the "natural" base for logarithms is $e$. – Chappers Jul 04 '17 at 20:43
  • 1
    And then, of course, logarithm in base $e$ becomes the primary function, with its scaled cousins $\log_a$, and so gets the distinction of being the logarithm, $\log$. – Chappers Jul 04 '17 at 20:46
  • @ShreevatsaR In the A-level maths syllabus I did many years ago, $\log_{10}$ was written $\mathrm{lg}$. – Patrick Stevens Jul 05 '17 at 14:15
  • 1
    Of course, the common logarithm is "common" only to the extent that for most purposes we have settled on the convention of expressing numbers in base ten. That's a somewhat arbitrary choice, based, it seems, more on biology than on any mathematical consideration. Perhaps your students would benefit from being challenged to reason about the circumstances under which the common logarithm might ever be expected to arise naturally. – John Bollinger Jul 05 '17 at 18:48
  • 1
    Some would just say that no. 5 pretty much covers it, i.e. $\log t$ is defined as $\int_1^t 1/x\,dx$ and the latter is a really common thing. – Thompson Jul 06 '17 at 12:37
  • @ShreevatsaR ISO 31-11 stipulates lg as the abbreviation for base 10 logarithms. Using lg as a base-agnostic logarithm (since all log functions have the same asymptotic growth to within a constant factor) is common in computer science, though. – chepner Jul 06 '17 at 14:41
  • 1
    Robbins' constant is the average distance between two points selected at random within a unit cube, and some $\ln( \cdots )$ appears in it! – Watson Jul 07 '17 at 08:48
  • http://https://en.wikipedia.org/wiki/Divergence_of_the_sum_of_the_reciprocals_of_the_primes – serg_1 Jul 19 '17 at 06:52
  • @serg_1 Your link doesn't work. I suppose that you meant this. – José Carlos Santos Jul 19 '17 at 06:56
  • A physics problem of stacking blocks https://ocw.mit.edu/courses/mathematics/18-01sc-single-variable-calculus-fall-2010/unit-5-exploring-the-infinite/part-b-taylor-series/session-96-stacking-blocks/MIT18_01SCF10_Ses96b.pdf shows the answer of how far you can stack the $n$ blocks is asymptotic to some multiple of the natural log. That shows that "naturally" when you stack blocks the shape you get resembles $c\ln n$, which I found to be interesting. – Ahmed S. Attaalla Jul 19 '17 at 17:16
  • @Anixx I didn't invent the expression. It's a standard one. – José Carlos Santos Mar 28 '21 at 16:13

26 Answers

110

What about the Prime Number Theorem? The number of primes smaller than $x$ is denoted by $\pi (x)$ and you have $$\pi (x) \sim \frac{x}{\log x}$$
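A quick numerical illustration (a rough Python sketch; the cutoffs are arbitrary), comparing $\pi(x)$ with $x/\log x$ via a sieve of Eratosthenes:

```python
import math

def prime_count(x):
    """Count the primes <= x with a sieve of Eratosthenes."""
    sieve = bytearray([1]) * (x + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(x**0.5) + 1):
        if sieve[p]:
            sieve[p*p::p] = bytearray(len(sieve[p*p::p]))
    return sum(sieve)

for x in (10**4, 10**5, 10**6):
    pi_x = prime_count(x)
    print(x, pi_x, round(x / math.log(x)), round(pi_x / (x / math.log(x)), 3))
```

The last column (the ratio) drifts toward $1$, although quite slowly.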

Crostul
  • 36,738
  • 2
    Is that meant to be a proportionality symbol? – minseong Jul 04 '17 at 06:27
  • 17
    The symbol means "in limit to infinity" here, no proportionality involved. The logarithm should actually be a natural logarithm, making this a great answer to the question. – tomsmeding Jul 04 '17 at 07:30
  • 5
    Agreed with answer and tomsmeding. This is asymptotic equivalence, not proportionality, and it is the natural logarithm that is used here. – Meni Rosenfeld Jul 04 '17 at 10:29
  • 27
    I've always found it more enlightening to write the asymptotic relation in the Prime Number Theorem in the form $$\frac{\pi(x)}{x} \sim \frac{1}{\log x},$$ because now the fraction on the left is the proportion of primes among numbers up to $x$. – murray Jul 05 '17 at 16:37
  • 6
    @murray: The one I always remember (I keep forgetting what π(x) even is, I'm not a mathematician) is that the nth prime tends to n log n. It's short and the one I'd probably care about the most. – user541686 Jul 06 '17 at 01:30
  • Technically, $\pi(x)$ refers to the number of primes smaller than or equal to $x$. – Joe Oct 27 '23 at 10:21
53

Your first point can be generalized. Write $[a_1,a_2,a_3,\dots]$ for $\sum a_n/n$. You wrote:$$[\overline{1,-1}]=\ln2.$$(The bar means repeat.) Then we also have:\begin{align}[\overline{1,1,-2}]&=\ln3,\\ [\overline{1,1,1,-3}]&=\ln4,\end{align}and in general:$$[\overline{\underbrace{1,1,\dots,1}_{n-1},1-n}]=\ln n.$$


As a side note, one can see that $\ln m+\ln n=\ln mn$ from this. For example, note that, from the definition, we have $[\overline{0,2,0,-2}]=[\overline{1,-1}]=\ln2$ (from doubling the numerators and denominators). We then have:\begin{align}\ln2+\ln2={}&[\overline{1,-1,1,-1}]+\\&[\overline{0,2,0,-2}]\\{}=&[\overline{1,1,1,-3}]=\ln4\end{align} Similarly: \begin{align}\ln2+\ln3={}&[\overline{0,0,3,0,0,-3}]+\\&[\overline{1,1,-2,1,1,-2}]\\{}=&[\overline{1,1,1,1,1,-5}]=\ln6\end{align}
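These rearranged series are easy to check numerically; a minimal Python sketch (the truncation point is arbitrary, chosen to be a multiple of each period):

```python
import math

def bracket(pattern, terms=600000):
    """Partial sum of sum a_n / n, where the sequence (a_n) repeats `pattern`."""
    return sum(pattern[(n - 1) % len(pattern)] / n for n in range(1, terms + 1))

print(bracket([1, -1]), math.log(2))        # ~0.6931
print(bracket([1, 1, -2]), math.log(3))     # ~1.0986
print(bracket([1, 1, 1, -3]), math.log(4))  # ~1.3863
```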

51

Here are some of my favorites:

  • By "reversing" Euler's formula (taking the principal branch of the logarithm), $$\ln(\cos x+i\sin x)=ix$$

  • The natural log appears in some of the integrals of trigonometric functions (a numerical spot-check follows this list): $$\int \tan (x)\, dx=\ln|\sec(x)|+C$$ $$\int \cot (x)\, dx=\ln|\sin(x)|+C$$ $$\int \sec (x)\, dx=\ln|\sec(x)+\tan(x)|+C$$

  • The appearance of the natural logarithm in the Tsiolkovsky rocket equation: $$\Delta v=v_e\ln\frac{m_0}{m_f}$$
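The antiderivatives above are easy to spot-check numerically; here is a rough Python sketch using Simpson's rule on $[0,1]$ (the interval and step count are arbitrary):

```python
import math

def simpson(f, a, b, n=1000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

# integral of tan on [0, 1] should equal ln(sec 1) - ln(sec 0) = -ln(cos 1)
print(simpson(math.tan, 0.0, 1.0), -math.log(math.cos(1.0)))

# integral of sec on [0, 1] should equal ln(sec 1 + tan 1)
sec = lambda x: 1.0 / math.cos(x)
print(simpson(sec, 0.0, 1.0), math.log(sec(1.0) + math.tan(1.0)))
```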

Joe
  • 19,636
Franklin Pezzuti Dyer
  • 39,754
  • 12
    Be careful with the logarithms, $\ln(\cos x+i\sin x)$ has a period of $2\pi$, whilst $ix$ is not periodic at all. – Simply Beautiful Art Jul 07 '17 at 13:01
  • 12
    It happens to be that $e^z$ is not a bijective function on $\mathbb C$. Indeed, $e^z=e^{z+2\pi i}$. One usually fixes this with what is called the principal value. – Simply Beautiful Art Jul 07 '17 at 13:23
  • 2
    @FranklinPezzutiDyer: It would be better if you could edit your answer incorporating the suggestion by Simply Beautiful Art. Also, it made me sad that you didn't mention the integral of $\csc x.$ – Bumblebee Dec 06 '20 at 01:41
36

The continuous solution on $(0,+\infty)$ of the functional equation $f(x\cdot y)=f(x)+f(y)$, with the condition $f'(1)=1$, is $f(x)=\ln (x)$.

Changing the value of $f'(1)$ we find the other logarithm functions.
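For completeness, here is one standard way to see this: putting $x=y=1$ in the functional equation gives $f(1)=0$, and $$f(x(1+h))-f(x)=f(1+h)=f(1+h)-f(1),$$ so the difference quotient of $f$ at $x$ (with increment $xh$) equals $\frac1x\cdot\frac{f(1+h)-f(1)}{h}$. Letting $h\to0$ yields $f'(x)=\frac{f'(1)}{x}=\frac1x$, hence $$f(x)=\int_1^x\frac{dt}{t}=\ln x.$$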

Emilio Novati
  • 62,675
34

Here's another one related to some of your examples: the $n$-th harmonic number

$$ H_n = 1 + \frac{1}{2} + \ldots + \frac{1}{n} $$

satisfies

$$ H_n \approx \ln(n) + \gamma $$

where $\gamma$ is the Euler-Mascheroni constant. The error in the above approximation is slightly less than $\frac{1}{2n}$.
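A quick numerical look at the size of that error (Python sketch; the sample values of $n$ are arbitrary):

```python
import math

gamma = 0.5772156649015329  # Euler-Mascheroni constant

for n in (10, 100, 1000, 10000):
    H = sum(1.0 / k for k in range(1, n + 1))
    err = H - (math.log(n) + gamma)
    print(n, round(H, 6), round(err, 8), round(1 / (2 * n), 8))
```

The last two columns show the error sitting just below $\frac{1}{2n}$.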

  • 7
    Another good one is that $\sum_{p\le n}p^{-1} \sim \ln \ln n$, where $p$ is prime and $n$ is an integer. – eyeballfrog Jul 03 '17 at 22:17
  • 2
    Notably, the coupon collection problem mentioned by the OP is closely related to these harmonic numbers. The approximate solution he gave is simply the first few terms in the expansion of $H_n$, multiplied by $n$. – Meni Rosenfeld Jul 04 '17 at 10:30
  • 7
    This looks closely related to "the antiderivative of $x^{-1}$ is $\ln x$." – Kevin Jul 04 '17 at 17:29
  • 1
    As a consequence, $\lim_{k\to\infty}H_{nk}-H_k=\ln n$. (EDIT: Also, $H_{2n}-H_n$ is the partial sum of $\sum(-1)^{n+1}/n$, which means that OP's first fact is implied by your fact.) – Akiva Weinberger Jul 05 '17 at 19:14
26

Using $\sigma(n)$ as the sum of the (positive) divisors of a natural number $n,$ we have $$ \sigma(n) \leq e^\gamma \, n \, \log \log n + \frac{0.64821364942... \; n}{\log \log n},$$ with the constant in the numerator giving equality for $n=12.$ Here $\gamma = \lim_{n\to\infty} \left(H_n - \log n\right).$

As suggested by Oscar, we may write this without approximations as $$ \sigma(n) \leq e^\gamma \, n \, \log \log n + \frac{ n \; ( \log \log 12) \left(\frac{7}{3} -e^\gamma \,\log \log 12 \right)}{\log \log n}.$$

There are some numbers up to $n \leq 5040 \;$ (such as $n=12$) for which $ \sigma(n) > e^\gamma \, n \, \log \log n .$ The conjecture that, for $n > 5040,$ we have $ \sigma(n) < e^\gamma \, n \, \log \log n ,$ is equivalent to the Riemann Hypothesis.

Note that the occurrence of $\log \log n$ means that we cannot replace the natural logarithm by some other without changing the sense of the statement. We would not just be multiplying by a constant if we used a different logarithm.
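A small Python sketch that recomputes the exceptional cases below $5040$ (i.e. the $3\le n\le 5040$ with $\sigma(n) \ge e^\gamma n\log\log n$); the divisor-sum sieve is just the most naive one:

```python
import math

N = 5040
sigma = [0] * (N + 1)          # sigma[n] = sum of divisors of n
for d in range(1, N + 1):
    for m in range(d, N + 1, d):
        sigma[m] += d

e_gamma = math.exp(0.5772156649015329)
bad = [n for n in range(3, N + 1)
       if sigma[n] >= e_gamma * n * math.log(math.log(n))]
print(bad)  # a short list ending in 5040; n = 12 is on it
```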

Will Jagy
  • 139,541
24

The law of the iterated logarithm states that $$ \limsup_{n \to \infty} \frac{X_1+\cdots+X_n}{\sqrt{n \log\log n}} = \sqrt 2 $$ almost surely, where $X_1,\ldots,X_n$ are iid random variables with means zero and unit variances.

Cm7F7Bb
  • 17,364
20

Something I found some months back. Would be surprised if this hasn't been looked at before. No citations.

We say a set $S$ of positive integers can express $n$ if it is possible to write $n$ as a sum of elements of $S$, with repetition allowed.

We say that a set $S$ is critical for $n$ if $S$ can express $n$ and no strict subset of $S$ can express $n$.

Let $u_n$ be the size of the largest subset of $\{1,2,\dotsc ,n\}$ that is critical for $n$. It's conjectured that $u_n$ grows like $\log_e n$.

Evidence: (plot omitted)
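For small $n$ the quantity $u_n$ can be computed by brute force; a Python sketch (exponential in $n$, so only tiny cases):

```python
import math
from itertools import combinations

def can_express(S, n):
    """Can n be written as a sum of elements of S, with repetition allowed?"""
    reachable = [False] * (n + 1)
    reachable[0] = True
    for s in S:
        for total in range(s, n + 1):
            if reachable[total - s]:
                reachable[total] = True
    return reachable[n]

def is_critical(S, n):
    # S expresses n, and removing any single element breaks that.
    # (Any strict subset expressing n would survive some single removal,
    # so checking single removals suffices.)
    if not can_express(S, n):
        return False
    return all(not can_express(S[:i] + S[i + 1:], n) for i in range(len(S)))

def u(n):
    """Size of the largest subset of {1,...,n} that is critical for n."""
    for size in range(n, 0, -1):
        if any(is_critical(list(S), n) for S in combinations(range(1, n + 1), size)):
            return size

for n in range(2, 13):
    print(n, u(n), round(math.log(n), 2))
```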

wlad
  • 8,185
18

Consider phase transition in the Erdős-Rényi model $G(n, p)$. We have

The property that $G(n, p)$ has diameter two has a sharp threshold at $p = \sqrt{\frac{2\ln n}{n}}$.

That is, if $p$ is smaller than $\sqrt{\frac{2\ln n}{n}}$, then the probability that the diameter of $G(n, p)$ is greater than $2$ goes to $1$ in the limit, as $n$ goes to $\infty$; if $p$ is greater than $\sqrt{\frac{2\ln n}{n}}$, then the probability that the diameter of $G(n, p)$ is smaller than or equal to $2$ goes to $1$ as $n$ goes to $\infty$.

Another similar conclusion is

The disappearance of isolated vertices in $G(n, p)$ has a sharp threshold at $p = \frac{\ln n}{n}$.
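The second threshold is cheap to watch in a simulation; a rough Python sketch (the value of $n$, the multipliers around the threshold and the number of trials are arbitrary):

```python
import math
import random

def avg_isolated(n, p, trials=30):
    """Average number of isolated vertices in G(n, p) over a few samples."""
    total = 0
    for _ in range(trials):
        degree = [0] * n
        for i in range(n):
            for j in range(i + 1, n):
                if random.random() < p:
                    degree[i] += 1
                    degree[j] += 1
        total += sum(1 for d in degree if d == 0)
    return total / trials

n = 400
threshold = math.log(n) / n
for c in (0.5, 1.0, 1.5):
    print(c, round(avg_isolated(n, c * threshold), 2))
```

Below the threshold ($c=0.5$) isolated vertices are plentiful; above it ($c=1.5$) they have all but disappeared.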

PSPACEhard
  • 10,283
17

How do you count connected labeled graphs on $n$ vertices?

Let's take the not-necessarily-connected case first. There are $\binom{n}{2}$ possible edges between the $n$ vertices, and for each you may include it or not. So there are $$2^\binom{n}{2}$$ possible graphs.

Now to count connected graphs, we need to do some "generatingfunctionology", to steal Wilf's term. Let $$f(x) = \sum_{n=0}^\infty 2^\binom{n}{2} \frac{x^n}{n!}$$ be the (formal) exponential generating function for labeled graphs. Then if $c_n$ is the number of connected graphs on $n$ vertices, we have

$$\sum_{n=1}^\infty c_n \frac{x^n}{n!} = \log f(x) = \log\sum_{n=0}^\infty 2^\binom{n}{2} \frac{x^n}{n!}.$$

This is astonishing the first time you see it, but it is very natural once you understand how exponentiation works on exponential generating functions.
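The identity can be verified with exact arithmetic by extracting the coefficients of $\log f(x)$; a short Python sketch using the standard power-series recurrence for the logarithm (from $f'=g'f$ with $g=\log f$):

```python
from fractions import Fraction
from math import comb, factorial

N = 7
# ordinary coefficients a_k of f(x) = sum 2^C(k,2) x^k / k!
a = [Fraction(2 ** comb(k, 2), factorial(k)) for k in range(N + 1)]

# coefficients b_k of g = log f, from k*a_k = sum_{j=1}^{k} j*b_j*a_{k-j}
b = [Fraction(0)] * (N + 1)
for k in range(1, N + 1):
    s = sum(j * b[j] * a[k - j] for j in range(1, k))
    b[k] = (k * a[k] - s) / (k * a[0])

c = [int(b[k] * factorial(k)) for k in range(1, N + 1)]
print(c)  # [1, 1, 4, 38, 728, 26704, 1866256]: connected labeled graphs
```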

Jair Taylor
  • 16,852
16

$$ \frac{d}{dx}\,(x^x) = x^x \ (\ln(x)+1) $$

mdcq
  • 1,658
16

$$\sum_{k=1}^{\infty} \frac{k \mod{j}}{k(k+1)} = \log{j}, \: \forall j \in \mathbb{N}$$
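A quick numerical check of this identity (the truncation point is arbitrary; after $K$ terms the tail is roughly $\frac{j-1}{2K}$):

```python
import math

def partial(j, terms=2_000_000):
    return sum((k % j) / (k * (k + 1)) for k in range(1, terms + 1))

for j in (2, 3, 7, 10):
    print(j, round(partial(j), 4), round(math.log(j), 4))
```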

bloomers
  • 1,058
15

I found it quite remarkable that $$\int\frac{1}{x\log(x)\log(\log(x))}dx = \log(\log(\log|x|))+C$$ But more generally, if $\log^{\circ i}(x)$ means $\underbrace{\log\cdots\log}_{i\text{ times}} x$, then $$\int\frac{dx}{x\prod_{i=1}^n{\log^{\circ i}(x)}} = \log^{\circ (n+1)}|x|+C,\quad n\in\mathbb{N}$$ Indeed, $${\mathrm d\over\mathrm dx}\log\log\log\log|x|=\frac{1}{x\log(x)\log\log(x)\log\log\log(x)}$$

Parcly Taxel
  • 103,344
edmz
  • 531
15

Here is one containing a lot of $\log$s.

Consider the standard multiplication table, but with rows and columns indexed by $1$ to $N$ instead of $1$ to $10$. The question is: how many distinct integers are there among its entries? Perhaps surprisingly, the answer is $o(N^2)$. Ford has shown that the answer is of order $$\frac{N^2}{(\log N)^{c_1}(\log\log N)^{3/2}},$$ where $c_1=1-\frac{1+\log\log 2}{\log 2}$. Similarly, if we were to consider a $(k+1)$-dimensional multiplication table (defined in the obvious manner), the number of distinct integers in it is of order $$\frac{N^{k+1}}{(\log N)^{c_k}(\log\log N)^{3/2}},\qquad c_k=\frac{\log(k+1)+k\log k-k\log\log(k+1)-k}{\log(k+1)}.$$

Wojowu
  • 26,600
14

Solve $x^n-x-1=0$ for various values of $n$ ($n\ge 2$). There will be one root greater than $1$ for each $n$. The asymptotic behavior of this root as $n$ increases without bound is given to two terms as:

$x=1+(\log 2)/n+o(1/n)$
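A quick numerical check (Python sketch; the bisection bracket $(1,2)$ works because $x^n-x-1$ changes sign there for $n\ge2$):

```python
import math

def root(n, lo=1.0, hi=2.0, iters=80):
    """Root of x^n = x + 1 in (1, 2), by bisection (compare logs to avoid overflow)."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if n * math.log(mid) > math.log(mid + 1):  # i.e. mid^n > mid + 1
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

for n in (10, 100, 1000, 10000):
    print(n, round(n * (root(n) - 1), 4), round(math.log(2), 4))
```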

Oscar Lanzi
  • 39,403
13

In Calculus I the student learns how to find antiderivatives of $x^n$ for all integers $n \ne -1$. They scratch their heads and scream

"Give me the antiderivative of the inversion function $1/x$!"

OK you say, here it is:

$\ln(t)=\int _{1}^{t}{\frac {1}{x}}\,dx$

CopyPasteIt
  • 11,366
  • I see now you already have it listed - might as well keep my answer since it goes at it in a different way... – CopyPasteIt Jul 03 '17 at 19:59
  • 5
    Note that $\displaystyle\int_1^xt^n\,\mathrm{d}t=\frac{x^{n+1}-1}{n+1}$, and $\displaystyle\lim_{n\to-1}\frac{x^{n+1}-1}{n+1}=\ln x$. Thus, this result is a limiting case of the integral power rule. – Akiva Weinberger Jul 05 '17 at 19:25
  • But this is one of the possible definitions of the $\ln$ function, and hence not surprising. – 200_success Jul 24 '17 at 06:31
  • It might be surprising to a calculus student who knows about $f(x) = e^x$ and only that the log function is the inverted graph of $f$. In any event, it is fascinating that the 'puzzle pieces' fit together the way they do. – CopyPasteIt Jul 24 '17 at 10:54
12

Perhaps the students would enjoy that the area of the unit circle may be expressed as $$ - \sqrt{-1} \log{(-1)} $$

ekkilop
  • 2,278
8

This is more about $e$ than the natural logarithm, but I was surprised that the maximum of $x^{1/x}$ was at $e$.

That comes up in studying the equation $a^b = b^a$ for $a, b \in \mathbb{R}$ with $a\ne b$.

6

Boltzmann's entropy equation:

$$S = k\ln{W}$$

  • 4
    This is not specifically about natural logarithms. I know that the standard way of expressing Boltzmann's entropy equation uses them but, with a different constant, we would be able to express the equation using, say, common logarithms or base $2$ logarithms. – José Carlos Santos Jul 06 '17 at 10:46
  • True, but the constant is the Boltzmann constant which is $R/N_A$ and thus not very arbitrary. I find the relationships with those other constants to be cool. – Archie Gertsman Jul 06 '17 at 11:21
5

I know it's a late reply, but I am a mathematician working on a fish farm (completely misplaced :D ) and, since you're from Porto (and I studied in Aveiro), you deserve another great application for $\ln x$.

The Specific Growth Rate ($SGR$) for a time interval of $d$ days of a farmed species is:

$$SGR=100\times\frac{\ln w_f - \ln w_i}{d}$$

Where $w_i$ is the initial average weight of the population (or of a single individual animal), $w_f$ is the final weight (again, the average of the entire population or of just one specimen).

This has tremendous production optimization applications, since you can estimate the Feed Conversion Ratio ($FCR$) using $SGR$:

$$FCR=\frac{SFR}{SGR}$$

Where $SFR$ is the Specific Feeding Rate (quantity of food over biomass).

I have been working with this for several years now, and I still don't fully understand the meaning of $\ln$ in the formula for $SGR$. But it works!

Natural logarithms in biology and fish production... What a world...
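(One standard reading, for what it's worth: if the fish are assumed to grow roughly exponentially between weighings, $w(t)=w_i e^{rt}$, then after $d$ days $w_f=w_i e^{rd}$, so $$r=\frac{\ln w_f-\ln w_i}{d},$$ and $SGR=100\,r$ is just the continuous per-day relative growth rate, expressed as a percentage.)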

Pspl
  • 566
4

My favorite facts involving logarithms are indeed the law of the iterated logarithm, the coupon collector's problem and the giant component, but they have already been mentioned in previous answers.

However, there are a lot of cute facts about logarithms that have not been mentioned yet:

1) The logarithm is used to express relationships between the continuous uniform, exponential and Pareto distributions:

If $X \sim U(0,1)$ then $- {\lambda}^{-1} \ln {X} \sim Exp(\lambda)$

If $X \sim P(\lambda, t)$, then $\ln{\frac{X}{t}} \sim Exp(\lambda)$

2) $\exists c \in \mathbb{R}, B_n \leq {(\frac{cn}{\ln {(n+1)}})}^n$, where $B_n$ is the n-th Bell number.

3) The smallest possible independence number of an $n$-vertex triangle-free graph is $\Theta({(n \ln{n})}^{\frac{1}{2}})$

4) $\frac{2\ln n}{\ln(\frac{1}{p})}$ is asymptotically the size of the largest clique in almost every random graph with $n$ vertices and fixed edge probability $p$

5) The number of groups of order $n$ is less than $n^{\frac{(\ln n)^2}{2}}$

6) Every finite group $G$ whose composition factors are isomorphic neither to Steinberg groups, nor to Suzuki groups, nor to Ree groups has a presentation of length $O((\ln |G|)^3)$

7) If $G$ is a non-abelian simple group, then there exists $S$ with $\langle S \rangle = G$, $|S| \leq 7$ and $\operatorname{diam}(Cay(G, S)) \leq {10}^{10}\ln|G|$

8) There exists a constant $C > 0$ such that every finite group $G$ has at least $\frac{C \ln |G|}{(\ln \ln |G|)^8}$ conjugacy classes.

9) And there are also several inequalities for Ramsey numbers that involve logarithms:

$$\exists c \in \mathbb{R} \forall n \in \mathbb{N} \text{ }R(n, n) \leq n^{\frac{-c \ln{n}}{\ln{\ln{n}}}} 4^n \text{(Conlon inequality)}$$

$$R(3, n) = O(\frac{n^2}{\ln {n}})$$

$$\exists \{c_n\}_{n=1}^{\infty} \subset \mathbb{R} \forall n \in \mathbb{N} \forall m \in \mathbb{N} \text{ } c_n \frac{m^{\frac{n+1}{2}}}{{(\ln{m})}^{\frac{n+1}{2} - \frac{1}{n-2}}} \leq R(n, m) \text{ (Bohman–Keevash inequality)}$$

$$\exists \{c_n\}_{n=1}^{\infty} \subset \mathbb{R} \forall n \in \mathbb{N} \forall m \in \mathbb{N} \text{ } R(n, m) \leq c_n \frac{m^{n-1}}{{(\ln{m})}^{n-2}} \text{(Ajtai-Komlós-Szemerédi inequality)}$$

Chain Markov
  • 15,564
4

As the asker pointed out, my last answer was not specific to the natural logarithm. So, here's a slightly more $\ln$-specific example:

$$\ln(2)=\frac1{1+\dfrac1{1+\dfrac{2^2}{1+\dfrac{3^2}{1+\dfrac{4^2}{\ddots}}}}}$$

Which converges quite slowly. Isn't that pretty?

4

Let $d(n)$ be the number of divisors of $n$. Then $\sum_{n=1}^xd(n)$ is asymptotic to $x\log x$. That is to say, $$\lim_{x\to\infty}{1\over x\log x}\sum_{n=1}^xd(n)=1$$ More is known: $$\sum_{n=1}^xd(n)=x\log x+(2\gamma-1)x+O(\sqrt x)$$ where $\gamma$ is Euler's constant.
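A quick Python check of the refined asymptotic (cutoffs arbitrary; the divisor sum is computed as a count of lattice points under the hyperbola):

```python
import math

gamma = 0.5772156649015329  # Euler's constant

def divisor_summatory(x):
    """sum of d(n) over n <= x, via sum of floor(x/n)."""
    return sum(x // n for n in range(1, x + 1))

for x in (10**4, 10**5, 10**6):
    exact = divisor_summatory(x)
    approx = x * math.log(x) + (2 * gamma - 1) * x
    print(x, exact, round(approx), round((exact - approx) / math.sqrt(x), 3))
```

The last column (error divided by $\sqrt x$) stays bounded, consistent with the $O(\sqrt x)$ term.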

Gerry Myerson
  • 179,216
4

One other that I found interesting:

If we have a unit square $[0,1]^2$, we can say that the average (expected) distance between two points in said square is: $$\int\limits_0^1\int\limits_0^1\int\limits_0^1\int\limits_0^1\sqrt{(x_1-x_2)^2+(y_1-y_2)^2}\,\text{d}x_1\text{d}x_2\text{d}y_1\text{d}y_2$$ which turns out to be equal to: $$\frac{2+\sqrt{2}+5\ln(\sqrt{2}+1)}{15}$$

Source
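A Monte Carlo sanity check in Python (the sample size is arbitrary):

```python
import math
import random

samples = 10**6
total = 0.0
for _ in range(samples):
    x1, y1, x2, y2 = (random.random() for _ in range(4))
    total += math.hypot(x1 - x2, y1 - y2)

exact = (2 + math.sqrt(2) + 5 * math.log(math.sqrt(2) + 1)) / 15
print(total / samples, exact)  # both ~0.5214
```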

Henry Lee
  • 12,215
3

The natural logarithm occurs often when analysing sorting and searching algorithms used in computer science. A famous example is the asymptotic formula of the average number of comparisons $Q_n$ of the Quick Sort algorithm. \begin{align*} \color{blue}{Q_n=2n(\ln n + \gamma -2)+2\ln n+2\gamma+1+O\left(\frac{1}{n}\right)} \end{align*}

Volume 3 of Knuth's classic The Art of Computer Programming is titled Sorting and Searching. It presents a wealth of applications of these two fundamental combinatorial themes and one gem is C.A.R. Hoare's Quicksort algorithm.

Quicksort is the standard sorting procedure in UNIX systems and has been cited, as we can read in this paper by J.A. Fill, as one of the ten algorithms with the greatest influence on the development and practice of science and engineering in the $20$th century.
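A small Python experiment (a sketch; it assumes the usual convention of counting one comparison per non-pivot element in each partitioning step with a uniformly random pivot, so other implementations may shift the lower-order terms):

```python
import math
import random

def quicksort_comparisons(a):
    """Count element comparisons made by a basic randomized quicksort."""
    if len(a) <= 1:
        return 0
    pivot = a[random.randrange(len(a))]
    # each of the other len(a) - 1 elements is compared with the pivot once
    left = [x for x in a if x < pivot]
    right = [x for x in a if x > pivot]
    return len(a) - 1 + quicksort_comparisons(left) + quicksort_comparisons(right)

gamma = 0.5772156649015329
n, trials = 2000, 200
avg = sum(quicksort_comparisons([random.random() for _ in range(n)])
          for _ in range(trials)) / trials
formula = 2 * n * (math.log(n) + gamma - 2) + 2 * math.log(n) + 2 * gamma + 1
print(round(avg, 1), round(formula, 1))  # both ~2.47e4 for n = 2000
```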

Markus Scheuer
  • 108,315
1

Related to what you have given yourself but I always found it interesting: $$\int_1^x\frac{dt}t=\ln(x)-\ln(1)=\ln(x)$$ This is perhaps one of the main definitions of $\ln$ though. We also have that: $$\Re\left[\ln(x+jy)\right]=\frac12\ln(x^2+y^2)$$


Logs are also often useful for linearising equations, e.g.: $$y=c x^n$$ $$\ln(y)=\ln(c)+n\ln(x)$$ so plotting $\ln(y)$ against $\ln(x)$ allows us to find values for $c,n$
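A tiny illustration in Python (synthetic data; the "true" values of $c$ and $n$ and the noise level are made up):

```python
import math
import random

c_true, n_true = 2.5, 1.7
xs = [0.5 + 0.1 * i for i in range(1, 60)]
ys = [c_true * x ** n_true * math.exp(random.gauss(0, 0.01)) for x in xs]

# least-squares line through the points (ln x, ln y): slope ~ n, intercept ~ ln c
lx = [math.log(x) for x in xs]
ly = [math.log(y) for y in ys]
mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
slope = sum((a - mx) * (b - my) for a, b in zip(lx, ly)) / sum((a - mx) ** 2 for a in lx)
intercept = my - slope * mx
print("n ~", round(slope, 3), "  c ~", round(math.exp(intercept), 3))
```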

Henry Lee
  • 12,215