Currently I'm reading a little about regularizing functions with annoying singularities, like $x^{-1}$ or $|x|^{-1}$, into distributions. To the best of my understanding, $$ \frac{1}{x} $$ is interpreted as a distribution by taking its Cauchy principal value instead of attempting the brute-force integration. This way it's possible to recover a meaningful result by carefully cancelling the contributions near the singularity. So if $g$ is, say, a function from Schwartz space, then $$ \left\langle \text{p.v.}\, \frac{1}{x}, \, g(x) \right\rangle = \int _{0}^{\infty} \frac{g(x) - g(-x)}{x} \, dx. $$ Intuitively, this makes total sense to me.

However, when reading about $|x|^{-1}$ I get slightly confused. The definition states that $|x|^{-1}$ (or rather its regularized cousin $\mathcal{P}|x|^{-1}$) acts on test functions as $$ \left\langle \mathcal{P}\frac{1}{|x|}, \, g(x) \right\rangle = \int _{|x|<1} \frac{g(x)-g(0)}{|x|} \, dx + \int_{|x|>1} \frac{g(x)}{|x|} \, dx. $$ The second integral is pretty straightforward; no singularities to sneak around. But the first one... Does it solve the singularity problem? Yes, I can see that, since $$ \frac{g(x)-g(0)}{|x|} $$ remains bounded as $x \rightarrow 0$. But how do we get to it? What's the reasoning behind it? I've been struggling to see it for a little while. Any and all help will be appreciated :).
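(For reference, assuming the standard symmetric-limit definition of the principal value and a Schwartz test function $g$, the folded formula above comes from substituting $x\to -x$ on the negative half-line: $$ \lim_{\epsilon\to 0^+}\int_{|x|>\epsilon}\frac{g(x)}{x}\,dx = \lim_{\epsilon\to 0^+}\int_{\epsilon}^{\infty}\frac{g(x)-g(-x)}{x}\,dx = \int_{0}^{\infty}\frac{g(x)-g(-x)}{x}\,dx, $$ where the last integral converges because $\frac{g(x)-g(-x)}{x}\to 2g'(0)$ as $x\to 0$ and $g$ decays at infinity.)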
2 Answers
Since $u(x):=\operatorname{sign}(x)\ln|x| \in L^{1}_{\text{loc}}(\mathbb{R})$ we can take $$ \frac{1}{|x|}:=u'(x), $$ i.e. for $\varphi\in C_c^\infty(\mathbb{R})$ we set $$\begin{align} \left< \frac{1}{|x|}, \varphi(x) \right> &:= - \int_{-\infty}^{\infty} \operatorname{sign}(x)\ln|x| \, \varphi'(x) \, dx \\&= \lim_{\epsilon\to 0} \left( \int_{-\infty}^{-\epsilon} \ln|x|\,\varphi'(x)\,dx - \int_{\epsilon}^{\infty} \ln|x|\,\varphi'(x)\,dx \right) \\&= \lim_{\epsilon\to 0} \left( \left[\ln|x|\,\varphi(x)\right]_{-\infty}^{-\epsilon} - \int_{-\infty}^{-\epsilon} \frac{1}{x} \varphi(x)\,dx - \left[\ln|x|\,\varphi(x)\right]_{\epsilon}^{\infty} + \int_{\epsilon}^{\infty} \frac{1}{x}\,\varphi(x)\,dx \right) \\&= \lim_{\epsilon\to 0} \left( \ln\epsilon\,\varphi(-\epsilon) - \int_{-\infty}^{-\epsilon} \frac{1}{x} \varphi(x)\,dx + \ln\epsilon\,\varphi(\epsilon) + \int_{\epsilon}^{\infty} \frac{1}{x}\,\varphi(x)\,dx \right) \\&= \lim_{\epsilon\to 0} \left( \ln\epsilon\,(\varphi(-\epsilon)+\varphi(\epsilon)) + \int_{|x|>\epsilon} \frac{1}{|x|}\,\varphi(x)\,dx \right) \\&= \lim_{\epsilon\to 0} \left( 2\ln\epsilon\,\varphi(0) + \int_{|x|>\epsilon} \frac{1}{|x|}\,\varphi(x)\,dx \right) .\end{align}$$
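A remark on the last line, and how it connects to the formula in the question: the limit exists because $\varphi(\epsilon)+\varphi(-\epsilon)-2\varphi(0)=O(\epsilon^{2})$, so replacing $\varphi(\pm\epsilon)$ by $\varphi(0)$ costs only $O(\epsilon^{2}\ln\epsilon)\to 0$. Moreover, as a short check, splitting the integral at $|x|=1$ and using $\int_{\epsilon<|x|<1}\frac{dx}{|x|}=-2\ln\epsilon$ shows that this is exactly the regularization $\mathcal{P}\frac{1}{|x|}$ from the question:
$$ \lim_{\epsilon\to 0} \left( 2\ln\epsilon\,\varphi(0) + \int_{|x|>\epsilon} \frac{\varphi(x)}{|x|}\,dx \right) = \lim_{\epsilon\to 0} \left( \int_{\epsilon<|x|<1} \frac{\varphi(x)-\varphi(0)}{|x|}\,dx + \int_{|x|>1} \frac{\varphi(x)}{|x|}\,dx \right) = \left\langle \mathcal{P}\frac{1}{|x|}, \, \varphi(x) \right\rangle. $$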

Let $U,V$ be two open sets with $U\subset V$. One has a restriction map for distributions $$ r:\mathscr{D}'(V)\rightarrow\mathscr{D}'(U) $$ defined by $r(T)(f)=T(\tilde{f})$, where $T$ is a distribution on $V$, $f$ is a smooth compactly supported function on $U$, and $\tilde{f}$ is its extension by zero to all of $V$. Given a distribution $S\in \mathscr{D}'(U)$, one can then ask for a characterization of all extensions of $S$ to $V$, i.e., of all $T$ such that $S=r(T)$.
Now take $V=\mathbb{R}$, $U=(-\infty,0)\cup(0,\infty)$, and $S$ given by integration against $\frac{1}{|x|}$. The construction in the question is the typical way to produce a $T$ that works. To get all the others, you add a distribution supported at the origin, such as the Dirac delta or a linear combination of its derivatives.
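As a concrete check of the last point, using the question's formula with the cutoff radius $1$ replaced by a hypothetical $R>1$ (notation $T_R$ introduced here only for illustration):
$$ \left\langle T_R - T_1, \, g \right\rangle = \int_{1<|x|<R}\frac{g(x)-g(0)}{|x|}\,dx - \int_{1<|x|<R}\frac{g(x)}{|x|}\,dx = -\,g(0)\int_{1<|x|<R}\frac{dx}{|x|} = -2\ln R\; g(0), $$
so different cutoffs give extensions of $\frac{1}{|x|}$ that differ exactly by a multiple of the Dirac delta, $T_R = T_1 - 2\ln R\,\delta$.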
To single out the construction in the question one can also bring the notion of homogeneity into the game. Thinking of $T$ as a "function" $T(x)$, homogeneity of exponent $\alpha$ means $$ T(\lambda x)=\lambda^{\alpha}T(x) $$ for all nonzero $\lambda$ (the usual definition takes $\lambda>0$, but never mind). In other words, for every $\lambda$, $T$ is an eigenvector with eigenvalue $\lambda^{\alpha}$ of the rescaling operator $T(x)\rightarrow T(\lambda x)$. The $T$ in the question is not homogeneous: up to scale, the only homogeneous distributions of exponent $-1$ on $\mathbb{R}$ (say for $\lambda>0$) are the Dirac delta at the origin and $\operatorname{p.v.}\frac{1}{x}$, and neither of them restricts to $\frac{1}{|x|}$ away from the origin, so $\frac{1}{|x|}$ admits no homogeneous extension at all. But $T$ is the next best thing, i.e., a generalized eigenvector, in the sense of Jordan reduction. The key word for this theory is associated homogeneous distributions.
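To make the generalized-eigenvector statement explicit, here is a short computation for $\lambda>0$, using the limit form $\langle T,\varphi\rangle=\lim_{\epsilon\to0}\big(2\ln\epsilon\,\varphi(0)+\int_{|x|>\epsilon}\frac{\varphi(x)}{|x|}\,dx\big)$ from the other answer and the usual scaling action $\langle T(\lambda x),\varphi(x)\rangle:=\frac1\lambda\langle T(y),\varphi(y/\lambda)\rangle$. Substituting $y=\lambda u$ and writing $\ln\epsilon=\ln(\epsilon/\lambda)+\ln\lambda$ gives
$$ \left\langle T(\lambda x), \varphi \right\rangle = \frac{1}{\lambda}\left( \left\langle T, \varphi \right\rangle + 2\ln\lambda\,\varphi(0) \right), \qquad\text{i.e.}\qquad T(\lambda x) = \lambda^{-1}\left( T(x) + 2\ln\lambda\,\delta(x) \right). $$
So rescaling never reproduces a multiple of $T$ alone; it maps $T$ into the span of $T$ and $\delta$, which is exactly the Jordan-block structure mentioned above.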

So we would define
$$\operatorname{p.\!v.}\left(\tfrac{1}{|x|}\right)=\lim_{\varepsilon \to 0}\int _{\mathbb{R}^{n}\setminus B_{\varepsilon}(0)}\frac{f(x)}{|x|}\,\mathrm{d}x.$$
– Hyperplane Nov 27 '20 at 18:45
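Whether that limit exists depends on the dimension: for $n\ge 2$ the function $|x|^{-1}$ is locally integrable, so the limit exists for every test function $f$ and no regularization is needed, while for $n=1$
$$ \int_{\varepsilon<|x|<1}\frac{f(x)}{|x|}\,dx = \int_{\varepsilon<|x|<1}\frac{f(x)-f(0)}{|x|}\,dx - 2\ln\varepsilon\, f(0) $$
diverges logarithmically unless $f(0)=0$; the $-2\ln\varepsilon\,f(0)$ term is precisely what the $2\ln\epsilon\,\varphi(0)$ counterterm in the first answer compensates.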