Let $\mathcal S_+^d$ be the set of real $d \times d$ symmetric positive semidefinite matrices and $\mathcal S_{++}^d$ the set of real $d \times d$ symmetric positive definite matrices.
Following Section 1.3 of the paper Quantum Optimal Transport for Tensor Field Processing (arXiv link, published not-open-access version here), for $P \in \mathcal S_+^d$ and $(Q_i)_{i \in I} \subset \mathcal S_+^d$ define $$ \exp\left(P + \sum_{i \in I} \log(Q_i) \right) $$ to be the matrix in $\mathcal S_+^d$ whose kernel is $\sum_{i \in I} \ker(Q_i)$ and which is unambiguously defined on the orthogonal complement of that space.
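To make this definition concrete, here is one plausible numerical reading (my own sketch, not code from the paper: I assume the "unambiguous" part is realized by compressing every matrix to an orthonormal basis $V$ of $\big(\sum_i \ker Q_i\big)^{\perp}$, where each compression $V^{\mathsf t} Q_i V$ is positive definite):

```python
import numpy as np
from scipy.linalg import null_space

def sym_fun(A, f):
    """Apply f to the eigenvalues of a symmetric matrix A."""
    w, U = np.linalg.eigh(A)
    return (U * f(w)) @ U.T

def exp_log_sum(P, Qs):
    """Sketch of exp(P + sum_i log(Q_i)): the PSD matrix whose kernel is
    sum_i ker(Q_i), computed by compressing everything to an orthonormal
    basis V of the orthogonal complement of that kernel."""
    d = P.shape[0]
    K = np.hstack([null_space(Q) for Q in Qs] + [np.zeros((d, 0))])
    V = null_space(K.T) if K.shape[1] else np.eye(d)
    # Each compression V.T @ Q @ V is positive definite on the complement,
    # so taking its logarithm eigenvalue-wise is safe.
    A = V.T @ P @ V + sum(sym_fun(V.T @ Q @ V, np.log) for Q in Qs)
    return V @ sym_fun(A, np.exp) @ V.T        # extend by zero on the kernel
```

With $P = 0_{d \times d}$ and a single singular $Q$, this sketch reproduces $\exp(\log(Q)) = Q$ in the sense derived below.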
Just using this definition, we get the following: for $x \in \ker(P)^{\perp} = \text{ran}(P^{\mathsf t}) = \text{ran}(P)$ there exists $y \in \mathbb R^d$ with $x = P y$. As $P$ is symmetric, we can orthogonally diagonalize $P = U^{\mathsf t} D U$, where $D := \text{diag}(\lambda_1, \ldots, \lambda_d)$. Hence, with the convention $\log(\lambda_j)^k \lambda_j := 0$ whenever $\lambda_j = 0$, \begin{align*} \exp(\log(P)) x & = \exp\big(0_{d \times d} + \log(P)\big) P y = \sum_{k = 0}^{\infty} \frac{1}{k!} \log(P)^k \cdot P y \\ & = \sum_{k = 0}^{\infty} \frac{1}{k!} U^{\mathsf t} \log(D)^k U \cdot U^{\mathsf t} D U y \\ & = U^{\mathsf t} \left(\sum_{k = 0}^{\infty} \frac{1}{k!} \log(D)^k D \right) U y = U^{\mathsf t} \text{diag}\left(\left(\sum_{k = 0}^{\infty} \frac{1}{k!} \log(\lambda_j)^k \lambda_j \right)_{j = 1}^{d} \right) U y \\ & = U^{\mathsf t} \text{diag}\left(\left(\lambda_j^2 \right)_{j = 1}^{d} \right) U y = U^{\mathsf t} D^2 U y = P^2 y = P x. \end{align*} This shows $\exp(\log(P)) x = P x$ on $\ker(P)^{\perp}$; since both sides vanish on $\ker(P)$ by the definition above, $\exp(\log(P)) = P$.
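This spectral computation is easy to check numerically; a minimal sketch, taking $\log$ and $\exp$ eigenvalue-wise on $\text{ran}(P)$ only, so no convention about $\log(0)$ is needed:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 2))
P = B @ B.T                                    # singular PSD: rank 2 in dimension 4

w, U = np.linalg.eigh(P)
pos = w > 1e-12                                # eigenvalues belonging to ran(P)
# exp(log(P)) computed spectrally on ran(P) and extended by zero on ker(P):
expl = (U[:, pos] * np.exp(np.log(w[pos]))) @ U[:, pos].T

x = P @ rng.standard_normal(4)                 # an arbitrary x in ran(P)
print(np.allclose(expl, P), np.allclose(expl @ x, P @ x))
```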
This moreover implies $\exp\left(\frac{1}{2} \log(P)\right) = \sqrt{P}$ for $P \in \mathcal S_+^d$: the matrix $\exp\left(\frac{1}{2} \log(P)\right)$ lies in $\mathcal S_+^d$, it satisfies $$\exp\left(\frac{1}{2} \log(P)\right) \cdot \exp\left(\frac{1}{2} \log(P)\right) = \exp(\log(P)) = P,$$ and the PSD square root of a PSD matrix is unique.
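For positive definite $P$ this identity can be sanity-checked directly with SciPy (a sketch; `scipy.linalg.logm` is only meaningful for invertible arguments, so the singular case is excluded here):

```python
import numpy as np
from scipy.linalg import expm, logm, sqrtm

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
P = A @ A.T + 4 * np.eye(4)        # positive definite

S = expm(0.5 * logm(P))            # candidate square root
print(np.allclose(S @ S, P), np.allclose(S, sqrtm(P)))
```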
My question.
In Example 2, for $P, Q \in \mathcal S_+^d$ the "geometric mean"
$$
M(P, Q)
:= \exp\left(\frac{1}{2} \log(P) + \frac{1}{2} \log(Q)\right)
$$
is mentioned. (Side question: Is $M(P, Q) = P^{\frac{1}{2}} (P^{-\frac{1}{2}} Q P^{-\frac{1}{2}})^{\frac{1}{2}} P^{\frac{1}{2}}$, that is, is $M$ the "real" geometric mean?).
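As a partial numerical take on the side question (a sketch; `rand_spd` is just a helper producing generic positive definite matrices): the two expressions appear to differ for non-commuting $P, Q$, consistent with $M$ being the so-called log-Euclidean mean rather than the affine-invariant geometric mean:

```python
import numpy as np
from scipy.linalg import expm, logm, sqrtm, inv

rng = np.random.default_rng(2)

def rand_spd(d):
    A = rng.standard_normal((d, d))
    return A @ A.T + d * np.eye(d)            # generic positive definite matrix

P, Q = rand_spd(3), rand_spd(3)

M = expm(0.5 * logm(P) + 0.5 * logm(Q))       # the mean from the question
Ps = sqrtm(P)
Psi = inv(Ps)
G = Ps @ sqrtm(Psi @ Q @ Psi) @ Ps            # affine-invariant geometric mean
print(np.linalg.norm(M - G))                  # generically nonzero
```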
Now, as the matrix logarithm is not defined for singular matrices $P \in \mathcal S_{+}^d \setminus \mathcal S_{++}^d$, I don't know how to interpret the notation $M(P, Q)$ in that case.
Remark. We have $M(P, Q) = f(\sqrt{P}, \sqrt{Q})$ for the function $f(A, B) := \exp\big(\log(A) + \log(B)\big)$, which is discussed here. In particular $M(P, Q) = \sqrt{P Q}$ if $P Q = Q P$. By the Golden-Thompson inequality we have $\text{tr}\big(M(P, Q)\big) \le \text{tr}\big(\sqrt{P} \sqrt{Q}\big)$ for $P, Q \in \mathcal S_{++}^d$ and equality if and only if $P$ and $Q$ commute.
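Both claims in the remark can be checked numerically; a sketch with a commuting pair (common eigenbasis, different eigenvalues):

```python
import numpy as np
from scipy.linalg import expm, logm, sqrtm

rng = np.random.default_rng(3)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))     # random common eigenbasis
P = U @ np.diag([1.0, 2.0, 3.0]) @ U.T
Q = U @ np.diag([4.0, 5.0, 6.0]) @ U.T               # commutes with P

M = expm(0.5 * logm(P) + 0.5 * logm(Q))
print(np.allclose(M, sqrtm(P @ Q)))                  # M(P, Q) = sqrt(PQ) here

# Golden-Thompson: tr M(P,Q) <= tr(sqrt(P) sqrt(Q)), with equality iff P, Q commute.
lhs, rhs = np.trace(M), np.trace(sqrtm(P) @ sqrtm(Q))
print(np.isclose(lhs, rhs))                          # equality in the commuting case
```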
Another perspective: in the scalar case ($d = 1$) we have $M(p, q) = \exp\left(\frac{1}{2} \ln(p) + \frac{1}{2} \ln(q)\right) = \sqrt{p q}$, the ordinary geometric mean; structurally, $M$ is a LogSumExp with the roles of $\exp$ and $\log$ interchanged, the weights $\frac{1}{2}$ playing the role of the additive constant $-\ln(2)$ in the equally weighted LogSumExp $\ln\left(\frac{1}{2} e^x + \frac{1}{2} e^y\right)$.
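The scalar case is quick to verify numerically (a sketch; for $d = 1$ the formula reduces to the ordinary geometric mean $\sqrt{pq}$):

```python
import math

p, q = 3.0, 12.0
m = math.exp(0.5 * math.log(p) + 0.5 * math.log(q))
print(math.isclose(m, math.sqrt(p * q)))   # the scalar geometric mean, sqrt(36) = 6
```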
Ideas. For positive definite $P, Q \in \mathcal S_{++}^d$ we have $\frac{1}{2} \log(P) = \log(\sqrt{P})$ (by the uniqueness of the PSD square root), so $$ \exp\left(\frac{1}{2} \log(P) + \frac{1}{2} \log(Q)\right) = \exp\big( 0_{d \times d} + \log(\sqrt{P}) + \log(\sqrt{Q})\big) $$ could be interpreted in the above sense as the matrix in $\mathcal S_+^d$ with kernel $\ker(\sqrt{P}) + \ker(\sqrt{Q})$ which, on $\big( \ker(\sqrt{P}) + \ker(\sqrt{Q}) \big)^{\perp} = \ker(\sqrt{P})^{\perp} \cap \ker(\sqrt{Q})^{\perp}$, would equal $\exp(0_{d \times d}) = \text{id}_{d \times d}$, i.e. it would be the orthogonal projector onto $\big( \ker(\sqrt{P}) + \ker(\sqrt{Q}) \big)^{\perp}$. For singular matrices $P, Q \in \mathcal S_+^d \setminus \mathcal S_{++}^d$ we could still define $M(P, Q)$ to be the orthogonal projector onto $\big( \ker(\sqrt{P}) + \ker(\sqrt{Q}) \big)^{\perp}$, but I don't know whether this is a sensible definition.
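The proposed fallback object is straightforward to construct numerically (a sketch; note $\ker(\sqrt{P}) = \ker(P)$, so one can work with the kernels of $P$ and $Q$ directly, and `rand_psd_rank` is just a helper producing rank-deficient PSD matrices):

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(4)

def rand_psd_rank(d, r):
    B = rng.standard_normal((d, r))
    return B @ B.T                               # PSD of rank r (almost surely)

P, Q = rand_psd_rank(4, 2), rand_psd_rank(4, 3)

K = np.hstack([null_space(P), null_space(Q)])    # columns spanning ker(P) + ker(Q)
V = null_space(K.T)                              # orthonormal basis of the complement
Pi = V @ V.T                                     # proposed projector onto (ker P + ker Q)^perp

print(np.allclose(Pi @ Pi, Pi), np.allclose(Pi, Pi.T))   # idempotent and symmetric
print(np.allclose(Pi @ null_space(P), 0))                # vanishes on ker(P), as required
```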
Response to the answer by Igor Rivin. Is the following limit what you had in mind, and is there a way to evaluate it? $$ \lim_{\varepsilon \searrow 0} \exp\left( \frac{1}{2} \log(P + \varepsilon I) + \frac{1}{2} \log(Q + \varepsilon I)\right) = \exp\left( \lim_{\varepsilon \searrow 0} \left( \frac{1}{2} \log(P + \varepsilon I) + \frac{1}{2} \log(Q + \varepsilon I)\right) \right), $$ where the equality would follow from the continuity of the matrix exponential, provided the inner limit exists.
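A numerical sketch of this regularization (randomly generated singular PSD matrices; this does not settle convergence, it only observes the behavior for decreasing $\varepsilon$):

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(5)
B = rng.standard_normal((4, 2))
P = B @ B.T                                      # singular PSD, rank 2
C = rng.standard_normal((4, 3))
Q = C @ C.T                                      # singular PSD, rank 3
I = np.eye(4)

def M_eps(eps):
    """Regularized mean exp(0.5 log(P + eps I) + 0.5 log(Q + eps I))."""
    return expm(0.5 * logm(P + eps * I) + 0.5 * logm(Q + eps * I))

diffs = []
prev = M_eps(1e-2)
for eps in [1e-3, 1e-4, 1e-5]:
    cur = M_eps(eps)
    diffs.append(np.linalg.norm(cur - prev))     # successive differences
    prev = cur
print(diffs)                                     # appear to shrink as eps decreases
```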