Define matrix
Start with a matrix
$$
\mathbf{A} \in\mathbb{C}^{m\times n}_{\rho},
$$
that is, an $m\times n$ complex matrix whose rank is the subscript $\rho$.
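To make the block structures below concrete, here is a minimal numpy sketch that manufactures such a matrix. The sizes m = 5, n = 4, rho = 2, the seed, and the variable names are illustrative choices, not part of the statement.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, rho = 5, 4, 2           # illustrative sizes with rho < min(m, n)

# A random complex m x n matrix of exact rank rho, built as the
# product of an (m x rho) factor and a (rho x n) factor.
B = rng.standard_normal((m, rho)) + 1j * rng.standard_normal((m, rho))
C = rng.standard_normal((rho, n)) + 1j * rng.standard_normal((rho, n))
A = B @ C

print(np.linalg.matrix_rank(A))   # 2, almost surely
```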
Fundamental Theorem of Linear Algebra
The Fundamental Theorem of Linear Algebra expresses the domain and codomain of $\mathbf{A}$ as orthogonal direct sums of the four fundamental subspaces:
$$
\begin{align}
%
\mathbb{C}^{n} &=
\color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)} \oplus
\color{red}{\mathcal{N} \left( \mathbf{A} \right)} \\
%
\mathbb{C}^{m} &=
\color{blue}{\mathcal{R} \left( \mathbf{A} \right)} \oplus
\color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)}
%
\end{align}
$$
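As a numerical sanity check of the first sum, the sketch below splits an arbitrary $x \in \mathbb{C}^{n}$ into orthogonal row-space and null-space parts. It reuses A, n, and rng from above and borrows np.linalg.pinv as a black box; the pseudoinverse itself is derived below.

```python
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
P_row = np.linalg.pinv(A) @ A                  # orthogonal projector onto R(A*)
x_row, x_null = P_row @ x, x - P_row @ x

print(np.allclose(A @ x_null, 0))              # True: x_null lies in N(A)
print(np.allclose(np.vdot(x_row, x_null), 0))  # True: the parts are orthogonal
print(np.allclose(x_row + x_null, x))          # True: they reassemble x
```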
Singular value decomposition
The singular value decomposition of the matrix is
$$
\begin{align}
\mathbf{A} &=
\mathbf{U} \, \Sigma \, \mathbf{V}^{*} \\
%
&=
% U
\left[ \begin{array}{cc}
\color{blue}{\mathbf{U}_{\mathcal{R}}} & \color{red}{\mathbf{U}_{\mathcal{N}}}
\end{array} \right]
% Sigma
\left[ \begin{array}{cccc|c}
\sigma_{1} & & & & \\
& \sigma_{2} & & & \mathbf{0} \\
& & \ddots & & \\
& & & \sigma_{\rho} & \\ \hline
& \mathbf{0} & & & \mathbf{0}
\end{array} \right]
% V
\left[ \begin{array}{c}
\color{blue}{\mathbf{V}_{\mathcal{R}}}^{*} \\
\color{red}{\mathbf{V}_{\mathcal{N}}}^{*}
\end{array} \right] \\
%
& =
% U
\left[ \begin{array}{cccccc}
\color{blue}{u_{1}} & \dots & \color{blue}{u_{\rho}} & \color{red}{u_{\rho+1}} & \dots & \color{red}{u_{m}}
\end{array} \right]
% Sigma
\left[ \begin{array}{cc}
\mathbf{S}_{\rho\times \rho} & \mathbf{0} \\
\mathbf{0} & \mathbf{0}
\end{array} \right]
% V
\left[ \begin{array}{c}
\color{blue}{v_{1}^{*}} \\
\vdots \\
\color{blue}{v_{\rho}^{*}} \\
\color{red}{v_{\rho+1}^{*}} \\
\vdots \\
\color{red}{v_{n}^{*}}
\end{array} \right]
%
\end{align}
$$
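Here $\mathbf{S}_{\rho\times \rho} = \operatorname{diag} \left( \sigma_{1}, \dots, \sigma_{\rho} \right)$ collects the nonzero singular values. The partition is easy to reproduce numerically; a sketch, again assuming the example matrix A and rank rho from above (numpy's svd returns $\mathbf{V}^{*}$ directly as Vh):

```python
U, s, Vh = np.linalg.svd(A)          # full SVD: U is m x m, Vh is n x n
U_R, U_N = U[:, :rho], U[:, rho:]    # column blocks of U
V_R, V_N = Vh[:rho].conj().T, Vh[rho:].conj().T  # column blocks of V
S = np.diag(s[:rho])                 # S = diag(sigma_1, ..., sigma_rho)

# Only the range blocks survive multiplication by Sigma:
print(np.allclose(A, U_R @ S @ V_R.conj().T))   # True
```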
The connection to the Fundamental Theorem is intimate:
$$
\begin{array}{ll}
%
\text{column vectors} & \text{span} \\ \hline
%
\color{blue}{u_{1}} \dots \color{blue}{u_{\rho}} &
\color{blue}{\mathcal{R} \left( \mathbf{A} \right)} \\
%
\color{blue}{v_{1}} \dots \color{blue}{v_{\rho}} &
\color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)} \\
%
\color{red}{u_{\rho+1}} \dots \color{red}{u_{m}} &
\color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)} \\
%
\color{red}{v_{\rho+1}} \dots \color{red}{v_{n}} &
\color{red}{\mathcal{N} \left( \mathbf{A} \right)} \\
%
\end{array}
$$
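The two null space rows of the table can be verified directly from the blocks computed above:

```python
print(np.allclose(A.conj().T @ U_N, 0))  # columns of U_N lie in N(A*)
print(np.allclose(A @ V_N, 0))           # columns of V_N lie in N(A)
```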
Pseudoinverse matrix
Solving the least squares problem with the SVD produces the Moore-Penrose pseudoinverse:
$$
\begin{align}
\mathbf{A}^{+} &= \mathbf{V} \, \Sigma^{+} \mathbf{U}^{*} \\
%
&=
% V
\left[ \begin{array}{cc}
\color{blue}{\mathbf{V}_{\mathcal{R}}} &
\color{red}{\mathbf{V}_{\mathcal{N}}}
\end{array} \right]
% Sigma
\left[ \begin{array}{cc}
\mathbf{S}^{-1} & \mathbf{0} \\
\mathbf{0} & \mathbf{0}
\end{array} \right]
%
% U
\left[ \begin{array}{c}
\color{blue}{\mathbf{U}_{\mathcal{R}}}^{*} \\
\color{red}{\mathbf{U}_{\mathcal{N}}}^{*}
\end{array} \right] \\
%
\end{align}
%
$$
$$
\mathbf{A}^{+} \in\mathbb{C}^{n\times m}_{\rho}
$$
The subspace decomposition makes the action of the pseudoinverse explicit: $\mathbf{A}^{+}$ maps $\color{blue}{\mathcal{R} \left( \mathbf{A} \right)}$ back onto $\color{blue}{\mathcal{R} \left( \mathbf{A}^{*} \right)}$ through $\mathbf{S}^{-1}$ and annihilates $\color{red}{\mathcal{N} \left( \mathbf{A}^{*} \right)}$.
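A sketch assembling $\mathbf{A}^{+}$ from the SVD blocks computed earlier and comparing it against numpy's built-in pinv (the name A_pinv is mine):

```python
A_pinv = V_R @ np.linalg.inv(S) @ U_R.conj().T
print(np.allclose(A_pinv, np.linalg.pinv(A)))       # True: same matrix

# A+ A and A A+ are the orthogonal projectors onto the range spaces:
print(np.allclose(A_pinv @ A, V_R @ V_R.conj().T))  # projector onto R(A*)
print(np.allclose(A @ A_pinv, U_R @ U_R.conj().T))  # projector onto R(A)
```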
Adjoint matrix
For comparison, the adjoint (conjugate transpose) matrix is
$$
\begin{align}
\mathbf{A}^{*} &= \mathbf{V} \, \Sigma^{T} \mathbf{U}^{*} \\
%
&=
% V
\left[ \begin{array}{cc}
\color{blue}{\mathbf{V}_{\mathcal{R}}} &
\color{red}{\mathbf{V}_{\mathcal{N}}}
\end{array} \right]
% Sigma
\left[ \begin{array}{cc}
\mathbf{S} & \mathbf{0} \\
\mathbf{0} & \mathbf{0}
\end{array} \right]
%
% U
\left[ \begin{array}{c}
\color{blue}{\mathbf{U}_{\mathcal{R}}}^{*} \\
\color{red}{\mathbf{U}_{\mathcal{N}}}^{*}
\end{array} \right] \\
%
\end{align}
$$
$$
\mathbf{A}^{*} \in\mathbb{C}^{n\times m}_{\rho}
$$
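The same blocks reproduce the adjoint; comparing the last two expansions, only $\mathbf{S}$ versus $\mathbf{S}^{-1}$ separates $\mathbf{A}^{*}$ from $\mathbf{A}^{+}$. A quick numerical check with the variables from above:

```python
# The adjoint from the SVD blocks: S replaces S^{-1}.
A_star = V_R @ S @ U_R.conj().T
print(np.allclose(A_star, A.conj().T))   # True
```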
Further reading
How the pseudoinverse solution arises in least squares: How does the SVD solve the least squares problem?; Singular value decomposition proof
Variant forms of the pseudoinverse are presented in What forms does the Moore-Penrose inverse take under systems with full rank, full column rank, and full row rank? and in Generalized inverse of a matrix and convergence for singular matrix; note the null space relationships with the pseudoinverse there.
Fundamental projectors and the pseudoinverse: Least squares solutions and the orthogonal projector onto the column space