Problem
Detailed instructions for computing the SVD abound on Math Stack Exchange. For example: "SVD and the columns", "SVD - obligation of normalization", "How the SVD is calculated in reality", and "Pseudo-inverse of a matrix that is neither fat nor tall?".
This problem is straightforward. Let the target matrix be the covector
$$
\mathbf{A} = \left[ \begin{array}{ccr}
2 & 1 & -2
\end{array} \right]
$$
Find the singular value decomposition
$$
\mathbf{A} = \mathbf{U} \, \Sigma \, \mathbf{V}^{*}
$$
Methods
The first link presents a table showing two paths for finding the SVD for a matrix $\mathbf{A} \in \mathbb{C}^{m \times n}_{\rho}$:
$$
\begin{array}{lll}
%
\text{Operation} &
\text{Row space first} & \text{Column space first} \\\hline
%
\text{1. Construct product matrix} &
\mathbf{W} = \mathbf{A}^{*} \mathbf{A} &
\mathbf{W} = \mathbf{A} \, \mathbf{A}^{*} \\
%
\text{2. Solve for eigenvalues} &
\sigma = \tilde{\lambda} \left( \mathbf{W} \right) &
\sigma = \tilde{\lambda} \left( \mathbf{W} \right) \\
%
\color{blue}{\text{3. Solve for eigenvectors }} w_{k},\ k=1,\ldots,\rho &
\left( \mathbf{W} - \lambda_{k} \mathbf{I}_{n} \right) w_{k} = \mathbf{0} &
\left( \mathbf{W} - \lambda_{k} \mathbf{I}_{m} \right) w_{k} = \mathbf{0} \\
%
\text{4. Assemble domain matrix} &
\mathbf{V}_{k} = \frac{w_{k}}{\lVert w_{k} \rVert_{2}} &
\mathbf{U}_{k} = \frac{w_{k}}{\lVert w_{k} \rVert_{2}} \\
%
\text{5. Compute complementary domain matrix} &
\mathbf{U}_{k} = \sigma_{k}^{-1} \mathbf{A} \mathbf{V}_{k} &
\mathbf{V}_{k} = \sigma_{k}^{-1} \mathbf{A}^{*} \mathbf{U}_{k} \\
%
\end{array}
$$
The two product matrices to work with are
$$
\mathbf{A} \mathbf{A}^{*} =
\left[
\begin{array}{c}
9 \\
\end{array}
\right]
\qquad
\mathbf{A}^{*} \mathbf{A} =
\left[
\begin{array}{rrr}
4 & 2 & -4 \\
2 & 1 & -2 \\
-4 & -2 & 4 \\
\end{array}
\right].
$$
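As a quick numerical check (a NumPy sketch; the variable names are ours), both product matrices can be computed directly:

```python
import numpy as np

# The target covector A, a 1x3 row matrix
A = np.array([[2, 1, -2]])

# Column-space-first product: A A* is 1x1
AAh = A @ A.conj().T

# Row-space-first product: A* A is 3x3
AhA = A.conj().T @ A

print(AAh)  # [[9]]
print(AhA)
```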
Solution
The product matrix $\mathbf{A} \mathbf{A}^{*}$ is much easier to work with!
Singular values
The eigenvalue is
$$
\lambda \left( \mathbf{A}\mathbf{A}^{*} \right) = 9
$$
which implies the singular value is
$$
\sigma = \sqrt{\lambda \left( \mathbf{A}\mathbf{A}^{*} \right)} = 3
$$
The matrix of singular values is
$$
\mathbf{S} =
\left[
\begin{array}{c}
3
\end{array}
\right]
$$
Since $\Sigma$ must share the $1 \times 3$ shape of $\mathbf{A}$, the singular value matrix is padded with zeros:
$$
\Sigma =
\left[
\begin{array}{ccc}
3 & 0 & 0
\end{array}
\right]
$$
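To guard against arithmetic slips, the singular value can be cross-checked numerically (a NumPy sketch; `np.linalg.svd` returns the singular values directly):

```python
import numpy as np

A = np.array([[2, 1, -2]])

# Eigenvalue of the 1x1 Hermitian product A A*
lam = np.linalg.eigvalsh(A @ A.conj().T)[0]
sigma = np.sqrt(lam)  # 3.0

# Cross-check against the library SVD
s = np.linalg.svd(A, compute_uv=False)
print(sigma, s)
```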
Matrix $\mathbf{U}$
The eigenvector is already normalized and forms the sole column
$$
\mathbf{U} =
\color{blue}{\left[
\begin{array}{c}
1
\end{array}
\right]}
$$
Coloring distinguishes $\color{blue}{range}$ space vectors from $\color{red}{null}$ space vectors.
Column space
$$
\begin{align}
\color{blue}{\mathcal{R}\left(\mathbf{A}\right)}
&=
\text{span} \left\{ \,
\color{blue}{\left[
\begin{array}{c}
1
\end{array}
\right]}
\, \right\} \\
%
\color{red}{\mathcal{N}\left(\mathbf{A}^{*}\right)} &=
\left\{ 0 \right\}
\end{align}
$$
Matrix $\mathbf{V}$
$$
\mathbf{V}_{1}
= \frac{1}{\sigma} \mathbf{A}^{*} \mathbf{U}_{1}
= \frac{1}{3}
\color{blue}{\left[
\begin{array}{r}
2 \\ 1 \\ -2
\end{array}
\right]}
$$
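Step 5 of the table can be checked numerically (a NumPy sketch, assuming $\mathbf{U} = [1]$ and $\sigma = 3$ from above):

```python
import numpy as np

A = np.array([[2, 1, -2]])
sigma = 3.0
U = np.array([[1.0]])

# V_1 = (1/sigma) A* U_1
V1 = (A.conj().T @ U) / sigma
print(V1.ravel())  # (1/3) * [2, 1, -2]
```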
For the $\color{red}{null}$ space vectors, we have options: we could apply the Gram-Schmidt process, or we could spot any $\color{red}{null}$ space vector by inspection and use the cross product to find the remaining one.
Choosing the latter, we select
$$
\color{red}{\mathbf{V}_{2}} = \frac{1}{\sqrt{2}}
\color{red}{\left[
\begin{array}{r}
1 \\ 0 \\ 1
\end{array}
\right]}
$$
The remaining vector is given by the cross product
$$
\color{blue}{\left[
\begin{array}{r}
2 \\ 1 \\ -2
\end{array}
\right]}
\times
\color{red}{\left[
\begin{array}{r}
1 \\ 0 \\ 1
\end{array}
\right]}
=
\color{red}{\left[
\begin{array}{r}
1 \\ -4 \\ -1
\end{array}
\right]},
$$
which has length $\sqrt{18} = 3\sqrt{2}$, so after normalization
$$
\color{red}{\mathbf{V}_{3}} = \frac{1}{3\sqrt{2}}
\color{red}{\left[
\begin{array}{r}
1 \\ -4 \\ -1
\end{array}
\right]}
$$
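The null-space construction can be verified in a few lines (a NumPy sketch; `np.cross` computes the 3-vector cross product):

```python
import numpy as np

v1 = np.array([2, 1, -2]) / 3          # range-space direction V_1
v2 = np.array([1, 0, 1]) / np.sqrt(2)  # eyeballed null-space vector V_2

# Cross product of the unnormalized vectors supplies the remaining direction
w = np.cross(v1 * 3, v2 * np.sqrt(2))  # [1, -4, -1]
v3 = w / np.linalg.norm(w)             # normalize: (1/(3*sqrt(2))) * [1, -4, -1]

A = np.array([[2, 1, -2]])
print(A @ v2, A @ v3)  # both should be ~0
```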
Final assembly
$$
\begin{align}
\mathbf{A} &= \mathbf{U} \, \Sigma \, \mathbf{V}^{*}
% U
\\ &=
\color{blue}{\left[
\begin{array}{c}
1
\end{array}
\right]}
% Sigma
\left[
\begin{array}{ccc}
3 & 0 & 0
\end{array}
\right]
% V*
\left[
\begin{array}{ccc}
% c1
\frac{1}{3}
\color{blue}{\left[
\begin{array}{r}
2 \\ 1 \\ -2
\end{array}
\right]}
% c2
\frac{1}{\sqrt{2}}
\color{red}{\left[
\begin{array}{r}
1 \\ 0 \\ 1
\end{array}
\right]}
% c3
\frac{1}{3\sqrt{2}}
\color{red}{\left[
\begin{array}{r}
1 \\ -4 \\ -1
\end{array}
\right]}
\end{array}
\right]^{*}
%
\\ &=
\left[ \begin{array}{ccr}
2 & 1 & -2
\end{array} \right]
%
\end{align}
$$
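Finally, the assembled factors can be checked end to end (a NumPy sketch; $\Sigma$ is the $1 \times 3$ zero-padded matrix):

```python
import numpy as np

A = np.array([[2.0, 1.0, -2.0]])

U = np.array([[1.0]])
Sigma = np.array([[3.0, 0.0, 0.0]])  # 1x3, zero-padded

# Columns of V: range-space vector first, then the null-space pair
V = np.column_stack([
    np.array([2, 1, -2]) / 3,
    np.array([1, 0, 1]) / np.sqrt(2),
    np.array([1, -4, -1]) / (3 * np.sqrt(2)),
])

# V must be orthogonal and the product must reproduce A
assert np.allclose(V.T @ V, np.eye(3))
assert np.allclose(U @ Sigma @ V.conj().T, A)
print("SVD verified")
```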