I came up with a different solution, using just elementary calculus and linear algebra. Suppose that we have a (not necessarily orthonormal) basis $\{v_1,\dots,v_n\}$ for $\mathbb{R}^n$. We will construct an orthogonal basis $\{b_1,\dots,b_n\}$ from this basis by a recursive process akin to the Gram-Schmidt process.
Let $b_i\in\operatorname{Span}\{v_1,\dots,v_{i-1}\}^\perp\cap\operatorname{Span}\{v_1,\dots,v_i\}$. In particular, since $b_i\in\operatorname{Span}\{v_1,\dots,v_i\}$, there exist $\beta_1,\dots,\beta_i\in\mathbb{R}$ such that
$$
b_i=\sum_{j=1}^i\beta_jv_j
$$
Since $b_i\cdot v_k=0$ for every $k<i$, we can write the following system of $i-1$ equations:
$$
\begin{pmatrix}
\sum_{j=1}^i\beta_j(v_j\cdot v_1)\\
\vdots\\
\sum_{j=1}^i\beta_j(v_j\cdot v_{i-1})
\end{pmatrix}=
\begin{pmatrix}
0\\\vdots\\0
\end{pmatrix}
$$
Now, if we add just one more (trivial) equation, $\beta_i=\beta_i$, then we can write a system of $i$ equations with $i$ unknowns:
$$
\begin{pmatrix}
v_1\cdot v_1&\dots&v_i\cdot v_1\\
\vdots&\ddots&\vdots\\
v_1\cdot v_{i-1}&\dots&v_i\cdot v_{i-1}\\
0&\dots&1
\end{pmatrix}
\begin{pmatrix}
\beta_1\\\vdots\\\beta_i
\end{pmatrix}
=
\begin{pmatrix}
0\\\vdots\\0\\\beta_i
\end{pmatrix}
$$
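As a quick numerical sanity check (a numpy sketch of my own, not part of the proof; here I fix the free parameter $\beta_i=1$ rather than the choice made below), we can build this system for each $i$, solve it, and confirm that the resulting $b_i$ is orthogonal to $v_1,\dots,v_{i-1}$:

```python
# Build the i x i bordered system for each i, fix the free parameter
# beta_i = 1, solve for beta, and check b_i . v_k = 0 for all k < i.
import numpy as np

rng = np.random.default_rng(0)
n = 4
v = rng.standard_normal((n, n))          # row j is the basis vector v_{j+1}

for i in range(1, n + 1):
    B = np.zeros((i, i))
    for k in range(i - 1):               # rows demanding b_i . v_{k+1} = 0
        B[k, :] = v[:i] @ v[k]           # entries v_j . v_{k+1} for j = 1..i
    B[i - 1, i - 1] = 1.0                # the trivial equation beta_i = beta_i
    rhs = np.zeros(i)
    rhs[i - 1] = 1.0                     # choose the free scale beta_i = 1
    beta = np.linalg.solve(B, rhs)
    b_i = beta @ v[:i]                   # b_i = sum_j beta_j v_j
    assert np.allclose(v[:i - 1] @ b_i, 0.0)
print("b_i is orthogonal to v_1, ..., v_{i-1} for every i")
```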
Call this matrix $B$. Using Cramer's rule, we can claim
$$
\det(B)\beta_j=\left\vert
\begin{matrix}
v_1\cdot v_1&\dots&v_{j-1}\cdot v_1&0&v_{j+1}\cdot v_1&\dots&v_i\cdot v_1\\
\vdots&\ddots&\vdots&\vdots&\vdots&\ddots&\vdots\\
v_1\cdot v_{i-1}&\dots&v_{j-1}\cdot v_{i-1}&0&v_{j+1}\cdot v_{i-1}&\dots&v_i\cdot v_{i-1}\\
0&\dots&0&\beta_i&0&\dots&1
\end{matrix}
\right\vert
$$
Notice that we can simplify this a bit to obtain
$$
\det(B)\beta_j=\left\vert
\begin{matrix}
v_1\cdot v_1&\dots&v_{j-1}\cdot v_1&0&v_{j+1}\cdot v_1&\dots&v_i\cdot v_1\\
\vdots&\ddots&\vdots&\vdots&\vdots&\ddots&\vdots\\
v_1\cdot v_{i-1}&\dots&v_{j-1}\cdot v_{i-1}&0&v_{j+1}\cdot v_{i-1}&\dots&v_i\cdot v_{i-1}\\
0&\dots&0&1&0&\dots&1
\end{matrix}
\right\vert\beta_i
$$
Since $b_i$ is only determined up to a scale factor, we are free to choose $\beta_i=\det(B)$; note that expanding $\det(B)$ along its last row gives $\det(B)=\det(v_k\cdot v_l)_{k,l\in\{1,\dots,i-1\}}$. With this choice, it is easy to confirm that
$$
\beta_j=(-1)^{i+j}\det(v_k\cdot v_l),~k\in\{1,\dots,i-1\},~l\in\{1,\dots,i\}\setminus\{j\}
$$
Substituting these $\beta_j$ back into $b_i=\sum_j\beta_jv_j$, we can now say
$$
b_i=\sum_{j=1}^i(-1)^{i+j}\det(v_k\cdot v_l)\,v_j
$$
where $k$ and $l$ range over the same index sets as above. So, we can abuse a bit of notation and write
$$
b_i=\left\vert
\begin{matrix}
v_1\cdot v_1&\dots&v_i\cdot v_1\\
\vdots&\ddots&\vdots\\
v_1\cdot v_{i-1}&\dots&v_i\cdot v_{i-1}\\
v_1&\dots&v_i
\end{matrix}
\right\vert
$$
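To check this formula numerically (again a sketch of my own), we can expand the formal determinant along its last row of vectors, $b_i=\sum_j(-1)^{i+j}\det(\operatorname{minor}_j)\,v_j$, where $\operatorname{minor}_j$ is the top $(i-1)\times i$ block with column $j$ removed, and verify the orthogonality:

```python
# Expand the formal determinant along its last row of vectors and
# verify that the resulting b_i is orthogonal to v_1, ..., v_{i-1}.
import numpy as np

rng = np.random.default_rng(1)
n = 4
v = rng.standard_normal((n, n))          # row j is v_{j+1}
G = v @ v.T                              # Gram matrix: G[k, j] = v_{k+1} . v_{j+1}

for i in range(1, n + 1):
    top = G[: i - 1, :i]                 # the numeric rows of the determinant
    b_i = np.zeros(n)
    for j in range(1, i + 1):            # 1-based j, matching the formula
        minor = np.delete(top, j - 1, axis=1)
        b_i += (-1) ** (i + j) * np.linalg.det(minor) * v[j - 1]
    assert np.allclose(v[: i - 1] @ b_i, 0.0)
print("the determinant formula gives b_i orthogonal to v_1, ..., v_{i-1}")
```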
Each $b_i$ is nonzero: its coefficient of $v_i$ is $\beta_i=\det(v_k\cdot v_l)_{k,l\in\{1,\dots,i-1\}}$, the Gram determinant of linearly independent vectors, hence positive. And, it follows that $\{b_1,\dots,b_n\}$ is an orthogonal basis for $\mathbb{R}^n$.
Now, we will use the fact that
$$
\int_{[0,1]^n}d^nx=1
$$
to find the volume of the parallelepiped spanned by $v_1,\dots,v_n$,
$$
\operatorname{Vol}(v_1,\dots,v_n)\equiv\int_{[0,v_1]\times\dots\times[0,v_n]}d^nx
$$
where $[0,v_1]\times\dots\times[0,v_n]$ denotes the parallelepiped $\left\{\sum_{i=1}^nt_iv_i:t_i\in[0,1]\right\}$.
Let $e_i=b_i/\left\vert b_i\right\vert$, where $\{b_1,\dots,b_n\}$ is the orthogonal basis constructed above. Notice that
$$
d^nk=\prod_{i=1}^nd(k\cdot e_i)
$$
This is true because the change of coordinates from $\{e_i\}$ to any other orthonormal basis is a linear isometry, whose Jacobian determinant is $\pm1$. So, by simple rescaling, we get
$$
d^nk=\prod_{i=1}^n\frac{d(k\cdot b_i)}{\left\vert b_i\right\vert}
$$
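Here is a small numerical illustration of this step (my own sketch; I use a QR factorization just to produce *some* orthonormal basis, rather than the $e_i$ above): the map $k\mapsto(k\cdot e_1,\dots,k\cdot e_n)$ is an isometry, so its Jacobian determinant has absolute value $1$.

```python
# For an orthonormal basis {e_i}, the linear map k -> (k . e_i) is an
# isometry, so |det J| = 1 and the measure d^n k is preserved.
import numpy as np

rng = np.random.default_rng(2)
n = 4
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # columns: an orthonormal basis
J = Q.T                                  # row i sends k to k . e_i
assert np.allclose(J @ J.T, np.eye(n))   # the rows are orthonormal
assert np.isclose(abs(np.linalg.det(J)), 1.0)
print("|det J| = 1 for the change of variables k -> (k . e_i)")
```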
Dotting the determinant formula for $b_i$ with $b_i$ itself (a determinant is linear in its last row), we can write that
$$
b_i^2=
\left\vert
\begin{matrix}
v_1\cdot v_1&\dots&v_i\cdot v_1\\
\vdots&\ddots&\vdots\\
v_1\cdot v_{i-1}&\dots&v_i\cdot v_{i-1}\\
b_i\cdot v_1&\dots&b_i\cdot v_i
\end{matrix}
\right\vert
$$
Because by construction $b_i\in\operatorname{Span}\{v_1,\dots,v_{i-1}\}^\perp$, we have $b_i\cdot v_j=0$ for $j<i$, so only the last entry of the bottom row survives. Dotting the formula with $v_i$ instead, notice that
$$
b_i\cdot v_i=
\left\vert
\begin{matrix}
v_1\cdot v_1&\dots&v_i\cdot v_1\\
\vdots&\ddots&\vdots\\
v_1\cdot v_{i-1}&\dots&v_i\cdot v_{i-1}\\
v_i\cdot v_1&\dots&v_i\cdot v_i
\end{matrix}
\right\vert
$$
Or, in other words, $b_i\cdot v_i=G(v_1,\dots,v_i)$, where $G$ denotes the Gramian (the Gram determinant). This means
$$
b_i^2=
\left\vert
\begin{matrix}
v_1\cdot v_1&\dots&v_i\cdot v_1\\
\vdots&\ddots&\vdots\\
v_1\cdot v_{i-1}&\dots&v_i\cdot v_{i-1}\\
0&\dots&G(v_1,\dots,v_i)
\end{matrix}
\right\vert
$$
Expanding along the last row, this gives
$$
b_i^2=
G(v_1,\dots,v_i)G(v_1,\dots,v_{i-1})
$$
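Both identities, $b_i\cdot v_i=G(v_1,\dots,v_i)$ and $b_i^2=G(v_1,\dots,v_i)G(v_1,\dots,v_{i-1})$, are easy to verify numerically (my own sketch, reusing the cofactor expansion from before):

```python
# Check b_i . v_i = G_i and |b_i|^2 = G_{i-1} G_i, where G_i is the
# determinant of the leading i x i block of the Gram matrix (G_0 = 1).
import numpy as np

rng = np.random.default_rng(3)
n = 4
v = rng.standard_normal((n, n))
G = v @ v.T

def gram_det(i):                         # G_i = G(v_1, ..., v_i); G_0 = 1
    return np.linalg.det(G[:i, :i])

def b(i):                                # b_i via the determinant formula
    top = G[: i - 1, :i]
    return sum(
        (-1) ** (i + j) * np.linalg.det(np.delete(top, j - 1, axis=1)) * v[j - 1]
        for j in range(1, i + 1)
    )

for i in range(1, n + 1):
    assert np.isclose(b(i) @ v[i - 1], gram_det(i))                 # b_i . v_i = G_i
    assert np.isclose(b(i) @ b(i), gram_det(i - 1) * gram_det(i))   # |b_i|^2 = G_{i-1} G_i
print("b_i . v_i = G_i and |b_i|^2 = G_{i-1} G_i for every i")
```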
I will now define $G_i\equiv G(v_1,\dots,v_i)$, with the convention $G_0\equiv1$. This means that
$$
d^nk=\prod_{i=1}^n\frac{d(k\cdot b_i)}{\sqrt{G_{i-1}G_i}}
$$
Notice that by reordering this product, we can simply write
$$
d^nk=\sqrt{\frac{G_0}{G_n}}\prod_{i=1}^n\frac{d(k\cdot b_i)}{G_{i-1}}=G_n^{-1/2}\prod_{i=1}^n\frac{d(k\cdot b_i)}{G_{i-1}}
$$
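This reordering is just the telescoping identity $\prod_i\sqrt{G_{i-1}G_i}=\sqrt{G_0G_n}\prod_iG_{i-1}$, which uses $G_0=1$; a short numerical check of it (my own sketch):

```python
# Verify prod_i sqrt(G_{i-1} G_i) = sqrt(G_0 G_n) * prod_i G_{i-1}
# on the Gram determinants of a random basis (with G_0 = 1).
import numpy as np

rng = np.random.default_rng(6)
n = 4
v = rng.standard_normal((n, n))
G = [np.linalg.det((v @ v.T)[:i, :i]) for i in range(n + 1)]  # G_0, ..., G_n
lhs = np.prod([np.sqrt(G[i - 1] * G[i]) for i in range(1, n + 1)])
rhs = np.sqrt(G[0] * G[n]) * np.prod(G[:n])
assert np.isclose(lhs, rhs)
print(lhs, rhs)
```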
Now, notice that
$$
\frac{\partial(k\cdot b_i)}{\partial(k\cdot v_j)}=0
$$
if $j>i$, because $b_i\in\operatorname{Span}\{v_1,\dots,v_i\}$, so $k\cdot b_i$ depends only on $k\cdot v_1,\dots,k\cdot v_i$. This means that the Jacobian matrix of $(k\cdot v_1,\dots,k\cdot v_n)\mapsto(k\cdot b_1,\dots,k\cdot b_n)$ is lower triangular. So, the Jacobian determinant is just going to be
$$
\det(J)=\prod_{i=1}^n\frac{\partial(k\cdot b_i)}{\partial(k\cdot v_i)}
$$
Notice that $k\cdot b_i=G_{i-1}(k\cdot v_i)+C$, where $C$ is a linear combination of $k\cdot v_1,\dots,k\cdot v_{i-1}$ and hence does not depend on $k\cdot v_i$ (recall that $\beta_i=\det(B)=G_{i-1}$), so we can simply write
$$
\det(J)=\prod_{i=1}^nG_{i-1}
$$
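We can confirm this numerically as well (my own sketch): assembling the coefficient matrix $J$ with $k\cdot b_i=\sum_jJ_{ij}(k\cdot v_j)$ from the cofactor expansion, its diagonal entries are $G_{i-1}$ and its determinant is $\prod_iG_{i-1}$.

```python
# Assemble J from the beta-coefficients of each b_i. Entries with j > i
# are never assigned, so J is lower triangular by construction; the real
# content is the diagonal J[i, i] = G_{i-1} and det(J) = prod G_{i-1}.
import numpy as np

rng = np.random.default_rng(4)
n = 4
v = rng.standard_normal((n, n))
G = v @ v.T
gram = [np.linalg.det(G[:i, :i]) for i in range(n)]   # G_0, ..., G_{n-1}

J = np.zeros((n, n))
for i in range(1, n + 1):
    top = G[: i - 1, :i]
    for j in range(1, i + 1):            # beta_j, the coefficient of k . v_j
        J[i - 1, j - 1] = (-1) ** (i + j) * np.linalg.det(np.delete(top, j - 1, axis=1))

assert np.allclose(J, np.tril(J))                     # lower triangular
assert np.allclose(np.diag(J), gram)                  # diagonal entries G_{i-1}
assert np.isclose(np.linalg.det(J), np.prod(gram))    # det(J) = prod G_{i-1}
print("det(J) =", np.linalg.det(J))
```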
So, we can finally write that
$$
d^nk=G_n^{-1/2}\prod_{i=1}^n\frac{G_{i-1}}{G_{i-1}}\,d(v_i\cdot k)=G_n^{-1/2}\prod_{i=1}^nd(v_i\cdot k)
$$
Integrating this identity over the parallelepiped $[0,v_1]\times\dots\times[0,v_n]$, and substituting $k=\sum_jt_jv_j$ (so that $v_i\cdot k=\sum_j(v_i\cdot v_j)t_j$, a linear change of variables with Jacobian determinant $G_n$), we obtain
$$
\operatorname{Vol}(v_1,\dots,v_n)=G_n^{-1/2}\int_{[0,v_1]\times\dots\times[0,v_n]}\prod_{i=1}^nd(v_i\cdot k)=G_n^{-1/2}\,G_n\int_{[0,1]^n}d^nt=G_n^{1/2}
$$
And, finally,
$$
\operatorname{Vol}(v_1,\dots,v_n)=G(v_1,\dots,v_n)^{1/2}
$$
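As a last sanity check (my own sketch, not part of the proof), we can estimate the volume of the parallelepiped by Monte Carlo rejection sampling and compare it with $\sqrt{G}$:

```python
# Estimate Vol(P) for P = {sum_i t_i v_i : t_i in [0, 1]} by rejection
# sampling in a bounding box, and compare with sqrt(det(Gram)).
import numpy as np

rng = np.random.default_rng(5)
n = 3
V = rng.standard_normal((n, n))          # rows are v_1, ..., v_n

# bounding box of P from its 2^n corners
corners = np.array([[(m >> i) & 1 for i in range(n)] for m in range(2 ** n)]) @ V
lo, hi = corners.min(axis=0), corners.max(axis=0)

N = 200_000
x = lo + (hi - lo) * rng.random((N, n))  # uniform samples in the box
t = np.linalg.solve(V.T, x.T).T          # coordinates of each sample in the v-basis
inside = np.all((t >= 0) & (t <= 1), axis=1)
estimate = np.prod(hi - lo) * inside.mean()

print("Monte Carlo:", estimate)                       # agrees to ~1%
print("sqrt(det G):", np.sqrt(np.linalg.det(V @ V.T)))
```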
I know the proof can be made shorter, but this was what was most intuitive for me!