$\def\d{\mathrm{d}}\def\R{\mathbb{R}}\def\x{\boldsymbol{x}}\def\0{\mathbf{0}}\def\abs#1{\left|#1\right|}\def\paren#1{\left(#1\right)}\def\C{\paren{\sum\limits_{\smash{j = 1}}^n C_j^2}^{\frac{1}{2}}}$This partial answer illustrates the complexity of the problem for general $n$ by deriving a formula for the content of the $(n - 1)$-dimensional body formed by $n$ points in $n$-dimensional space.
For $\x_1, \cdots, \x_n \in \R^n$ in general position, write $\x_i = (x_{i, 1}, \cdots, x_{i, n})$ for each $i$ and define$$
X = \begin{bmatrix}
\x_1 \\ \vdots \\ \x_n
\end{bmatrix} = \begin{bmatrix}
x_{1, 1} & \cdots & x_{1, n}\\
\vdots & \ddots & \vdots\\
x_{n, 1} & \cdots & x_{n, n}
\end{bmatrix}.
$$
It is well known that the content of the $n$-dimensional simplex with vertices $\0, \x_1, \cdots, \x_n$ is $V = \dfrac{1}{n!} |\det X|$, and the equation of the $(n - 1)$-dimensional hyperplane passing through $\x_1, \cdots, \x_n$ is\begin{gather*}
\det\begin{bmatrix}
\x - \x_1 \\ \x_2 - \x_1 \\ \vdots \\ \x_n - \x_1
\end{bmatrix} = \begin{vmatrix}
x_1 - x_{1, 1} & x_2 - x_{1, 2} & \cdots & x_n - x_{1, n}\\
x_{2, 1} - x_{1, 1} & x_{2, 2} - x_{1, 2} & \cdots & x_{2, n} - x_{1, n}\\
\vdots & \vdots & \ddots & \vdots\\
x_{n, 1} - x_{1, 1} & x_{n, 2} - x_{1, 2} & \cdots & x_{n, n} - x_{1, n}
\end{vmatrix} = 0. \tag{1}
\end{gather*}
Denote the coefficient of $x_j$ in (1) by $C_j$ for each $j$; then the distance from $\0$ to this hyperplane is$$
d = \frac{1}{\C} \abs{ \det\begin{bmatrix}
\0 - \x_1 \\ \x_2 - \x_1 \\ \vdots \\ \x_n - \x_1
\end{bmatrix} } = \frac{1}{\C} \abs{ \det\begin{bmatrix}
-\x_1 \\ \x_2 \\ \vdots \\ \x_n
\end{bmatrix} } = \frac{|\det X|}{\C},
$$
where the second equality follows by subtracting the first row from each of the others. Since the $n$-simplex is a cone with apex $\0$ over the $(n - 1)$-dimensional simplex with vertices $\x_1, \cdots, \x_n$, its content satisfies $V = \frac{1}{n} m d$, where $m$ is the content of that base; hence$$
m = \frac{nV}{d} = \frac{1}{(n - 1)!} \C.
$$
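As a numerical sanity check on the cone relation $m = nV/d$ and the $n!$ normalization, the quantities above can be compared against the Gram-determinant formula for the content of an $(n - 1)$-simplex. The NumPy sketch below is illustrative only; all variable names are my own.

```python
import math

import numpy as np

rng = np.random.default_rng(0)
n = 4                                   # dimension; any n >= 2 works
X = rng.standard_normal((n, n))         # rows are the points x_1, ..., x_n

# V: content of the n-simplex with vertices 0, x_1, ..., x_n.
V = abs(np.linalg.det(X)) / math.factorial(n)

# d: distance from the origin to the hyperplane through x_1, ..., x_n,
# i.e. the component of x_1 orthogonal to span(x_2 - x_1, ..., x_n - x_1).
M = X[1:] - X[0]                        # (n-1) x n matrix of edge vectors
Q, _ = np.linalg.qr(M.T)                # orthonormal basis of that span
d = np.linalg.norm(X[0] - Q @ (Q.T @ X[0]))

# m: content of the (n-1)-simplex with vertices x_1, ..., x_n,
# via the Gram-determinant formula sqrt(det(M M^T)) / (n-1)!.
m_gram = math.sqrt(np.linalg.det(M @ M.T)) / math.factorial(n - 1)

print(abs(n * V / d - m_gram))          # should be ~0: the two routes agree
```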
To simplify the expression for the $C_j$'s, note that\begin{align*}
C_1 &= \begin{vmatrix}
x_{2, 2} - x_{1, 2} & \cdots & x_{2, n} - x_{1, n}\\
\vdots & \ddots & \vdots\\
x_{n, 2} - x_{1, 2} & \cdots & x_{n, n} - x_{1, n}
\end{vmatrix}\\
&= \begin{vmatrix}
1 & x_{1, 2} & \cdots & x_{1, n}\\
0 & x_{2, 2} - x_{1, 2} & \cdots & x_{2, n} - x_{1, n}\\
\vdots & \vdots & \ddots & \vdots\\
0 & x_{n, 2} - x_{1, 2} & \cdots & x_{n, n} - x_{1, n}
\end{vmatrix}\\
&= \begin{vmatrix}
1 & x_{1, 2} & \cdots & x_{1, n}\\
1 & x_{2, 2} & \cdots & x_{2, n}\\
\vdots & \vdots & \ddots & \vdots\\
1 & x_{n, 2} & \cdots & x_{n, n}
\end{vmatrix} = \sum_{i = 1}^n A_{i, 1},
\end{align*}
where $A_{i, j}$ is the $(i, j)$-th cofactor of $X$. It can be derived analogously (although with heavier notation) that $C_j = \sum\limits_{i = 1}^n A_{i, j}$ for all $j$, thus$$
m = \frac{1}{(n - 1)!} \C = \frac{1}{(n - 1)!} \paren{ \sum_{j = 1}^n \paren{ \sum_{i = 1}^n A_{i, j} }^2 }^{\frac{1}{2}}.
$$
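The identity $C_j = \sum_i A_{i, j}$ can also be double-checked numerically: the left-hand side of (1) is affine in $\x$, so $C_j = f(\boldsymbol{e}_j) - f(\0)$, while the cofactor sums are the column sums of $\operatorname{cof}(X) = \det(X)\,(X^{-1})^{\mathsf T}$. The sketch below, with names of my own choosing, is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
X = rng.standard_normal((n, n))          # rows are the points x_1, ..., x_n

def f(x):
    """Left-hand side of the hyperplane equation (1) as a function of x."""
    return np.linalg.det(np.vstack([x - X[0], X[1:] - X[0]]))

# f is affine in x, so the coefficient C_j of x_j equals f(e_j) - f(0).
C = np.array([f(np.eye(n)[j]) - f(np.zeros(n)) for j in range(n)])

# Column sums of the cofactor matrix: cof(X) = det(X) * inv(X)^T.
C_cof = (np.linalg.det(X) * np.linalg.inv(X).T).sum(axis=0)

print(np.allclose(C, C_cof))             # the two computations of C_j agree
```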
Therefore, the expectation to be computed is$$
\frac{1}{(n - 1)! \, B_n^n} \mathop{\intop\cdots\intop}\limits_{\|\x_1\|, \cdots, \|\x_n\| \leqslant 1} \paren{ \sum_{j = 1}^n \paren{ \sum_{i = 1}^n A_{i, j}(\x_1, \cdots, \x_n) }^2 }^{\frac{1}{2}} \,\d\x_1\cdots\d\x_n,
$$
where $B_n = \dfrac{\pi^{\frac{n}{2}}}{\Gamma\paren{ \frac{n}{2} + 1 }}$ is the content of the unit $n$-ball, and $\d\x_i = \d x_{i, 1}\cdots\d x_{i, n}$ for each $i$.
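The expectation can at least be estimated by Monte Carlo. In the sketch below (the helper name `mean_content` and the sample count are my own, for illustration), the case $n = 2$ reduces to the mean distance between two uniform points in the unit disk, whose known value is $128/(45\pi) \approx 0.9054$.

```python
import math

import numpy as np

rng = np.random.default_rng(2)

def mean_content(n, samples=50_000):
    """Monte Carlo estimate of E[m], the mean content of the (n-1)-simplex
    spanned by n points drawn uniformly from the unit n-ball."""
    total = 0.0
    for _ in range(samples):
        # Rejection-sample n points uniformly from the unit n-ball.
        pts = []
        while len(pts) < n:
            p = rng.uniform(-1.0, 1.0, n)
            if p @ p <= 1.0:
                pts.append(p)
        X = np.array(pts)
        # m = ||(sum_i A_{i,j})_j|| / (n-1)!, cofactors via det(X) * inv(X)^T.
        C = (np.linalg.det(X) * np.linalg.inv(X).T).sum(axis=0)
        total += np.linalg.norm(C) / math.factorial(n - 1)
    return total / samples

est = mean_content(2)
print(est)    # close to 128/(45*pi) ≈ 0.9054
```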