The necessary condition you stated is also sufficient. Indeed, suppose
$A\in SL_n({\mathbb R})$ is diagonalizable over $\mathbb C$ and all its
eigenvalues have modulus $1$.
We need the following fact.
Kernel invariance property: Let $L/K$ be any extension of fields, and
let $B$ be an $n\times n$ matrix with coefficients in $K$. Then there is a basis
of ${\sf Ker}_{K^n}(B)$, with vectors in $K^n$, that also serves as
a basis of ${\sf Ker}_{L^n}(B)$ (in particular, the two kernels have the same
dimension).
This invariance property follows from Gauss's method of reduction to
row echelon form: the computations are identical in $K^n$ and in $L^n$, and
since the method only involves additions, subtractions, multiplications and
divisions, if the coefficients of $B$ are in $K$, all the computations stay in $K$.
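As a sanity check (not part of the proof), one can compare the rank computed by exact Gaussian elimination over $\mathbb Q$ with the rank of the same matrix viewed over $\mathbb C$; the matrix `B` below is an arbitrary illustrative choice.

```python
from fractions import Fraction
import numpy as np

def rank_exact(rows):
    """Rank via Gaussian elimination with Fraction entries.
    Only +, -, *, / are used, so every intermediate value stays in Q."""
    rows = [list(r) for r in rows]
    rank, ncols = 0, len(rows[0])
    for col in range(ncols):
        piv = next((r for r in range(rank, len(rows)) if rows[r][col] != 0), None)
        if piv is None:
            continue
        rows[rank], rows[piv] = rows[piv], rows[rank]
        p = rows[rank][col]
        for r in range(len(rows)):
            if r != rank and rows[r][col] != 0:
                f = rows[r][col] / p
                rows[r] = [a - f * b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return rank

B = [[Fraction(1), Fraction(2), Fraction(3)],
     [Fraction(2), Fraction(4), Fraction(6)],   # = 2 * first row
     [Fraction(0), Fraction(1), Fraction(1)]]

rank_over_Q = rank_exact(B)
rank_over_C = np.linalg.matrix_rank(np.array(B, dtype=complex))
# The ranks (hence the kernel dimensions n - rank) agree over Q and over C.
print(rank_over_Q, rank_over_C)  # 2 2
```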
Denote the real eigenvalues of $A$ by $\lambda_1,\lambda_2, \ldots ,\lambda_r$
and its non-real eigenvalues by $\alpha_1\pm i\beta_1, \alpha_2\pm i\beta_2, \ldots, \alpha_s\pm i\beta_s$
(so $r+2s=n$). Since every eigenvalue has modulus $1$, each $\lambda_j$ is $\pm 1$ and each $\alpha_j^2+\beta_j^2$ equals $1$.
Since $A$ is diagonalizable over $\mathbb C$, there is a basis
$(b_1,b_2,\ldots ,b_n)$ of ${\mathbb C}^n$ with
$Ab_{2j-1}=(\alpha_j-i\beta_j)b_{2j-1}$ and
$Ab_{2j}=(\alpha_j+i\beta_j)b_{2j}$ for $1\leq j\leq s$,
and $Ab_k=\lambda_{k-2s}b_k$ for $2s < k \leq n$. For every $j$ we can write $b_{2j}=x_j+iy_j$, where $x_j$ and $y_j$ have real
coordinates. As $A$ has real coefficients, we see that for every $j$ we have
$$
Ax_j=\alpha_j x_j-\beta_j y_j,\qquad Ay_j=\beta_j x_j+\alpha_j y_j. \tag{1}
$$
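Relation (1) is easy to check numerically; here is a small illustrative example (the $2\times 2$ matrix below is a hypothetical choice, a rotation with eigenvalues $\alpha\pm i\beta$):

```python
import numpy as np

theta = 0.7
alpha, beta = np.cos(theta), np.sin(theta)
A = np.array([[alpha, beta],
              [-beta, alpha]])   # eigenvalues alpha +/- i*beta, modulus 1

# Pick the complex eigenvector b = x + i*y for the eigenvalue alpha + i*beta.
eigvals, eigvecs = np.linalg.eig(A)
j = np.argmin(np.abs(eigvals - (alpha + 1j * beta)))
b = eigvecs[:, j]
x, y = b.real, b.imag

# Relation (1): A x = alpha*x - beta*y and A y = beta*x + alpha*y.
print(np.allclose(A @ x, alpha * x - beta * y))  # True
print(np.allclose(A @ y, beta * x + alpha * y))  # True
```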
By the kernel invariance property (applied to $B=A-\lambda I$ for each real
eigenvalue $\lambda$), we may assume without loss of generality that the $b_k$ ($2s < k\leq n$) have real coordinates.
Let $\alpha+i\beta$ be a non-real eigenvalue of $A$ (so $\beta\neq 0$). We shall denote by $K_{\mathbb R}$ (resp. $K_{\mathbb C}$) the kernel of $(A-\alpha I)^2+\beta^2 I$ in
${\mathbb R}^n$ (resp. ${\mathbb C}^n$). By construction, $K_{\mathbb R}$ and
$K_{\mathbb C}$ are invariant under $A$. Let $W$ be a subspace of
$K_{\mathbb R}$ satisfying $W \cap AW=\lbrace 0 \rbrace$, and let $V=W \oplus AW$. Then
$V$ is also invariant under $A$. Suppose that $V \neq K_{\mathbb R}$; then there is a vector $x$ in $K_{\mathbb R}$ that is not in $V$. Let $V'={\sf span}(x)\oplus V$. I claim that $Ax\not\in V'$. For otherwise, we would have some $\gamma\in{\mathbb R}$
such that $t_1=Ax-\gamma x \in V$. Then $t_2=A(Ax-\gamma x) \in V$; since
$(A-\alpha I)^2+\beta^2 I$ vanishes on $K_{\mathbb R}$ and $\alpha^2+\beta^2=1$, we have $A^2x=2\alpha Ax-x$, so $t_2=(2\alpha)Ax-x-\gamma Ax$. Then $t_3=t_2+(\gamma-2\alpha)t_1 \in V$, but $t_3=-((\gamma-\alpha)^2+\beta^2)x$. Since $\beta\neq 0$, the coefficient is nonzero, so this would imply $x\in V$, a contradiction.
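The $t_1,t_2,t_3$ manipulation can be checked numerically on a model case; below, a hypothetical $2\times 2$ rotation plays the role of $A$ restricted to $K_{\mathbb R}$ (there $K_{\mathbb R}={\mathbb R}^2$), and $\gamma$, $x$ are arbitrary choices:

```python
import numpy as np

theta, gamma = 0.7, 0.3          # gamma: the arbitrary real scalar in the proof
alpha, beta = np.cos(theta), np.sin(theta)
A = np.array([[alpha, beta],
              [-beta, alpha]])
I = np.eye(2)

# On K_R, (A - alpha*I)^2 + beta^2*I = 0 and alpha^2 + beta^2 = 1 give A^2 = 2*alpha*A - I.
assert np.allclose(A @ A, 2 * alpha * A - I)

x = np.array([1.0, 2.0])          # any vector of K_R
t1 = A @ x - gamma * x
t2 = A @ t1                       # = (2*alpha)Ax - x - gamma*Ax
t3 = t2 + (gamma - 2 * alpha) * t1

# t3 collapses to a nonzero multiple of x.
print(np.allclose(t3, -((gamma - alpha) ** 2 + beta ** 2) * x))  # True
```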
All this shows that $W'=W\oplus {\sf span}(x)$ still satisfies $W' \cap AW'=\lbrace 0 \rbrace$. So if we take a $W$ of maximal dimension with this property, we must have
$K_{\mathbb R}=W \oplus AW$. Let $(x^1,x^2,\ldots ,x^t)$ be a basis (in ${\mathbb R}^n$)
of $W$, and put $y^i=\frac{\alpha x^i-Ax^i}{\beta}$. Since $Ax^i=\alpha x^i-\beta y^i$,
the family $(x^1,\ldots,x^t,y^1,\ldots,y^t)$ spans $W+AW$ and is therefore a basis
of $K_{\mathbb R}=W\oplus AW$, while
$(x^1\pm i y^1,x^2\pm i y^2, \ldots, x^t\pm i y^t)$ forms a basis of
$K_{\mathbb C}$. So, in (1) we may assume without loss of generality that the family
$(x_j,y_j)_{1\leq j \leq s}$ is linearly independent.
Taken together, the $b_k$, $x_j$, $y_j$ form a basis of ${\mathbb R}^n$, and in this basis $A$ acts by the $2\times 2$ blocks $\begin{pmatrix}\alpha_j & \beta_j\\ -\beta_j & \alpha_j\end{pmatrix}$ and the scalars $\pm 1$, all of which are orthogonal. Hence the scalar product for which this basis is orthonormal is preserved by $A$, as required.
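The invariant scalar product can be exhibited concretely: if $S$ has the adapted basis as columns, the Gram matrix $G=(SS^T)^{-1}$ makes that basis orthonormal, and $A^TGA=G$. A small hypothetical $3\times 3$ example (rotation block plus the real eigenvalue $+1$, so $\det A=1$; `S` is an arbitrary invertible change of basis):

```python
import numpy as np

theta = 0.7
alpha, beta = np.cos(theta), np.sin(theta)
# Block-diagonal model of A in the adapted basis (x_1, y_1, b_3).
B = np.array([[alpha, beta, 0.0],
              [-beta, alpha, 0.0],
              [ 0.0,  0.0,   1.0]])

# Hypothetical change of basis: the columns of S play the role of the adapted basis.
S = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 3.0]])
A = S @ B @ np.linalg.inv(S)

# Gram matrix of the scalar product making the columns of S orthonormal:
# <u, v> = u^T G v with G = (S S^T)^{-1}.
G = np.linalg.inv(S @ S.T)

# A preserves this scalar product: A^T G A = G.
print(np.allclose(A.T @ G @ A, G))  # True
```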