I was trying to solve the same problem as in this question and ended up proving something stronger. However, what I've proven cannot be true, since it is known (taking the $X_k$'s to be independent $N(0, 1)$ random variables) that the expectation in the title grows without bound in $N$ (for standard normals it is of order $\sqrt{\log N}$).
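For reference, here is a quick Monte Carlo sketch of that growth for independent $N(0,1)$ variables (the trial counts and the comparison value $\sqrt{2\log N}$ are my own choices, purely illustrative):

```python
import numpy as np

# Monte Carlo estimate of E[max_k |X_k|] for N iid N(0,1) variables,
# compared against sqrt(2 log N): the expectation clearly grows with N,
# so no bound independent of N can hold.
rng = np.random.default_rng(0)
trials = 500
for N in [10, 100, 1000, 10000]:
    maxima = np.abs(rng.standard_normal((trials, N))).max(axis=1)
    print(f"N={N:6d}  E[max|X_k|] ~ {maxima.mean():.3f}   sqrt(2 log N) = {np.sqrt(2*np.log(N)):.3f}")
```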
First, I need two lemmas:
Suppose $\{Y_n\}$ is a sequence of random variables such that $|Y_n| \uparrow |Y| \in L_\psi$ (the Orlicz space for a given Orlicz function $\psi$). Then $$ \lim_n \|Y_n\|_\psi = \|Y\|_\psi. $$
Since the Orlicz norm is monotone ($|X| \le |Y|$ implies $\|X\|_\psi \le \|Y\|_\psi$), the sequence $\|Y_n\|_\psi$ is nondecreasing and bounded by $\|Y\|_\psi$, so the limit $t_0 := \lim_n \|Y_n\|_\psi$ exists and is at most $\|Y\|_\psi$. For the reverse inequality, monotone convergence (recall $\psi$ is nondecreasing and continuous) gives \begin{align} \|Y\|_\psi &= \inf\{t > 0: \mathbb{E} \psi(|Y|/t) \le 1\}\\ &= \inf\{t > 0: \lim_n \mathbb{E} \psi(|Y_n|/t) \le 1\}. \end{align} For any $t > t_0 \ge \|Y_n\|_\psi$ we have $\mathbb{E}\psi(|Y_n|/t) \le 1$ for every $n$, by monotonicity again; hence every such $t$ belongs to the last set, and so $\|Y\|_\psi \le t_0$.
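As a sanity check only, here is a small numerical sketch of the lemma on a toy example of my own ($Y$ uniform on $[0,1]$, truncations $Y_n = \min(Y, 1 - 1/n)$, $\psi_2(x) = e^{x^2} - 1$); the grid size and bisection bracket are arbitrary choices:

```python
import numpy as np

# Toy illustration of the lemma: the psi_2-norms of the truncations Y_n
# should increase to ||Y||_{psi_2}.
grid = (np.arange(200_000) + 0.5) / 200_000   # fine grid on [0,1]; E[g(Y)] ~ mean of g over the grid

def psi2_norm(cap):
    """Approximate inf{t > 0 : E[psi_2(min(Y, cap)/t)] <= 1} by bisection in t."""
    y = np.minimum(grid, cap)
    def deficit(t):                            # E[psi_2(y/t)] - 1, decreasing in t
        return np.mean(np.exp((y / t) ** 2) - 1.0) - 1.0
    lo, hi = 0.3, 3.0                          # deficit(lo) > 0 > deficit(hi) in this toy example
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if deficit(mid) > 0 else (lo, mid)
    return hi

for n in [2, 5, 10, 100]:
    print(f"n = {n:4d}   ||Y_n||_psi2 ~ {psi2_norm(1 - 1/n):.4f}")
print(f"limit       ||Y||_psi2   ~ {psi2_norm(1.0):.4f}")
```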
For any random variables $X_1, \dots, X_N$ and $p, q \in [1, \infty)$, we have $$ \left\| \left\|(X_k)_{k=1}^N\right\|_{\ell^p} \right\|_{L^q} \le \left\|\left(\|X_k\|_{L^q}\right)_{k=1}^N\right\|_{\ell^p}. $$
This is simply a computation: \begin{align} \left\| \left\|(X_k)_{k=1}^N\right\|_{\ell^p} \right\|_{L^q} &= \left(\mathbb{E}\left\|(X_k)_{k=1}^N\right\|_{\ell^p}^q\right)^{1/q}\\ &= \left(\mathbb{E}\left(\left\|(X_k)_{k=1}^N\right\|_{\ell^p}^p\right)^{q/p}\right)^{1/q}\\ &= \left[\left(\mathbb{E}\left(\left\|(X_k)_{k=1}^N\right\|_{\ell^p}^p\right)^{q/p}\right)^{p/q}\right]^{1/p}\\ &= \left[ \left\| \left\| (X_k)_{k=1}^N \right\|_{\ell^p}^p \right\|_{L^{q/p}} \right]^{1/p}\\ &= \left[ \left\| \sum_{k=1}^N |X_k|^p \right\|_{L^{q/p}} \right]^{1/p}\\ &\le \left[ \sum_{k=1}^N \left\| |X_k|^p \right\|_{L^{q/p}} \right]^{1/p}\\ &= \left[ \sum_{k=1}^N \left\| X_k \right\|_{L^q}^p \right]^{1/p}\\ &= \left\|\left(\|X_k\|_{L^q}\right)_{k=1}^N\right\|_{\ell^p}. \end{align}
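Here is a quick Monte Carlo sanity check of this inequality for one choice of exponents and distribution ($p = 2$, $q = 4$, standard normal $X_k$'s; all of these are my own choices, purely illustrative):

```python
import numpy as np

# Empirical check of || ||(X_k)||_{l^p} ||_{L^q} <= || (||X_k||_{L^q}) ||_{l^p}
# for p = 2, q = 4 and N = 50 iid standard normals.
rng = np.random.default_rng(1)
N, trials, p, q = 50, 100_000, 2, 4
X = rng.standard_normal((trials, N))

lhs = np.mean(np.linalg.norm(X, ord=p, axis=1) ** q) ** (1 / q)   # || ||(X_k)||_{l^p} ||_{L^q}
Lq_norms = (np.abs(X) ** q).mean(axis=0) ** (1 / q)               # empirical ||X_k||_{L^q} for each k
rhs = np.linalg.norm(Lq_norms, ord=p)                             # || (||X_k||_{L^q}) ||_{l^p}
print(f"LHS = {lhs:.3f}  <=  RHS = {rhs:.3f}")
```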
Now suppose $X_1, \dots, X_N$ are subgaussian random variables, and let $K = \max_k \|X_k\|_{\psi_2}$. By the moment characterization of subgaussian variables, there is a universal constant $C$ such that $$ \|X_k\|_{L^q} \le C \|X_k\|_{\psi_2} \sqrt{q} \le C K \sqrt{q} $$ for each $k$ and all $q \in [1, \infty)$. The normalized norms $N^{-1/p}\left\|(X_k)_{k=1}^N\right\|_{\ell^p}$ are the $L^p$ norms with respect to the uniform probability measure on $\{1, \dots, N\}$, so they are nondecreasing in $p$ and increase to $\max_k |X_k|$; by the first lemma, and since the factor $N^{-1/p} \to 1$ as $p \to \infty$, \begin{align} \left\|\max_k |X_k|\right\|_{\psi_2} = \lim_{p \to \infty} \left\| \left\|(X_k)_{k=1}^N\right\|_{\ell^p} \right\|_{\psi_2}. \end{align} By the second lemma, the moments of $\left\|(X_k)_{k=1}^N\right\|_{\ell^p}$ are bounded by $$ \left\| \left\|(X_k)_{k=1}^N\right\|_{\ell^p} \right\|_{L^q} \le \left\| \left(\|X_k\|_{L^q}\right)_{k=1}^N \right\|_{\ell^p} \le CK \sqrt{q}\, N^{1/p}. $$ Hence $\left\| \left\|(X_k)_{k=1}^N\right\|_{\ell^p} \right\|_{\psi_2}$ is at most a constant factor, say $C_0$, times $CKN^{1/p}$ (by the moment characterization of subgaussian variables again). Combining this with the limit above and letting $p \to \infty$ (so that $N^{1/p} \to 1$), we get $$ \left\|\max_k |X_k|\right\|_{\psi_2} \le C_0 C K. $$ In particular, taking $q = 1$ in the moment bound, $$ \left\|\max_k |X_k|\right\|_{L^1} \le C_0 C^2 K, $$ a bound that does not depend on $N$.
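To make the limiting step above concrete, here is a small numerical illustration on a single random sample (the value of $N$ and the $p$-grid are arbitrary choices of mine) of the normalized $\ell^p$ norms $N^{-1/p}\|(X_k)_{k=1}^N\|_{\ell^p}$ increasing to $\max_k |X_k|$:

```python
import numpy as np

# For one fixed sample of N absolute values of standard normals,
# N^(-1/p) * ||x||_{l^p} = (mean of |x_k|^p)^(1/p) increases to max_k |x_k| as p grows.
rng = np.random.default_rng(2)
N = 1000
x = np.abs(rng.standard_normal(N))
for p in [1, 2, 4, 8, 16, 32, 64]:
    normalized = np.mean(x ** p) ** (1 / p)    # N^(-1/p) * ||x||_{l^p}
    print(f"p = {p:3d}   N^(-1/p) ||x||_p = {normalized:.4f}")
print(f"max_k |x_k|             = {x.max():.4f}")
```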