I needed something along those lines recently and didn't have a specific reference to cite, so here is a proof of a self-contained statement implying yours.
Theorem. Let $X_1,...,X_n$ be $\sigma^2$-subgaussian random variables with mean zero. Then
$$
\mathbb{E}[\max_{1\leq i\leq n} X_i] \leq \sqrt{2\sigma^2\log n} \tag{1}
$$
and, for every $t>0$,
$$
\mathbb{P}\!\left\{\max_{1\leq i\leq n} X_i \geq \sqrt{2\sigma^2(\log n + t)}\right\} \leq e^{-t}\,. \tag{2}
$$
Proof.
The first part is quite standard: by Jensen's inequality, monotonicity of $\exp$, and $\sigma^2$-subgaussianity, we have, for every $\lambda > 0$,
$$
e^{\lambda \mathbb{E}[\max_{1\leq i\leq n} X_i]}
\leq \mathbb{E}e^{\lambda \max_{1\leq i\leq n} X_i}
= \mathbb{E} \Big[ \max_{1 \leq i \leq n} e^{\lambda X_i} \Big]
\leq \mathbb{E} \left[ \sum_{i=1}^n e^{\lambda X_i} \right]
= \sum_{i=1}^n\mathbb{E}e^{\lambda X_i}
\leq n e^{\frac{\sigma^2\lambda^2}{2}}
$$
so, taking logarithms and dividing by $\lambda$ (this is where $\lambda > 0$ is needed), we get
$$
\mathbb{E}[\max_{1\leq i\leq n} X_i] \leq \frac{\log n}{\lambda} + \frac{\lambda \sigma^2}{2}\,.
$$
Choosing $\lambda := \sqrt{\frac{2\log n}{\sigma^2}}$ proves (1). (For $n=1$ the claim is immediate, since $\mathbb{E}[X_1] = 0$.)
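As a quick numerical sanity check of (1) (an illustration, of course, not a proof), here is a small Monte Carlo estimate using i.i.d. $N(0,\sigma^2)$ variables, which are $\sigma^2$-subgaussian; the empirical mean of the maximum should sit below $\sqrt{2\sigma^2\log n}$:

```python
# Monte Carlo sanity check of bound (1) for i.i.d. N(0, sigma^2) variables,
# which are sigma^2-subgaussian. Parameter values are arbitrary.
import math
import random

random.seed(0)
n, sigma, trials = 1000, 1.0, 2000

# Empirical estimate of E[max_{1<=i<=n} X_i], averaged over independent trials
est = sum(max(random.gauss(0.0, sigma) for _ in range(n))
          for _ in range(trials)) / trials

bound = math.sqrt(2 * sigma**2 * math.log(n))  # right-hand side of (1)
print(f"E[max] ~ {est:.3f}  <=  bound {bound:.3f}")
```

For Gaussians the bound is known to be tight up to lower-order terms, and indeed the estimate lands not far below $\sqrt{2\log n} \approx 3.72$ here.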
Turning to (2), let $u := \sqrt{2\sigma^2(\log n + t)}$. We have
$$
\mathbb{P}\{ \max_{1\leq i\leq n} X_i \geq u \}
= \mathbb{P}\{ \exists i,\; X_i \geq u \}
\leq \sum_{i=1}^n \mathbb{P}\{ X_i \geq u \}
\leq n e^{-\frac{u^2}{2\sigma^2}}
= e^{-t}
$$
where the second inequality is the standard subgaussian tail bound $\mathbb{P}\{X_i \geq u\} \leq e^{-u^2/(2\sigma^2)}$, and the last equality follows from our choice of $u$.
$\square$
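Again as a numerical illustration rather than a proof, the tail bound (2) can be checked by simulation with i.i.d. standard Gaussians; the empirical frequency of the event should stay below $e^{-t}$ (here it is in fact much smaller, since the union bound is loose for independent variables):

```python
# Monte Carlo sanity check of tail bound (2): the empirical frequency of
# {max_i X_i >= sqrt(2*sigma^2*(log n + t))} should not exceed e^{-t}
# (up to simulation noise). Parameter values are arbitrary.
import math
import random

random.seed(1)
n, sigma, t, trials = 100, 1.0, 1.0, 20000
thresh = math.sqrt(2 * sigma**2 * (math.log(n) + t))

# Count trials where the maximum exceeds the threshold
hits = sum(max(random.gauss(0.0, sigma) for _ in range(n)) >= thresh
           for _ in range(trials))
freq = hits / trials
print(f"P(max >= {thresh:.3f}) ~ {freq:.4f}  <=  e^-t = {math.exp(-t):.4f}")
```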
Here is an immediate corollary:
Corollary. Let $X_1,...,X_n$ be $\sigma^2$-subgaussian random variables with mean zero. Then, for every $u>0$,
$$
\mathbb{P}\!\left\{\max_{1\leq i\leq n} X_i \geq \sqrt{2\sigma^2\log n}+ u \right\} \leq e^{-\frac{u^2}{2\sigma^2}}\,. \tag{3}
$$
Proof. For any $u>0$,
$$
\mathbb{P}\!\left\{\max_{1\leq i\leq n} X_i \geq \sqrt{2\sigma^2\log n} + u \right\}
\leq e^{-\frac{u^2}{2\sigma^2} - u\sqrt{\frac{2\log n}{\sigma^2}}}
\leq e^{-\frac{u^2}{2\sigma^2}},
$$
the first inequality by applying (2) with $t := \frac{u^2}{2\sigma^2} + u\sqrt{\frac{2\log n}{\sigma^2}}$, for which $\sqrt{2\sigma^2(\log n + t)} = \sqrt{2\sigma^2\log n} + u$, and the second since $e^{-u\sqrt{2\log n/\sigma^2}} \leq 1$. $\square$
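The algebra behind this choice of $t$ can be spot-checked numerically (a sketch; the parameter values below are arbitrary):

```python
# Numeric spot-check of the identity used in the corollary's proof: with
# t = u^2/(2 sigma^2) + u*sqrt(2 log(n)/sigma^2), we have
# sqrt(2 sigma^2 (log n + t)) = sqrt(2 sigma^2 log n) + u.
import math

for n, sigma, u in [(10, 1.0, 0.5), (1000, 2.0, 3.0), (5, 0.3, 1.7)]:
    t = u**2 / (2 * sigma**2) + u * math.sqrt(2 * math.log(n) / sigma**2)
    lhs = math.sqrt(2 * sigma**2 * (math.log(n) + t))
    rhs = math.sqrt(2 * sigma**2 * math.log(n)) + u
    assert abs(lhs - rhs) < 1e-9, (lhs, rhs)
print("identity checked")
```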
For more in this direction, see the slides of lecture notes by John Duchi.