As in @Did's answer, we let $f(n)=(\log \log n)^{-1}$. Also, for consistency with the existing literature (and Did's answer), let $M_n=\max\{X_k;1\leq k\leq n\}$.
Theorem: If $\lim_{n\rightarrow\infty}\frac{\log n}{\log \log n}\left(\frac{M_n}{\sqrt{2\log n}}-1\right)$ exists, then it lies in $\left[-\frac{1}{4},\frac{1}{4}\right]$ almost surely.
First we state a couple of helpful lemmas from Embrechts, Klüppelberg and Mikosch's "Modelling Extremal Events". The first follows from the Borel-Cantelli lemma; a sketch of the proof of the second is on page 170, with references to the full proof.
Lemma 1 (abridged Theorem 3.5.1 in Embrechts et al.): Suppose $(u_n)$ is non-decreasing. Then $\sum_{n=1}^\infty P(X_1>u_n)<\infty$ implies that $P(M_n>u_n~~\text{i.o.})=0$.
Lemma 2 (abridged Theorem 3.5.2 in Embrechts et al.): Suppose $(u_n)$ is non-decreasing and that the following conditions hold: $P(X_1\geq u_n)\rightarrow 0$ and $nP(X_1\geq u_n)\rightarrow \infty$. Then $\sum_{n=1}^\infty P(X_1>u_n)\exp[-nP(X_1>u_n)]<\infty$ implies that $P(M_n\leq u_n~~\text{i.o.})=0$.
(i.o. here means "infinitely often").
Proof: We prove the upper bound first, then the lower bound.
For the upper bound, let $u^u_n=\sqrt{2\log n}+\frac{\sqrt{2}(\frac{1}{4}+\epsilon)\log\log n}{\sqrt{\log n}}$ where $\epsilon>0$. Using the standard approximation for the tail of a Gaussian random variable, $P(X_1>x)\approx\frac{1}{\sqrt{2\pi}x}e^{-x^2/2}$ as $x\rightarrow\infty$, we obtain:
$$\begin{array}{rcl}P(X_1>u^u_n)&\approx&\frac{\exp\left[-\log n -2(\frac{1}{4}+\epsilon)\log\log n-\mathcal{O}\left(\frac{(\log\log n)^2}{\log n}\right)\right]}{\sqrt{2\pi}\left(\sqrt{2\log n}+\mathcal{O}\left(\frac{\log\log n}{\sqrt{\log n}}\right)\right)}\\
&=&\frac{1}{C_1n(\log n)^{\frac{1}{2}+2(\frac{1}{4}+\epsilon)}}
\end{array}$$
where $C_1=C_1(n)$ is bounded above and below by positive constants; it absorbs the lower-order terms and the constants.
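(To keep the bookkeeping honest, expanding the square gives
$$\frac{(u^u_n)^2}{2}=\log n+2\left(\tfrac{1}{4}+\epsilon\right)\log\log n+\frac{\left(\tfrac{1}{4}+\epsilon\right)^2(\log\log n)^2}{\log n},$$
and since $\frac{1}{2}+2\left(\tfrac{1}{4}+\epsilon\right)=1+2\epsilon$, the bound above is of order $n^{-1}(\log n)^{-1-2\epsilon}$.)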
Using the Cauchy condensation test, one can easily check that:
$$\sum_{n=1}^\infty n^{-1}(\log n)^{-1-2\epsilon}<\infty$$
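Indeed, writing $a_n=n^{-1}(\log n)^{-1-2\epsilon}$, condensation replaces the sum by
$$\sum_{k=1}^\infty 2^k a_{2^k}=\sum_{k=1}^\infty \frac{1}{(k\log 2)^{1+2\epsilon}}<\infty,$$
which converges since $1+2\epsilon>1$.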
By Lemma 1, $P(M_n>u^u_n~~\text{i.o.})=0$. A few arithmetic manipulations yield that:
$$\tag{1}P\left(\frac{\log n}{\log \log n}\left(\frac{M_n}{\sqrt{2\log n}}-1\right)>\frac{1}{4}+\epsilon~~\text{i.o.}\right)=0$$
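(The manipulation is just a rewriting of the event: dividing $M_n>u^u_n$ through by $\sqrt{2\log n}$ gives
$$\frac{M_n}{\sqrt{2\log n}}>1+\left(\tfrac{1}{4}+\epsilon\right)\frac{\log\log n}{\log n},$$
which is equivalent to the event in (1).)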
For the lower bound, let $u^l_n=\sqrt{2\log n}-\frac{\sqrt{2}(\frac{1}{4}+\epsilon)\log\log n}{\sqrt{\log n}}$ where $\epsilon>0$. Again using the same tail approximation, we obtain:
$$\begin{array}{rcl}P(X_1>u^l_n)&\approx&\frac{\exp\left[-\log n +2(\frac{1}{4}+\epsilon)\log\log n-\mathcal{O}\left(\frac{(\log\log n)^2}{\log n}\right)\right]}{\sqrt{2\pi}\left(\sqrt{2\log n}+\mathcal{O}\left(\frac{\log\log n}{\sqrt{\log n}}\right)\right)}\\
&=&\frac{(\log n)^{2\epsilon}}{C_2n}
\end{array}$$
where $C_2=C_2(n)$ is likewise bounded above and below by positive constants; it absorbs the lower-order terms and the constants.
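(As a check on the exponent, expanding the square here gives
$$\frac{(u^l_n)^2}{2}=\log n-2\left(\tfrac{1}{4}+\epsilon\right)\log\log n+\frac{\left(\tfrac{1}{4}+\epsilon\right)^2(\log\log n)^2}{\log n},$$
so the numerator of the tail estimate is of order $n^{-1}(\log n)^{\frac{1}{2}+2\epsilon}$, and dividing by the $(\log n)^{\frac{1}{2}}$ coming from the denominator leaves $(\log n)^{2\epsilon}$.)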
Clearly $P(X_1>u^l_n)\rightarrow 0$ and $nP(X_1>u^l_n)\rightarrow\infty$, so the hypotheses of Lemma 2 are satisfied. Now,
$$P(X_1>u^l_n)\exp[-nP(X_1>u^l_n)]=\frac{(\log n)^{2\epsilon}}{C_2 ne^{(\log n)^{2\epsilon}/C_2}}$$
Again employing the Cauchy condensation test, we can show that:
$$\sum_{n=1}^\infty \frac{(\log n)^{2\epsilon}}{C_2 ne^{(\log n)^{2\epsilon}/C_2}}<\infty$$
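Indeed, writing $b_n$ for the summand, condensation gives
$$\sum_{k=1}^\infty 2^k b_{2^k}=\sum_{k=1}^\infty \frac{(k\log 2)^{2\epsilon}}{C_2\, e^{(k\log 2)^{2\epsilon}/C_2}}<\infty,$$
since the stretched exponential in the denominator eventually dominates any power of $k$.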
By Lemma 2, $P(M_n\leq u^l_n~~\text{i.o.})=0$. A few arithmetic manipulations yield that:
$$\tag{2}P\left(\frac{\log n}{\log \log n}\left(\frac{M_n}{\sqrt{2\log n}}-1\right)<-\frac{1}{4}-\epsilon~~\text{i.o.}\right)=0$$
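(Here the rewriting is the same as before:
$$M_n\leq u^l_n\iff \frac{\log n}{\log\log n}\left(\frac{M_n}{\sqrt{2\log n}}-1\right)\leq-\frac{1}{4}-\epsilon,$$
and the strict-inequality event in (2) is contained in this one.)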
Since $\epsilon>0$ was arbitrary, applying (1) and (2) along the sequence $\epsilon=1/m$, $m\in\mathbb{N}$, and intersecting the resulting almost-sure events yields the desired result.
Remark: This demonstrates a bound on the limit if the limit exists (in fact, as Did points out, this shows the $\limsup$ is $\leq \frac{1}{4}$ and the $\liminf$ is $\geq -\frac{1}{4}$). I wonder whether one can determine if the sequence in fact converges to some fixed number $C\in\left[-\frac{1}{4},\frac{1}{4}\right]$ or "bounces around" (akin to a sine wave). Perhaps Did's convergence in probability to $-\frac{1}{4}$ can somehow be leveraged here?