
In the Random Energy Model, the Hamiltonian of a spin system $S_{N} = \{ -1,+1 \}^N$ is defined as $H_{N}(\sigma) = -\sqrt{N}X_{\sigma}$, where the $X_{\sigma}$ are i.i.d. standard normal random variables. We are interested in the behaviour of the expectation of $\Phi$, which is defined as \begin{equation*} \Phi_{\beta, N} = \frac{1}{N} \ln Z_{\beta, N}, \end{equation*} where $Z_{\beta,N}$ is the partition function \begin{equation*} Z_{\beta, N} = \frac{1}{2^N} \sum\limits_{\sigma \in S_{N}} e^{\beta \sqrt{N} X_{\sigma}}. \end{equation*}
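For concreteness, here is a minimal numerical sketch of these definitions (Python/NumPy; the function name `rem_phi` and the choice $N = 20$ are mine, purely for illustration):

```python
import numpy as np

def rem_phi(beta, N, rng):
    """One realization of Phi_{beta,N} = (1/N) ln Z_{beta,N} for the REM."""
    X = rng.standard_normal(2**N)  # i.i.d. standard normal X_sigma, one per configuration
    # Z_{beta,N} = 2^{-N} sum_sigma exp(beta sqrt(N) X_sigma); logaddexp avoids overflow
    log_Z = -N * np.log(2) + np.logaddexp.reduce(beta * np.sqrt(N) * X)
    return log_Z / N

rng = np.random.default_rng(0)
print(rem_phi(1.0, 20, rng))  # for beta = 1 below beta_c, this should be near beta^2/2 = 0.5
```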

In this formulation of the model, the expectation of $\Phi$ has a very interesting property: the law of large numbers holds up to a certain value of $\beta$, called the critical value $\beta_{c}$. Up to this point it holds that $Z_{\beta, N} \sim E(Z_{\beta, N})$, while for $\beta > \beta_{c}$ the law of large numbers breaks down. For a system of normally distributed $X_\sigma$ we have the following ground-state energy: \begin{equation} \lim_{N\to\infty} \max_{\sigma \in S_{N}} \frac{X_{\sigma}}{\sqrt{N}} = \sqrt{2\ln 2} \quad a.s. \end{equation} We want to follow the proof of Anton Bovier (Statistical Mechanics of Disordered Systems, Theorem 9.1.2) that the expectation of $\Phi$ behaves in the following way: \begin{equation*} \lim_{N\to\infty} E(\Phi_{\beta,N}) = \begin{cases} \frac{\beta^{2}}{2}, & \beta \leq \beta_{c}, \\ \frac{\beta_{c}^{2}}{2} + (\beta - \beta_{c})\beta_{c}, & \beta \geq \beta_{c}, \end{cases} \end{equation*} with $\beta_c = \sqrt{2\ln 2}$.
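As a sanity check (a sketch, not part of the proof), a small Monte Carlo estimate of $E(\Phi_{\beta,N})$ can be compared against this limit. The approach to the limit is logarithmically slow, so at accessible $N$ one only sees qualitative agreement, especially for $\beta \geq \beta_c$:

```python
import numpy as np

N, reps = 18, 20
beta_c = np.sqrt(2 * np.log(2))  # critical inverse temperature, about 1.177
rng = np.random.default_rng(1)

def phi_sample(beta):
    # one realization of Phi_{beta,N}, computed as in the sketch above
    X = rng.standard_normal(2**N)
    return (-N * np.log(2) + np.logaddexp.reduce(beta * np.sqrt(N) * X)) / N

def phi_limit(beta):
    # the claimed N -> infinity limit of E(Phi_{beta,N})
    if beta <= beta_c:
        return beta**2 / 2
    return beta_c**2 / 2 + (beta - beta_c) * beta_c

for beta in (0.5, 1.0, 1.5, 2.0):
    est = np.mean([phi_sample(beta) for _ in range(reps)])
    print(f"beta={beta:.2f}: estimate {est:.3f}, limit {phi_limit(beta):.3f}")
```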

From the previous we can derive that $E\big(\frac{d}{d\beta}\Phi\big) \leq N^{-1/2}\, E\big(\max_{\sigma \in S_{N}}X_{\sigma}\big)$ (a short derivation of this inequality is sketched after the two claims below). Bovier now concludes the following two things:

1) $E\big(\frac{d}{d\beta}\Phi\big) \leq N^{-1/2}\, E\big(\max_{\sigma \in S_{N}}X_{\sigma}\big) \leq \beta \sqrt{2\ln 2}\,(1+C/N)$, with $C$ a constant.

2) $E(\Phi) \leq \frac{\beta_{0}^2}{2} + (\beta - \beta_{0})\sqrt{2\ln 2}\,(1+C/N) = \ln 2 + (\beta - \sqrt{2\ln 2})\sqrt{2\ln 2}\,(1+C/N)$, where $\beta_0 = \sqrt{2\ln 2}$.
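For reference, the first inequality here is a one-line computation with the Gibbs average $\langle\,\cdot\,\rangle_\beta$ (a sketch of the standard argument): \begin{equation*} \frac{d}{d\beta}\Phi_{\beta,N} = \frac{1}{N}\cdot\frac{\sum_{\sigma}\sqrt{N}\,X_\sigma\,e^{\beta\sqrt{N}X_\sigma}}{\sum_{\sigma}e^{\beta\sqrt{N}X_\sigma}} = N^{-1/2}\,\langle X_\sigma\rangle_\beta \;\leq\; N^{-1/2}\max_{\sigma\in S_N}X_\sigma, \end{equation*} and taking expectations gives the bound above; my question concerns the last inequality in 1).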

I have no idea how the $\beta$ and the constant $C$ pop up in the first inequality, nor how the second expression follows from the first. Do you have any ideas?

1 Answer


The second inequality in claim (1) follows from the extreme value statistics of normal random variables (aside from the factor $\beta$, which I will discuss below). One source on this topic is Bovier's own set of lecture notes, which can be found here. What's needed is nicely summarized in this post, namely that the mean of the maximum of $n$ independent standard normal random variables is $m_n = M_n \big(1 + O((\log n)^{-1})\big)$, where $M_n^2 = \log\big(n^2 / (2 \pi \log(n^2/2\pi))\big) \le 2 \log n$.

Letting $n = |S_N| = 2^N$ and dividing by $\sqrt N$, this implies that \begin{equation} N^{-1/2} E \Big(\max_{\sigma\in S_N} X_\sigma\Big) = N^{-1/2} m_{2^N} \le \sqrt{2 \log 2} \big(1 + C/N\big). \end{equation} As you can see, the factor $\beta$ is missing. I don't see where $\beta$ would come from, as the $X_\sigma$ don't depend on $\beta$. Moreover, the presence of $\beta$ doesn't seem consistent with (2).
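The displayed bound is easy to probe numerically (a sketch; since the approach to the limit is logarithmically slow, the estimates creep up toward $\sqrt{2\log 2} \approx 1.177$ from below, consistent with the upper bound):

```python
import numpy as np

rng = np.random.default_rng(2)
target = np.sqrt(2 * np.log(2))  # sqrt(2 log 2), about 1.1774

for N in (10, 14, 18, 20):
    reps = 10
    # estimate N^{-1/2} E max of 2^N i.i.d. standard normals over a few replicates
    est = np.mean([rng.standard_normal(2**N).max() for _ in range(reps)]) / np.sqrt(N)
    print(f"N={N:2d}: N^(-1/2) E max ~ {est:.4f} (limit {target:.4f})")
```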

It is less clear to me how to get (2). By Jensen's inequality, $E \Phi_{\beta,N} \le \beta^2/2$. Thus, roughly speaking, we should have (for any $\beta_0 \ge 0$) \begin{equation} E \Phi_{\beta,N} \approx E \Phi_{\beta_0,N} + (\beta - \beta_0) \frac{d}{d\beta} E \Phi_{\beta,N}\big|_{\beta_0} \le \frac{\beta_0^2}{2} + (\beta - \beta_0) \sqrt{2 \log 2} (1 + C/N). \end{equation} (From this, you see why we wouldn't expect a $\beta$ in (1): it would lead to an extra factor of $\beta_0$ in the last term above.) However, the $\approx$ cannot simply be replaced by $\le$: since $\frac{d^2}{d\beta^2} E \Phi_{\beta,N} \ge 0$ for all $\beta \ge 0$, the first-order expansion at $\beta_0$ is a lower bound, not an upper bound. Nevertheless, the second-order term should be negligible in the limit $N \to \infty$ (I have not done the computation, so I'm not sure). In other words, we should be able to get a slightly weaker version of the inequality in (2), but this should suffice since we are taking the limit in the end.
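On reflection, there may be a cleaner route (my own observation, which I have not checked against Bovier's proof): the derivative bound in (1) holds for every $\beta$, so integrating it over $[\beta_0, \beta]$ gives an honest inequality, \begin{equation} E\Phi_{\beta,N} = E\Phi_{\beta_0,N} + \int_{\beta_0}^{\beta} \frac{d}{db}\, E\Phi_{b,N}\, db \le \frac{\beta_0^2}{2} + (\beta - \beta_0)\sqrt{2\log 2}\,(1 + C/N), \end{equation} which is exactly (2); note that with $\beta_0 = \sqrt{2\log 2}$ one has $\beta_0^2/2 = \log 2$, which recovers the second expression there.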

Ben