
Let's say I have a random variable taking values in the space of square binary matrices, from which I can sample (adjacency matrices of) graphs, and let's say that the resulting graphs have a power-law degree distribution (in expectation).

Are the sampled graphs 'sparse' (the number of edges is $O(n)$, where $n$ is the number of nodes)?

Similarly, can I say anything about whether they are 'dense' (the number of edges is $\Theta(n^2)$)?


Edit: Following @manuellafond's comment, I see the question needs further clarification:

A power law degree distribution means that the probability $P(k)$ of a node having degree $k$ follows $P(k)=Ck^{-\gamma}$ for some $C$ and $\gamma$. If we further stipulate that $\gamma$ is the same for all graph sizes $n$, $C$ must be a function of $n$ as follows:

$C(n) = \frac{1}{\sum^{n-1}_{k=1}k^{-\gamma}}$

To be honest, I'm now not 100% sure that such a setup is even possible, but I believe it is. This question is an attempt to formalise this notion for social networks, which typically have power-law degree distributions.
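
For concreteness, here is a minimal numerical sketch (Python with numpy; the values $\gamma = 2.5$ and $n = 1000$ are arbitrary illustrative choices, not part of the question) checking that this normalisation yields a valid distribution and sampling one degree per node from it:

```python
# Minimal sketch: P(k) = C(n) * k^(-gamma) for k = 1 .. n-1, with
# C(n) = 1 / sum_{k=1}^{n-1} k^(-gamma). Parameter values are illustrative only.
import numpy as np

gamma, n = 2.5, 1000
ks = np.arange(1, n)                          # possible degrees 1 .. n-1
weights = ks.astype(float) ** (-gamma)
C = 1.0 / weights.sum()                       # normalising constant C(n)
P = C * weights

print(P.sum())                                # ~1.0, so this is a valid distribution
degrees = np.random.choice(ks, size=n, p=P)   # one sampled degree per node
print(degrees.mean())                         # empirical mean degree
```

(Sampling degrees independently like this does not by itself produce a graph, of course; it only illustrates the degree distribution.)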

drevicko
  • Does a power law degree distribution mean that the probability $P(k)$ that a given node is of degree $k$ is $P(k) = k^{-\gamma}$, for some constant $\gamma$? – Manuel Lafond Jun 11 '14 at 20:02
  • You'd need a normalising constant $P(k)=Ck^{-\gamma}$, but yes, that's right. Thanks for clarifying. – drevicko Jun 12 '14 at 08:36

1 Answer


Since no one is answering, here's what I get, assuming $\gamma$ is a constant. As you say, $C = \frac{1}{\sum_{k = 1}^{n - 1} k^{-\gamma}} = \frac{1}{H^{\gamma}_{n - 1}}$, where ${H^{\gamma}_{n - 1}}$ is the generalized harmonic number.

Let $m$ be the number of edges of your graph $G = (V, E)$. We'll treat $m$ as a random variable. Denote the degree of some vertex $v \in V$ by $d(v)$.

We'd like to find out the order of $\mathbb{E}[m]$. Using the handshake lemma,

$$\mathbb{E}[m] = \mathbb{E}\left[\frac{1}{2} \sum_{v \in V} d(v)\right] = \frac{1}{2} \sum_{v \in V} \mathbb{E}[d(v)] = \frac{1}{2} n \, \mathbb{E}[d(v)]$$ for any fixed $v \in V$, since all degrees are identically distributed.

So it all comes down to the expected degree. For some $v \in V$, we get $$\mathbb{E}[d(v)] = \sum_{k = 1}^{n - 1} k P(d(v) = k)= \sum_{k = 1}^{n - 1} k C k^{-\gamma} = C \sum_{k = 1}^{n - 1} \frac{1}{k^{\gamma - 1}} = \frac{H_{n-1}^{\gamma - 1}}{H^{\gamma}_{n-1}}$$
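
As a quick sanity check (an added sketch, not part of the derivation; numpy, with illustrative parameter values only), the expected degree and the resulting $\mathbb{E}[m]$ can be computed directly from this ratio of generalized harmonic numbers:

```python
# Sanity check: E[d(v)] = H^{gamma-1}_{n-1} / H^{gamma}_{n-1}, and E[m] = (n/2) * E[d(v)].
# Parameter values are illustrative only.
import numpy as np

def gen_harmonic(m, g):
    """Generalized harmonic number H^g_m = sum_{k=1}^{m} k^(-g)."""
    ks = np.arange(1, m + 1, dtype=float)
    return (ks ** (-g)).sum()

gamma, n = 2.5, 1000
expected_degree = gen_harmonic(n - 1, gamma - 1) / gen_harmonic(n - 1, gamma)
expected_edges = 0.5 * n * expected_degree
print(expected_degree, expected_edges)
```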

Now, ${H^{\gamma}_{n - 1}}$ is $O(n)$ when $\gamma = 0$, $O(\log n)$ when $\gamma = 1$ and $O(1)$ when $\gamma > 1$.

So if $\gamma = 1$, $n \frac{H_{n-1}^{\gamma - 1}}{H^{\gamma}_{n-1}} \in O(n \frac{n}{\log n}) = O(\frac{n^2}{\log n})$.

If $\gamma = 2$, $n \frac{H_{n-1}^{\gamma - 1}}{H^{\gamma}_{n-1}} \in O(n \log n)$.

If $\gamma > 2$, $n \frac{H_{n-1}^{\gamma - 1}}{H^{\gamma}_{n-1}} \in O(n)$.

It remains to find the order for $1 < \gamma < 2$.
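
The settled cases can at least be checked numerically (an added sketch, with illustrative values of $n$ only): the ratio of $\mathbb{E}[m]$ to the predicted order should stay roughly bounded as $n$ grows.

```python
# Numerical check of the three settled cases: for each, E[m] divided by the
# predicted order should stay roughly bounded as n grows. Illustrative n only.
import numpy as np

def H(m, g):
    """Generalized harmonic number H^g_m = sum_{k=1}^{m} k^(-g)."""
    ks = np.arange(1, m + 1, dtype=float)
    return (ks ** (-g)).sum()

def expected_edges(n, g):
    return 0.5 * n * H(n - 1, g - 1) / H(n - 1, g)

for n in (10**3, 10**4, 10**5):
    print(n,
          expected_edges(n, 1.0) / (n**2 / np.log(n)),  # gamma = 1: ~ n^2 / log n
          expected_edges(n, 2.0) / (n * np.log(n)),     # gamma = 2: ~ n log n
          expected_edges(n, 3.0) / n)                   # gamma > 2: ~ n
```

The same function can be used to explore $1 < \gamma < 2$ empirically, though of course that does not settle the order.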

  • good try, but $C$ or $\gamma$ (or both) must necessarily depend on $n$ in order to make $P(d(v)=k)$ into a probability distribution. If we fix $\gamma$, we end up with $\mathbb{E}[d(v)] = \frac{\sum k^{1-\gamma}}{\sum k^{-\gamma}}$, which I'm not sure what to do with... – drevicko Jun 12 '14 at 17:29
  • Ah sorry, I didn't see your edit. It feels like we can treat $C$ as a constant since it converges as $n$ gets larger. Specifically, $\lim_{n \rightarrow \infty} \sum_{k = 1}^{n - 1} k^{-\gamma} = \zeta(\gamma)$, the Riemann zeta function, which is known to converge to some constant whenever $\gamma > 1$. So $1/\sum_{k = 1}^{n - 1} k^{-\gamma} \rightarrow 1/\zeta(\gamma)$ should converge as well. But this might need details to fill in. – Manuel Lafond Jun 12 '14 at 18:30
  • Yes, $C(n)$ decreases monotonically, so in the bound for $\mathbb{E}[d(v)]$ we can replace $C$ by its largest value, $C(2)=1$. I've edited the answer accordingly. Thx (: – drevicko Jun 12 '14 at 21:36
  • Your edit has been rejected, it would seem... (not me!) Anyway, I still have a little doubt. That $C$ is decreasing does make $O(n \log n)$ a correct bound, but it could be less. I mean, let's say for the sake of argument that $C = 1/\log n$. It's still decreasing, but it cancels out the $\log n$ in $n \log n$, so $C$ can possibly change the order of the number of edges. I'll think about it... – Manuel Lafond Jun 12 '14 at 23:47
  • For $\gamma>2$, $\mathbb{E}[d(v)]$ converges as $n\rightarrow \infty$, so it is in $O(1)$, or have I missed something? For $\gamma=2$, it follows since $H_{n-1}\in O(\log n)$ (I'm guessing we can't do much better there). – drevicko Jun 13 '14 at 14:57
  • And for $\gamma=1$ we end up with $(n-1)/H_{n-1} \in O(n/\log n)$ (for that we need to lower bound $H_{n-1}$ with $\log n$, but I think we can, or no?). In summary, we would now have $\mathbb{E}[m] \in O(n), O(n^2\log n), O(n^2/ \log n)$ for $\gamma >2$, $\gamma=2$ and $\gamma=1$ respectively. Thoughts? – drevicko Jun 14 '14 at 10:08
  • For $\gamma = 1$, yeah. For $\gamma = 2$, I don't know where you get the extra $n$ in $n^2 \log n$ (it's more than the max. number of edges!). I think it should be $O(n \log n)$ under that reasoning. But ultimately, $E[m] = \frac{H^{\gamma - 1}_{n - 1}}{H^{\gamma}_{n - 1}} \times n$, where $H^{\gamma}_{n - 1}$ is the generalized harmonic function. It all depends on the order of the ratio $\frac{H^{\gamma - 1}_{n - 1}}{H^{\gamma}_{n - 1}}$, and I can't seem to find out about it in $O$ terms. – Manuel Lafond Jun 16 '14 at 17:00
  • Oops! You're right, an extra $n$ snuck in for $\gamma=2$. I was thinking of a potential lower bound $H^\gamma_n\ge M\log n$ for some $M$ so $1/H_n\in O(1/\log n)$, but now that I think more, that seems unlikely.. Do you agree for $\gamma\gt 2$? – drevicko Jun 17 '14 at 14:17
  • Actually I was about to ask the question and found this: http://math.stackexchange.com/questions/776335/determining-the-asymptotic-order-of-growth-of-the-generalized-harmonic-function. Commenter says $H^{\gamma}$ is $O(1)$ for $\gamma > 1$, which after thought sounds right. So all the bounds you mention in your previous comment are OK (minus the extra $^2$), and I think the case is finally closed :) – Manuel Lafond Jun 18 '14 at 02:42
  • I think the $\gamma=1$ case has a problem: $H^1_n\in O(\log n)$ does not imply $1/H^1_n\in O(1/\log n)$. To be in $O(1/\log n)$ we need an upper bound $1/H^1_n\le M \frac{1}{\log n}$ for some $M$ (and $n\gt n_0$ for some $n_0$), and so a lower bound $H^1_n \ge M' \log n$ (eg: $M'=1/M$). It's not clear that we can do that. – drevicko Jun 18 '14 at 09:57
  • Or maybe we can: http://mathhelpforum.com/calculus/48278-bounds-harmonic-series.html#post184347, http://rgmia.org/papers/v6n2/harmonic-ser.pdf and http://math.stackexchange.com/questions/620400/how-to-bound-harmonic-numbers – drevicko Jun 18 '14 at 10:03
  • Apostol's Introduction to Analytic Number Theory, Theorem 3.2 (b) (ref'd here) gives an upper bound, but not a lower one... – drevicko Jun 18 '14 at 12:55
  • From what I know, actually $H^1_n \in \Theta(\log n)$, which implies there exist constants $c$ and $d$ such that $c/\log n \leq 1/H^1_n \leq d/\log n$. So both sides of the bound should be fine. Some details here: http://math.stackexchange.com/questions/306371/simple-proof-of-showing-the-harmonic-number-h-n-theta-log-n – Manuel Lafond Jun 18 '14 at 17:18
  • Ok, so we're settled for $\gamma\ge 2$ and $\gamma = 1$. Care to add the results to the answer? (edits I make will probably not be accepted). There are probably similar results for $1\lt\gamma\lt 2$, based around an analysis of $O(1/H^\gamma_n)$ (a proof could probably be modelled on that for $\gamma=1$), but I'm satisfied with what we have already. Thx! It's been fun to put my head back in math mode for a moment :) – drevicko Jun 19 '14 at 11:05
  • I edited my answer. Yeah that was actually quite instructive. Good thing you remained persistent on the details of this question ! – Manuel Lafond Jun 19 '14 at 16:38
  • Great! You should add that it's known that $1/H^1_n\in O(1/{\log n})$, as we can find a lower bound linear in $\log n$, perhaps with references... Maybe also worth noting that finding a similar lower bound for $H^\gamma_n$ with $1\lt\gamma\lt 2$ would resolve the question for those values. Yeah, needless attention to detail is my downfall! Perhaps I should return to maths ;) (A quick numerical check of these harmonic-number facts is sketched below.) – drevicko Jun 20 '14 at 09:05
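
As a final added note (not part of the original exchange): the two facts this discussion relies on, $H^1_n \in \Theta(\log n)$ and $H^\gamma_n \rightarrow \zeta(\gamma)$ for $\gamma > 1$, are easy to see numerically. A small sketch (illustrative values of $n$ only):

```python
# Illustration of the bounds used above (illustrative n only):
# H^1_n grows like log n, while H^gamma_n with gamma > 1 levels off at zeta(gamma).
import numpy as np

def H(m, g):
    """Generalized harmonic number H^g_m = sum_{k=1}^{m} k^(-g)."""
    ks = np.arange(1, m + 1, dtype=float)
    return (ks ** (-g)).sum()

for n in (10**3, 10**5, 10**6):
    print(n,
          H(n, 1.0) / np.log(n),  # settles near 1: H^1_n is Theta(log n)
          H(n, 2.0),              # approaches pi^2/6 ~ 1.6449 = zeta(2)
          H(n, 1.5))              # approaches zeta(1.5) ~ 2.612
```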