I'm interested in all four combinations: directed and undirected, cyclic and acyclic.
I'm having trouble calculating how quickly the complexity grows as you add nodes to a graph. Clearly, the number of possible graphs goes up once you allow directed edges, and wildly so (adding roughly Ω(2^n) complexity).
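For the labeled case the exact counts are standard and easy to compute: each of the C(n,2) vertex pairs in an undirected simple graph is either an edge or not, and each of the n(n-1) ordered pairs in a simple digraph (no self-loops, 2-cycles allowed) is either an arc or not. A quick sketch (function names are mine):

```python
from math import comb

def undirected_count(n):
    """Labeled simple undirected graphs on n vertices:
    each of the C(n,2) pairs is independently an edge or not."""
    return 2 ** comb(n, 2)

def directed_count(n):
    """Labeled simple digraphs on n vertices (no self-loops):
    each of the n(n-1) ordered pairs is independently an arc or not."""
    return 2 ** (n * (n - 1))

for n in range(1, 6):
    print(n, undirected_count(n), directed_count(n))
```

Note the ratio directed/undirected is 2^(n(n-1)/2), which grows much faster than 2^n.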
My best guess for DAGs is something close to Ω(n!).
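The number of labeled DAGs can be computed exactly with Robinson's recurrence, which does inclusion-exclusion over the k vertices chosen to have no incoming arc. A sketch:

```python
from math import comb
from functools import lru_cache

@lru_cache(maxsize=None)
def dag_count(n):
    """Labeled DAGs on n vertices via Robinson's recurrence:
    a(n) = sum_{k=1}^{n} (-1)^(k+1) * C(n,k) * 2^(k(n-k)) * a(n-k),
    where k ranges over the vertices picked to be in-degree-0 sources."""
    if n == 0:
        return 1
    return sum((-1) ** (k + 1) * comb(n, k) * 2 ** (k * (n - k)) * dag_count(n - k)
               for k in range(1, n + 1))

print([dag_count(n) for n in range(6)])  # [1, 1, 3, 25, 543, 29281]
```

So the count is super-exponential but the exact growth rate is 2^(Θ(n²)), since a DAG is a subset of the C(n,2) forward arcs of some vertex ordering.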
This question concerns knowledge representation. How many neural networks are there with n neurons? Since different knowledge must be encoded differently, the count says something about how knowledge can scale in the brain.
[Edit: "multigraphs", obviously, aren't part of the question; disconnected graphs should count as their lower-order counterparts; and vertices are labeled, so v1 is distinct from v2 and a set V containing both has 3 DAGs.]
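Those conventions (labeled vertices, no self-loops, no multi-edges) can be checked by brute force for tiny n; a graph is a DAG iff some topological order exists. A sketch:

```python
from itertools import permutations, product

def count_dags_bruteforce(n):
    """Enumerate every subset of the n(n-1) possible arcs
    (no self-loops, no multi-edges) and count the acyclic ones."""
    arcs = [(u, v) for u in range(n) for v in range(n) if u != v]
    total = 0
    for mask in product([0, 1], repeat=len(arcs)):
        edges = [a for a, bit in zip(arcs, mask) if bit]
        # acyclic iff some ordering puts every arc forward
        if any(all(order.index(u) < order.index(v) for u, v in edges)
               for order in permutations(range(n))):
            total += 1
    return total

print(count_dags_bruteforce(2))  # 3: no edge, v1->v2, v2->v1
print(count_dags_bruteforce(3))  # 25
```

This confirms the 3 DAGs on {v1, v2} mentioned above; it is only feasible for very small n, since there are 2^(n(n-1)) subsets to check.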
[Edit2: Looks like for DCGs, it is about 2^(3n). For DAGs, it's about 2^(2n).]
[Note: I tagged this under "descriptive complexity" because it's not really a simulation. Let me know if this is wrong.]