5

I want to simulate an arbitrary isolated quantum circuit acting on $n$ qubits (i.e. a pure state of $n$ qubits).

As far as I know, RAM is the bottleneck for quantum simulators, so consider a "normal" computer to have between $4$ and $8$ GiB of RAM; all other components are assumed to be powerful enough not to be the bottleneck.

With this definition of a "normal" computer: what is the maximum value of $n$ (the number of qubits) for which an arbitrary quantum circuit can be simulated in a reasonable time ($<1\,\text{h}$) on a normal computer with freely accessible simulators?

Sanchayan Dutta
Adrien Suau

2 Answers

5

This answer doesn't directly answer the question (I have little experience with real simulators and their practical overheads, etc.), but here's a theoretical upper bound.

Let's assume that you need to store the whole state vector of $n$ qubits in memory. That is $2^n$ complex amplitudes. A complex number requires 2 real numbers, and a standalone real number occupies 24 bytes in Python, so each amplitude costs 48 bytes. Let's say we want to cram this into $4\times 10^9$ bytes of RAM (leaving a little over for your operating system etc.). Hence, $$ 48\times 2^n\leq 4\times 10^9. $$ Rearranging for $n$ gives $n\leq 26$ qubits.
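
To make the arithmetic concrete, here is a quick back-of-the-envelope check (a sketch; the 16-bytes-per-amplitude variant assumes a packed `complex128` array rather than boxed Python floats, as discussed in the comments below):

```python
import math

ram = 4 * 10**9   # 4 GB of RAM, as assumed above

# 2 reals per amplitude x 24 bytes per boxed Python float = 48 bytes:
n_boxed = int(math.log2(ram / 48))     # -> 26

# With packed doubles (e.g. a NumPy complex128 array, 16 bytes/amplitude):
n_packed = int(math.log2(ram / 16))    # -> 27

print(n_boxed, n_packed)
```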

Note that applying gates in a quantum circuit is relatively inexpensive memory-wise. See the "Efficiency Improvements" section in this answer. From that strategy, one should be able to estimate the time it takes to apply a single one- or two-qubit gate to an $n$-qubit system, and hence how many gates you might expect to fit within some time limit (an hour is very modest, but would certainly serve for illustrative purposes).
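
As a rough illustration of that strategy, here is a minimal NumPy sketch of applying a one-qubit gate directly to the state vector (the function name and the indexing convention, with qubit $q$ as the $q$-th most significant index bit, are my own choices for illustration; the full $2^n\times 2^n$ circuit unitary is never built):

```python
import numpy as np

def apply_single_qubit_gate(state, gate, q, n):
    """Apply a 2x2 gate to qubit q of an n-qubit state vector in O(2^n) time."""
    # View the state as a (2^q, 2, 2^(n-q-1)) tensor; axis 1 is qubit q.
    psi = state.reshape(2**q, 2, 2**(n - q - 1))
    # Contract the gate into that axis; peak memory is about twice the
    # state vector, never the 2^n x 2^n matrix.
    return np.einsum('ab,ibj->iaj', gate, psi).reshape(-1)

# Example: Hadamard on qubit 0 of |000>.
n = 3
state = np.zeros(2**n, dtype=np.complex128)
state[0] = 1.0
H = np.array([[1, 1], [1, -1]], dtype=np.complex128) / np.sqrt(2)
state = apply_single_qubit_gate(state, H, 0, n)
print(state)  # amplitude 1/sqrt(2) on |000> and |100>
```

Timing a few such calls at $n=26$ gives a direct estimate of how many gates fit into an hour.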

DaftWullie
  • I'd say that's a fairly accurate estimate: all the memory you need is basically what is used to store the state. – Norbert Schuch Jul 24 '18 at 21:10
  • Where do those 24 bytes come from, given that a usual double has 64 bits? – Norbert Schuch Aug 15 '18 at 22:24
  • @NorbertSchuch I have no idea! – DaftWullie Aug 16 '18 at 05:29
  • Could it be that it's just not true? – Norbert Schuch Aug 17 '18 at 23:58
  • @NorbertSchuch it does seem to be what is consistently claimed online for recent versions of python. There’s nothing wrong with it using 3 times as much memory as you might hope. It’s not like it’s as weird as 24 bits would be. – DaftWullie Aug 18 '18 at 05:12
  • https://stackoverflow.com/questions/9395758/how-much-memory-is-used-by-a-numpy-ndarray -- ??? – Norbert Schuch Aug 18 '18 at 11:36
  • @NorbertSchuch so presumably the add-ons can build in more efficient data structures that I didn’t take into account. In which case, you can probably add 1 to the bound. – DaftWullie Aug 25 '18 at 17:08
  • I rather suspect that ONE double takes 24 bytes -- 8 bytes + some control structure -- while a VECTOR of doubles takes 8 bytes per double + some extra control structure. I very much doubt it has to do with numpy, I'd rather suspect the same is true for vectors/matrices of doubles in regular python. In any case, obviously all of these factors don't really matter much (doesn't hurt to get them correct though). – Norbert Schuch Aug 25 '18 at 23:00
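
(For reference, the sizes discussed in this comment thread are easy to check directly; the snippet below assumes a 64-bit CPython build with NumPy installed.)

```python
import sys
import numpy as np

# A standalone CPython float is a boxed object: 8 bytes of payload
# plus the object header, 24 bytes in total on a typical 64-bit build.
print(sys.getsizeof(1.0))                        # 24

# A NumPy array packs raw doubles: 8 bytes each, one fixed header overall.
print(np.zeros(1000, dtype=np.float64).nbytes)   # 8000
```
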
-1

Along with the time constraints, as Craig mentioned, you also need to specify how accurate the simulation must be and which gates it should support. CHP (CNOT, Hadamard, Phase) simulators can handle incredibly large circuits on large numbers of qubits incredibly quickly; however, they only allow a restricted gate set, so some gates, such as T gates, must be approximated.
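
To see why stabilizer (CHP) simulation scales so well: the state is represented by a tableau of $O(n^2)$ bits rather than $2^n$ amplitudes. Here is a minimal sketch of that representation (phase bits are omitted for brevity, and a real CHP simulator also supports the phase gate and measurement; this is an illustration of the idea, not a full implementation):

```python
import numpy as np

def identity_tableau(n):
    """Stabilizer generators of |0...0>: Z_1, ..., Z_n (one row each)."""
    x = np.zeros((n, n), dtype=np.uint8)  # X parts of the generators
    z = np.eye(n, dtype=np.uint8)         # Z parts of the generators
    return x, z

def apply_h(x, z, q):
    """Hadamard on qubit q exchanges its X and Z columns."""
    x[:, q], z[:, q] = z[:, q].copy(), x[:, q].copy()

def apply_cnot(x, z, c, t):
    """CNOT (control c, target t): X_c -> X_c X_t and Z_t -> Z_c Z_t."""
    x[:, t] ^= x[:, c]
    z[:, c] ^= z[:, t]

# A 1000-qubit GHZ state: hopeless for a state-vector simulator,
# instant here because the tableau is only ~2 n^2 bits.
n = 1000
x, z = identity_tableau(n)
apply_h(x, z, 0)
for t in range(1, n):
    apply_cnot(x, z, 0, t)
```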

Other simulators exist (such as quantumsim and others) which store full density matrices and are therefore much more limited in the number of qubits they can handle, since they must store a $2^n \times 2^n$ matrix.
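
For comparison, running the same back-of-the-envelope arithmetic as in the answer above (16 bytes per packed complex entry, $4\times 10^9$ bytes of RAM) shows the cost of the density-matrix representation; a small sketch:

```python
import math

ram = 4 * 10**9        # 4 GB of RAM, as before
bytes_per_entry = 16   # one packed complex double (e.g. NumPy complex128)

# State vector: 2^n entries.  Density matrix: 2^n x 2^n = 4^n entries.
n_state = int(math.log2(ram / bytes_per_entry))        # -> 27
n_density = int(math.log2(ram / bytes_per_entry) / 2)  # -> 13

print(n_state, n_density)
```

So a density-matrix simulator in the same RAM tops out at roughly half the number of qubits, around 13.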

Dripto Debroy