
Let's say you want to generate all N×N images of black/white pixels.
This is equivalent to counting from 0 to 2^(N^2) - 1.

On a classical computer, enumerating all of them is infeasible even for 32×32 = 1024 bits.
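To make the equivalence concrete, here is a small Python sketch (the function name `images` is just illustrative): the k-th image is the binary expansion of k, so generating every image is literally counting.

```python
# The k-th NxN black/white image is the binary expansion of k,
# so generating all images is counting from 0 to 2^(N^2) - 1.
def images(n):
    for k in range(2 ** (n * n)):
        bits = [(k >> i) & 1 for i in range(n * n)]
        yield [bits[r * n:(r + 1) * n] for r in range(n)]

assert sum(1 for _ in images(2)) == 2 ** 4  # 16 images for 2x2

# For 32x32 the count is 2^1024; even at 10^9 increments per second,
# the time needed dwarfs the age of the universe (~4.35e17 s).
seconds_needed = 2 ** 1024 // 10 ** 9
assert seconds_needed > 4 * 10 ** 17
```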

Can a quantum computer count up by 1 faster than a classical computer?

Bob

1 Answer


A classical computer takes less than a nanosecond to increment a 64 bit integer.

A superconducting quantum computer takes 10-100 nanoseconds to perform a CNOT gate, the reversible equivalent of an XOR gate. It takes hundreds of CNOT gates, plus other gates, to perform a 64-qubit increment. And the result will be quite noisy, because the gates have error rates of around 0.1%.
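To see why an increment needs so many gates, here is a classical simulation of the reversible increment circuit (a hedged sketch, not real quantum-SDK code): bit i flips iff all lower bits are 1, i.e. each step is a multi-controlled NOT, which itself decomposes into many CNOTs and single-qubit gates on real hardware.

```python
# Classical simulation of the reversible increment circuit.
# bits[0] is the least significant bit; increments in place mod 2^n.
def reversible_increment(bits):
    # Must go from the most significant bit down, so each control
    # still sees the pre-increment values of the lower bits.
    for i in reversed(range(len(bits))):
        if all(bits[:i]):   # stands in for a multi-controlled NOT
            bits[i] ^= 1    # flip the target bit
    return bits

assert reversible_increment([1, 1, 0]) == [0, 0, 1]  # 3 -> 4
```

The `if all(...)` is only a classical stand-in: a quantum circuit realizes that control condition with gates, not with a measurement, which is where the hundreds of CNOTs come from.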

In my paper on factoring, performing a fault-tolerant addition under superposition on 2000-qubit registers is estimated to take around 20 milliseconds. That's roughly the time it takes a typical video game to render a whole frame. For 1 addition. On a building-sized machine.

So, no, quantum computers are not faster at incrementing. They are in fact thousands to billions of times slower at incrementing. They get their advantage by doing fewer operations, not by doing the individual operations faster.
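To put a rough number on "slower" (a back-of-envelope sketch using the figures above, not a benchmark):

```python
# ~1 ns per classical 64-bit increment versus ~20 ms per
# fault-tolerant quantum addition on large registers.
classical_s = 1e-9
quantum_s = 20e-3
slowdown = quantum_s / classical_s  # on the order of 10^7
```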


Craig Gidney
  • Then how did Google perform in 200 seconds what IBM’s computer would take 10,000 years? – Bob May 12 '23 at 04:30
  • 1
    @Bob by doing fewer operations. – Craig Gidney May 12 '23 at 06:53
  • 1
@bob Because quantum and classical computers are not a 1:1 match where classical computers are just slower. Unlike NPN vs. CMOS, they do things completely differently, not just faster. And Google picked a problem that is very well suited to quantum computing and very hard for classical computers. It's like asking why a guy with a shovel is slower at digging a large hole than a meteor when the shovel guy is so much faster at getting the shovel out of the garage ;) – Max May 12 '23 at 06:59