What makes a law? Tendencies, theories, and hypotheses all attempt to predict the future from past behavior, but only sufficient experience moves a prediction solidly into the realm of “law.”
Experience means more than thought or theory. It is the confidence that comes from relying on data and devising a formula that says: all things being equal (and there is no reason to suspect they won’t be), this formula will continue to hold, at least in the near term. One example rich with experience is Moore’s Law.
In 1965, Gordon Moore, then at Fairchild Semiconductor and later a co-founder of Intel, observed an emerging trend and concluded that computing would dramatically increase in power, and decrease in relative cost, at an exponential pace. Moore’s initial statements were economic ones: with the new integrated circuits, “the cost per component is nearly inversely proportional to the number of components,” and he predicted that the number of components on an integrated circuit would double every year, reaching 65,000 by 1975. When the prediction proved correct in 1975, Moore revised it, to what has since become known as Moore’s Law: a doubling of transistors on a chip every two years. Advanced chips currently boast some 65 billion transistors.
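The arithmetic behind Moore’s 1965 prediction is simple compounding. A minimal sketch (the 1965 baseline of 64 components is an assumed illustrative figure, since ten yearly doublings from 64 lands almost exactly on 65,000):

```python
def components(start_count: int, start_year: int, year: int,
               doubling_years: float = 1.0) -> int:
    """Component count after repeated doubling every `doubling_years` years."""
    doublings = (year - start_year) / doubling_years
    return int(start_count * 2 ** doublings)

# Original 1965 prediction: yearly doubling from ~64 components.
print(components(64, 1965, 1975))        # 64 * 2**10 = 65536, i.e. ~65,000

# Revised 1975 law: doubling every two years.
print(components(65_536, 1975, 1995, doubling_years=2))
```

The same function covers both the original yearly-doubling prediction and the revised two-year cadence; only the `doubling_years` parameter changes.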
According to Robert Sutor, chief quantum exponent at IBM, Moore’s Law has three components, each tied to an approximately two-year period:
- Roughly every two years, classical processors could do twice as much; it’s a matter of miniaturization.
- Roughly every two years, the chips themselves were half as big.
- Roughly every two years, the chips would use half as much energy.
“The fact is, when they’re talking about building physical entities like semiconductors and transistors, you can only go so small. Atoms have a certain size and you’re approaching the almost atomic molecular distances at this point. You’re limited by that,” Sutor said.
Sutor also explained that these limitations made the industry increasingly clever. When an individual processor reaches its limit on speed, for example, the industry uses more of them, multiplying the processing power in lots of different ways.
Does a quantum version exist?
According to Sutor, quantum bits (qubits) contain more information than classical computing bits and follow the counterintuitive laws of quantum mechanics. This non-classical behavior means that Moore’s Law for classical processors cannot simply be applied to quantum processors. There are three important elements here. The first is scale. Qubits have a strange property called entanglement: adding one more qubit to a system doubles the amount of information your quantum system can compute with.
It goes like this: one qubit carries two bits of information, two qubits carry four, three carry eight, four carry 16, and so on. By the time we get to 10 qubits there are 1,024 bits of information, and at 20 qubits you can manipulate roughly one million bits. This doubling effect is, as with Moore’s Law, exponential growth.
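The doubling described above can be sketched in a few lines: the classical information needed to describe an n-qubit state grows as 2 to the power n.

```python
def state_space(n_qubits: int) -> int:
    """Number of classical values needed to describe an n-qubit state: 2**n."""
    return 2 ** n_qubits

for n in (1, 2, 3, 4, 10, 20):
    print(n, "qubits ->", state_space(n))
# 10 qubits -> 1024; 20 qubits -> 1048576 (roughly one million)
```

Each additional qubit multiplies the previous figure by two, which is exactly the geometric progression the article walks through.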
A major difference, however, is that when we add more RAM to a laptop, there are physical locations for those bits. In a quantum system, there is no corresponding physical location where the qubits, working together, represent the information. It’s decidedly non-binary behavior.
The second element is quality. IBM developed a metric that is gaining traction called Quantum Volume, a measure of the quality of a system: how well are the qubits working together? The higher the quantum volume, the better the quality.
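Roughly speaking, Quantum Volume is reported as 2 to the power n, where n is the size of the largest “square” circuit (n qubits, n layers of gates) the machine runs successfully. The sketch below is a simplified illustration of that idea; `runs_successfully` is a hypothetical stand-in for IBM’s actual heavy-output benchmark test, not a real API.

```python
def quantum_volume(max_qubits: int, runs_successfully) -> int:
    """Simplified Quantum Volume: 2**n for the largest passing square circuit.

    `runs_successfully(n)` is a hypothetical predicate standing in for the
    real benchmark (running random n-qubit, depth-n circuits and checking
    the heavy-output statistics).
    """
    best = 0
    for n in range(1, max_qubits + 1):
        if runs_successfully(n):  # square circuit: width == depth == n
            best = n
    return 2 ** best

# Toy example: a noisy 27-qubit device that only handles square circuits
# up to size 6 would report a quantum volume of 2**6 = 64.
print(quantum_volume(27, lambda n: n <= 6))
```

Note how the metric rewards quality, not raw qubit count: the toy device has 27 qubits, but noise caps its quantum volume at 64.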
(Image source: IBM Quantum)
“The reason why you even have these problems is with quantum computers we’re trying to imitate what nature does. Every single electron is a quantum particle. Every single photon is a quantum particle. Every one of them wants to interact with your qubit because they’re all quantum things. They want to entangle themselves with each other because the environment we live in at the very small level is quantum — that’s the definition. The quality reflects how we can reduce the noise so that the calculations can proceed correctly, and we can keep using the qubits long enough until they become chaotic,” explained Sutor.
The third element is speed. A slow quantum system, or slow qubits, pretty much defeats the effort. Improvement in these three areas of quantum computing is, like Moore’s Law, exponential in nature.
When the question of Moore’s Law for quantum computing was posed to Seeqc Inc.’s president John Levy, he was quick to point out that there is little correlation. “So, what you’re really trying to get at isn’t Moore’s Law, it’s trying to understand whether or not there are metrics in quantum computing that can help to predict capacity for performance, cost, or whatever,” Levy said.
Levy explained that one of the issues within quantum computing is that the whole notion of benchmarking is embryonic, cautioning that “all of the players tend to develop benchmarks that favor their particular technology.” Hence, caution is required when looking for a Moore’s Law-style benchmark for qubits, since vendors will keep proposing whatever makes them look good.
Examples of that tendency can be seen with two more laws that continue to be discussed periodically.
Rose’s Law came out of D-Wave, which provides quantum annealing systems rather than universal quantum computers. It suggests that the number of qubits should double every two years, a pace close to, or already faster than, Moore’s Law. At the time, D-Wave co-founder Geordie Rose predicted that he would be able to demonstrate a two-qubit quantum computer within six months. It’s important to note that D-Wave is now changing focus. See the Fortune article: D-Wave took its own path in quantum computing. Now it’s joining the crowd.
Neven’s Law, proposed by Hartmut Neven, director of Google’s Quantum AI Lab, offers a theory on the potential improvement of quantum computers: “Quantum computing power is improving at a doubly exponential growth compared to conventional computing.” This means quantum computing power grows by powers of powers of two: 2^2 (4), 2^4 (16), 2^8 (256). While Neven’s prediction is enticing, it relies on limited data and doesn’t consider the technical problems that arise as processors become more complex. Computational errors due to electrical noise and the overhead of additional error-correction hardware also taint Neven’s prediction.
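The difference between singly and doubly exponential growth is easy to see numerically. A short illustration of the sequence quoted above:

```python
def singly_exponential(k: int) -> int:
    """Ordinary exponential growth: 2**k (Moore's Law-style doubling)."""
    return 2 ** k

def doubly_exponential(k: int) -> int:
    """Doubly exponential growth: 2**(2**k), as in Neven's Law."""
    return 2 ** (2 ** k)

print([singly_exponential(k) for k in range(1, 5)])  # [2, 4, 8, 16]
print([doubly_exponential(k) for k in range(1, 5)])  # [4, 16, 256, 65536]
```

Each doubly exponential step squares the previous value rather than merely doubling it, which is why the curve outruns ordinary exponential growth so quickly.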
In Double Exponential Growth Can Be Easily Misinterpreted, Quantum Computing Report’s Doug Finke reminds us that while segments of these “laws” are correct “. . . and an important consideration when trying to simulate a quantum computer on a classical computer, it can be easily misinterpreted if one is trying to understand the future improvement quantum computers will bring to solving specific problems. Normally, when one talks about the growth of a technology, the comparison is usually a comparison against the previous generation of the same technology.”
While there is no universal benchmark for quantum computing yet, serious work on potential benchmarks is under way. Standards organizations such as NIST and international labs will likely create standards down the road, and other industry organizations will likely join the effort.
For now, however, one of the best quotes on Moore’s Law is from an IEEE Spectrum article on Carver Mead: “I always had to — especially in the early days — explain that [Moore’s Law] is not a law of physics. This is a law [of] the way that humans are. In order for anything to evolve like our semiconductor technology has evolved, it takes an enormous amount of creative effort by a large number of smart people. They have to believe that effort is going to result in a successful thing, or they won’t put the effort in. That belief that it’s possible to do this thing is what causes the thing to happen. The Moore’s Law thing is really about people’s belief in the future and their willingness to put energy into causing that thing to come about. It’s a marvelous statement about humanity.”
There is no less belief in the future today. If we use Mead’s definition as to what Moore’s Law is, maybe that covers the rest of the available theories as well.
About the author
David Brown is the president of Berkeley Nucleonics Corporation, a leading manufacturer of precision electronic instrumentation for test, measurement, and nuclear research. The company’s equipment is used for signal generation in a myriad of applications including quantum computing. Berkeley Nucleonics also offers courses covering a wide variety of topics including quantum computing.