8.17.20

The race to quantum advantage

As we map out the frontiers of innovation, quantum computing (QC) looms large as possibly the most important scientific invention of the 21st century. QC upends what we thought to be fundamental principles of computer science, promising to execute entirely new classes of algorithms beyond the practical capabilities of conventional computers. In the coming decades, quantum computers will solve currently intractable problems in logistics, chemistry, drug development, materials science, climate modeling, cryptography, human cognition, and more.

The race is on to bring this technology to market. Significant progress has been made in recent years, and indeed quantum computers of various kinds work today; however, they are still limited in scale and functionality, solving only simple problems that conventional computers can handle far more easily and cheaply. The goal of the QC race is to build computers that solve commercially valuable problems beyond the capabilities of conventional computers, an achievement called quantum advantage.

In the race to quantum advantage, universities compete to discover physical mechanisms and craft new algorithms. Corporations compete for the advantage of being first to apply QC in their industries. Governments compete to harness QC for national security and to reap economic windfalls from this new industry. Tech companies compete to build and to commercialize the technology, aspiring to grow into the next Intel, Microsoft, Google, or Amazon. And venture investors like us compete to identify the winning startups.

QC 101

Quantum computers exploit a seemingly bizarre yet experimentally verified feature of the universe: until a particle interacts with another, its position, speed, color, spin, and other quantum properties co-exist as a probability distribution over all possibilities, a state known as superposition. Quantum computers use isolated particles as their most basic building blocks, relying on any one of these quantum properties to represent the state of a quantum bit (or “qubit”). So while a classical bit always exists in exactly one of two mutually exclusive states, 0 (low voltage) or 1 (high voltage), a qubit in superposition co-exists in both states, as 0 and 1 simultaneously.
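For the mathematically inclined, a qubit can be modeled as a two-component vector of complex amplitudes. The sketch below is ours (plain numpy, not any vendor's code) and assumes the standard textbook convention that measurement probabilities are the squared magnitudes of the amplitudes:

```python
import numpy as np

# A single qubit as a 2-component complex state vector.
zero = np.array([1, 0], dtype=complex)  # the classical state 0
one = np.array([0, 1], dtype=complex)   # the classical state 1

# An equal superposition: the qubit co-exists as 0 and 1 until measured.
plus = (zero + one) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(np.abs(plus) ** 2)  # [0.5 0.5] -- a 50/50 chance of reading 0 or 1
```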

Things get interesting at a larger scale, as QC systems are capable of isolating a group of entangled particles, which all share a single state of superposition. While a single qubit co-exists in two states, a set of eight entangled qubits (or “8Q”), for example, simultaneously occupies all 2^8 (or 256) possible states, effectively processing all these states in parallel. It would take 57Q (representing 2^57 parallel states) for a QC to outperform even the world’s strongest classical supercomputer. A 64Q computer would surpass it by a factor of 2^7 (128x), clearly achieving quantum advantage, and a 128Q computer would surpass it by a factor of 2^71, more than a sextillion times.
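The arithmetic behind these comparisons is simple exponential growth, as the short calculation below illustrates (treating 2^n amplitudes as a proxy for “parallel states,” as the paragraph above does):

```python
# State-space sizes for n entangled qubits (2^n simultaneous states).
for n in [8, 57, 64, 128]:
    print(f"{n}Q -> 2^{n} = {2 ** n:.2e} states")

# Ratios between machines grow exponentially with the qubit gap:
print(2 ** 64 / 2 ** 57)            # 128.0   -- a 64Q machine vs. the 57Q baseline
print(f"{2 ** 128 / 2 ** 57:.1e}")  # ~2.4e+21 -- a 128Q machine vs. the same baseline
```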

In the race to develop these computers, nature has inserted two major speed bumps. First, isolated quantum particles are highly unstable, so quantum circuits must execute within extremely short windows of coherence. Second, measuring the output of sub-atomic qubits demands extreme accuracy, and tiny deviations routinely corrupt the results. Informed by university research, leading QC companies like IBM, Google, Honeywell, and Rigetti are developing quantum engineering and error-correction methods to overcome these challenges as they scale the number of qubits they can process.
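The intuition behind error correction can be conveyed with a much simpler classical analogy: store one logical bit redundantly and take a majority vote. The Monte Carlo sketch below is ours, for illustration only; the quantum codes these companies actually pursue (such as surface codes) are far more involved, since quantum states cannot simply be copied.

```python
import numpy as np

# Classical repetition code: one logical bit stored as three physical bits.
rng = np.random.default_rng(0)
p = 0.05                                 # chance each physical bit flips
flips = rng.random((100_000, 3)) < p     # simulate noise over 100,000 trials
logical_errors = flips.sum(axis=1) >= 2  # majority vote fails only if 2+ bits flip
print(logical_errors.mean())             # ~0.007 -- well below the raw 5% error rate
```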

Once working hardware exists, software must be developed to harvest the benefits of parallelism, even though we cannot observe what happens inside a quantum circuit without destroying its superposition. When we measure the output value of a quantum circuit’s entangled qubits, the superposition collapses into just one of the many possible outcomes. Sometimes, though, the output yields clues that qubits weirdly interfered with themselves (that is, with their probabilistic counterparts) inside the circuit. QC scientists at UC Berkeley, the University of Toronto, the University of Waterloo, the University of Technology Sydney, and elsewhere are now developing a fundamentally new class of algorithms that detect the absence or presence of interference patterns in QC output to cleverly glean information about what happened inside.
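Self-interference sounds mystical, but it falls out of the linear algebra. In the toy example below (standard textbook gate definitions, not code from the groups named above), two successive Hadamard gates make the two paths to the outcome 1 cancel each other, so the qubit returns to 0 with certainty, an interference pattern visible in the output statistics:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate
zero = np.array([1, 0], dtype=complex)

# One Hadamard creates an equal superposition...
print(np.abs(H @ zero) ** 2)      # [0.5 0.5]

# ...but a second one makes the two paths to the outcome 1 cancel
# (destructive interference), so the qubit returns to 0 with certainty.
print(np.abs(H @ H @ zero) ** 2)  # [1. 0.]
```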

The QC stack

A fully functional QC must therefore combine several layers of a novel technology stack, spanning both hardware and software. At the top of the stack sits the application software for solving problems in chemistry, logistics, and other domains. The application typically makes API calls to a software layer beneath it (loosely referred to as a “compiler”) that translates function calls into the circuits that implement them. Beneath the compiler sits a classical computer that feeds circuit changes and inputs to the Quantum Processing Unit (QPU) below. The QPU typically comprises an error-correction layer, an analog processing unit that transmits analog inputs to the quantum circuit and measures its analog outputs, and the quantum processor itself, which houses the isolated, entangled particles.
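The sketch below illustrates this layering in miniature. Every class and method name is hypothetical, invented for illustration; no vendor exposes this exact API, and real stacks include control electronics and error correction that we elide here.

```python
# Hypothetical names throughout -- an illustration of the layering, not a real API.

class QuantumProcessor:
    """Bottom of the stack: the isolated, entangled particles."""
    def run(self, pulses):
        return f"measurement results for [{pulses}]"

class AnalogUnit:
    """Transmits analog inputs to the quantum circuit and measures its outputs."""
    def __init__(self, qpu):
        self.qpu = qpu
    def execute(self, circuit):
        return self.qpu.run(f"analog pulses for [{circuit}]")

class Compiler:
    """Translates application-level function calls into circuits."""
    def __init__(self, backend):
        self.backend = backend
    def call(self, function, *args):
        return self.backend.execute(f"circuit implementing {function}{args}")

# Top of the stack: application code makes API calls and never touches qubits.
stack = Compiler(AnalogUnit(QuantumProcessor()))
print(stack.call("ground_state_energy", "H2"))
```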

As QC evolves over the coming decades, vendors will carve out their own blocks of this evolving stack, but today’s pioneers do not have the luxury of such focus. Today’s QC systems are still too complex, unique, and finicky to expose to the ecosystem. The teams who develop a proprietary, integrated stack will likely be first to achieve practical quantum advantage.

QC modalities

The various competing architectures for quantum processors exploit a wide range of quantum particle properties (“modalities”) to represent qubits. Each prevailing architecture offers different tradeoffs in coherence time, measurement fidelity, and the scalability of its design and operation. The industry has consolidated around a few that currently offer the most immediate and clear path to large-scale systems; chief among them is superconducting qubits. This approach, used by the three current leaders in the race – IBM, Google and Rigetti – relies on circuits of superconducting material kept at cryogenic temperatures. It offers superior latency (~50 nanoseconds per operation) compared to other modalities but limited coherence times (qubits remain entangled for only ~50 microseconds). Recent demonstrations of intermediate-scale systems suggest it will be the first modality to enable practical quantum advantage.

Meanwhile, Honeywell, IonQ, and others are pursuing another leading modality, trapped-ion qubits, which relies on a combination of electric and magnetic fields to capture charged particles in an isolated system. This architecture faces challenges around its scale-out potential and latency, but offers significantly longer coherence, as its qubits can stay entangled for nearly a minute. Other frontier modalities at earlier stages of research and commercialization – such as cold atom arrays, photonics-based qubits, and topological qubits – may eventually succeed with their own sets of advantages, arriving later to market.
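These figures imply very different computational budgets. The back-of-the-envelope calculation below uses the superconducting numbers quoted above; the trapped-ion gate time is our assumption, for illustration only, since the text gives none:

```python
# Superconducting figures from the text; the trapped-ion gate time is assumed.
sc_gate, sc_coherence = 50e-9, 50e-6    # ~50 ns per operation, ~50 us coherence
ion_gate, ion_coherence = 100e-6, 60.0  # assumed ~100 us per operation, ~1 min coherence

print(int(sc_coherence / sc_gate))    # ~1,000 operations per coherence window
print(int(ion_coherence / ion_gate))  # ~600,000 operations, but each far slower
```

The takeaway: superconducting machines fit roughly a thousand operations into each coherence window, while trapped ions permit far deeper circuits at the cost of much slower gates.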

A proliferation of distinct QC systems could eventually enable users and organizations to choose different modalities for different use cases. But as with other technology markets, the first modality to demonstrate quantum advantage could well dominate the field for generations, primarily because future investments naturally flow toward the most mature technologies. As of 2020, superconducting qubits seem poised to enable the first practical cases of quantum advantage.

The race is a marathon

In a 1980 paper and subsequent lectures at MIT and Caltech, Argonne physicist Paul Benioff described the first quantum mechanical model of a computer, and Richard Feynman floated the basic model for building such a machine. Commercial development ramped up only recently, though D-Wave, founded in 1999, went on to develop its 2000Q quantum annealer for solving specialized optimization problems.

Significant commercial progress toward generalized quantum advantage was first demonstrated in 2019, suggesting an inflection point in quantum tech. Within the span of a few months, three superconducting qubit vendors – IBM, Google, and Rigetti – presented working systems in the 32Q-54Q range (at 95%-99% fidelity) with credible roadmaps for building 100Q+ systems capable of demonstrating quantum advantage. Trapped-ion vendors Honeywell and IonQ presented longer-term strategies to reach a similar scale with coherence times that accommodate more complex computations. PsiQuantum, an early pioneer of the photonics modality, raised a $230M venture round. And perhaps most notably, Amazon, IBM, and Microsoft announced they will distribute quantum computing services as part of their commercial cloud offerings.

With dozens of corporates and startups pursuing the development of quantum technology, it is difficult to say who will first demonstrate quantum advantage or when such a demonstration will first occur. Many predict that it will happen within the next few years, possibly as early as 2021.

Of course, the end of this race marks the start of another. In the coming decade, QC vendors will compete by increasing scale and minimizing errors. The near-term winners will likely employ hybrid architectures that leverage quantum and classical compute units for their relative advantages, and tailor their products to industries where incremental improvements, rather than fully accurate results, create value: supply-chain management, logistics optimization, financial analysis, and seismic research.
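A minimal sketch of that hybrid pattern appears below. It assumes a variational (VQE-style) loop, our illustration rather than any vendor's product: a classical optimizer proposes circuit parameters, a quantum processor (mocked here with numpy so the sketch runs anywhere) estimates a cost, and the loop iterates.

```python
import numpy as np

def quantum_cost(theta):
    # Stand-in for running a parameterized circuit on a QPU and measuring
    # the expectation value of an observable (mocked classically here).
    return np.cos(theta)

theta, lr = 0.1, 0.2
for _ in range(100):  # the classical half of the hybrid loop
    # Finite-difference gradient of the QPU's output with respect to theta.
    grad = (quantum_cost(theta + 1e-4) - quantum_cost(theta - 1e-4)) / 2e-4
    theta -= lr * grad  # gradient descent toward the minimum-cost setting
print(round(theta, 4), round(quantum_cost(theta), 4))  # -> ~3.1416, ~-1.0
```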

By the 2030s, a small number of leading vendors should be far enough along to develop scalable, error-corrected QCs with long coherence, enabling QC software vendors to tackle the problems currently addressed by classical supercomputers in pharmaceuticals, finance, chemical engineering, AI, agriculture, and logistics.

We shouldn’t expect quantum computers in our homes or the palms of our hands. But the immense computational power that QC can unleash will lead to life-changing medicines, more accurate weather forecasts, smarter AIs, longer-lasting batteries, safer space travel, sustainable energy sources, and benefits society has yet to even discover. As we approach the age of quantum computing, it is no longer a question of ‘if,’ but rather one of ‘when’ this technology finally matures and ‘who’ will lead this emerging industry. The race is on—may the best team win.

David Cowan and Tomer Diari invest in frontier tech startups on behalf of Bessemer Venture Partners, the largest shareholder of Rigetti Computing.