Quantum computing has always fascinated me. Since reading “Dream Machine,” the New Yorker’s profile of physicist David Deutsch, almost a decade ago, I’ve been enthralled with the possibility of a technological reshuffle unlocking a new dimension of answers about ourselves, nature, and the universe.

As a lifelong student of physics and computer science, I see quantum computing as one of the most critical technologies for supporting our longevity as humanity. As a Partner at Venrock, a venture capital firm, I see it as the most fundamental speed-up in computing power we have ever encountered, with the potential to transform almost every industry and vertical.

The challenge is that quantum computing is always “five to seven years away.”

Always five to seven years away

Every major technology of our era has gone through a perpetual cycle of scientific discovery, engineering for commercial use, and manufacturing for production scale. As new advancements or improvements are discovered, they progress through the same cycle.

  1. Scientific discovery: the pursuit of proof of principle for a theory, an iterative process of hypothesis and experiment.
  2. Engineering: once proof of principle is established, the work graduates into a tractable engineering problem, where the path to a systemized, reproducible, predictable system is generally known and de-risked.
  3. Production: lastly, once the system is engineered to meet performance targets, focus shifts to repeatable manufacturing at scale, simplifying designs for production.

Since Richard Feynman and Yuri Manin first theorized computing built on the principles of quantum mechanics, quantum computing has remained in a perpetual state of scientific discovery: reaching proof of principle on a particular architecture or approach, but unable to overcome the engineering challenges required to move forward.

These challenges in quantum computing have stemmed from a lack of individual qubit control, sensitivity to environmental noise, limited coherence times, limited total qubit counts, overbearing physical resource requirements, and limited error correction.

Welcome to the engineering phase

After decades of research and progress, we are entering a new phase.

In the last 12 months, we have seen several meaningful breakthroughs in quantum computing from academia, venture-backed companies, and industry that appear to have cleared the remaining hurdles along the scientific discovery curve, turning quantum computing into a tractable engineering problem.

Just in the last six months, we have seen: Google demonstrated a theoretical path to ‘quantum supremacy.’ A new record was set for preparing and measuring qubits inside a quantum computer without error. Honeywell announced it will “soon release the most powerful quantum computer yet.” Intel announced “Horse Ridge,” its cryogenic control system-on-chip (SoC). The list goes on. All share a common thread: breaking through the challenges of individual qubit control and coherence times; in other words, creating qubits that remain usable long enough to do something meaningful.

At the same time, modern computing industry leaders have been laying the foundation for future quantum applications. Microsoft and Amazon announced managed cloud services, Azure Quantum and AWS Braket, that enable developers to experiment with different quantum hardware technologies. Google launched TensorFlow Quantum, a machine learning framework for training quantum models. Baidu, IBM, and others have followed suit. The pace of releases and announcements is accelerating.

The first generation of commercial quantum computers

Unlike in previous years, we are seeing real results: tens of qubits, with individual control, at meaningful coherence times, demonstrating exciting capabilities. We are at the brink of a new generation of companies, both new and old, introducing commercially capable quantum systems that will set the stage for the industry for years to come.

Companies such as Atom Computing,* which leverages neutral atoms for wireless qubit control, Honeywell, with its trapped-ion approach, and Google, with its superconducting qubits, have demonstrated first-ever results, setting the stage for the first commercial generation of working quantum computers.

While early and noisy, these systems, even at just 40–80 error-corrected qubits at 99.9% fidelity, may be able to deliver capabilities that surpass those of classical computers, accelerating our ability to make better thermodynamic predictions, understand chemical reactions, improve resource optimization, and improve financial predictions.

This means pharma companies will be able to simulate chemical reactions, in pursuit of new vaccines, at a scale that is not feasible on a classical computer. Solar cell manufacturers could surpass current modeling capabilities for improved conversion efficiency; even a few percentage points could be a significant tipping point toward a fully renewable future. Manufacturers dealing with complex assembly lines (auto, pharma, microelectronics), which often pose large combinatorial problems, could see significant improvements in yield and production efficiency.

Examples of hybrid algorithms for 40–80 qubit quantum systems:

1. QAOA (quantum approximate optimization algorithm): a hybrid quantum-classical approach to polynomial- and super-polynomial-time algorithms for finding “a ‘good’ solution” to an optimization problem. An example is the traveling salesman problem, where the figure of merit is the ratio between the quality of the polynomial-time solution and the quality of the true optimum. (See the first sketch after this list.)

Use cases include network optimization, machine scheduling, max cut solver, operations research, image recognition, and electronic circuit design.

2. Quantum Boltzmann ML: Boltzmann machines are particularly well suited to quantum computing architectures because of their heavy reliance on binary variables, specifically in solving challenging combinatorial optimization problems. A neural network trained as a Boltzmann machine could allow a significant speedup in training and inference for deep learning models that rely on probabilistic sampling. (See the second sketch after this list.)

Use cases include binary matrix factorization, ML boosting, reinforcement learning, network design, and portfolio modeling.
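To make the hybrid loop concrete, here is a minimal, purely classical statevector simulation of a single QAOA layer for Max-Cut on a toy 4-node ring graph. The graph, the depth (p = 1), and the grid-search outer loop are illustrative assumptions rather than details from any system mentioned above; on real hardware, the expectation value would be estimated by sampling a quantum processor instead of a NumPy vector.

    import numpy as np
    from itertools import product

    # Toy Max-Cut instance: a 4-node ring graph (illustrative assumption).
    n = 4
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]

    # The cost operator is diagonal: precompute the cut value of every
    # computational basis state (qubit 0 is the most significant bit).
    costs = np.array([sum(b[i] != b[j] for i, j in edges)
                      for b in product([0, 1], repeat=n)], dtype=float)

    def apply_to_qubit(state, gate, q):
        """Apply a 2x2 gate to qubit q of an n-qubit statevector."""
        psi = np.moveaxis(state.reshape([2] * n), q, 0)
        psi = np.tensordot(gate, psi, axes=([1], [0]))
        return np.moveaxis(psi, 0, q).reshape(-1)

    def qaoa_expectation(gamma, beta):
        """Expected cut size after one QAOA layer (p = 1)."""
        state = np.full(2 ** n, 2 ** (-n / 2), dtype=complex)  # |+...+> start
        state = state * np.exp(-1j * gamma * costs)            # cost layer
        rx = np.array([[np.cos(beta), -1j * np.sin(beta)],     # e^(-i*beta*X)
                       [-1j * np.sin(beta), np.cos(beta)]])
        for q in range(n):                                     # mixer layer
            state = apply_to_qubit(state, rx, q)
        return float(np.abs(state) ** 2 @ costs)

    # Classical outer loop: a coarse grid search over (gamma, beta).
    grid = np.linspace(0, np.pi, 40)
    gamma, beta = max(((g, b) for g in grid for b in grid),
                      key=lambda gb: qaoa_expectation(*gb))
    print(f"expected cut: {qaoa_expectation(gamma, beta):.3f} (true max cut: 4)")

In a true hybrid deployment, only the expectation estimate moves to the quantum device; the parameter search stays classical, which is what makes this approach a fit for early 40–80 qubit systems.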
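Similarly, here is a purely classical sketch of the sampling bottleneck a quantum Boltzmann machine would target: a tiny restricted Boltzmann machine trained with one-step contrastive divergence (CD-1). The layer sizes, toy data, and hyperparameters are hypothetical; the premise above is that quantum hardware could draw the Boltzmann samples far faster than the sigmoid-and-coin-flip loop below.

    import numpy as np

    rng = np.random.default_rng(0)

    # Tiny restricted Boltzmann machine (sizes are illustrative assumptions).
    n_vis, n_hid = 6, 3
    W = 0.1 * rng.standard_normal((n_vis, n_hid))
    b_vis, b_hid = np.zeros(n_vis), np.zeros(n_hid)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sample(p):
        return (rng.random(p.shape) < p).astype(float)  # binary units

    # Toy dataset: two repeating 6-bit patterns.
    data = np.array([[1, 1, 1, 0, 0, 0],
                     [0, 0, 0, 1, 1, 1]] * 8, dtype=float)

    # One-step contrastive divergence. The sample() calls are the
    # probabilistic-sampling step a quantum sampler would aim to accelerate.
    lr = 0.05
    for epoch in range(500):
        v0 = data
        p_h0 = sigmoid(v0 @ W + b_hid)      # hidden units given visibles
        h0 = sample(p_h0)
        p_v1 = sigmoid(h0 @ W.T + b_vis)    # reconstruct visibles
        v1 = sample(p_v1)
        p_h1 = sigmoid(v1 @ W + b_hid)
        W += lr * (v0.T @ p_h0 - v1.T @ p_h1) / len(data)
        b_vis += lr * (v0 - v1).mean(axis=0)
        b_hid += lr * (p_h0 - p_h1).mean(axis=0)

    print("mean reconstruction error:", np.mean((data - p_v1) ** 2))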

Welcome.

The early quantum systems won’t crack public-key encryption or solve our universe’s NP-complete problems, yet; that would require thousands to tens of thousands of qubits to account for error correction. They will, however, introduce commercial quantum computing to the world, kick off a domino effect of demand around enterprise quantum readiness and access, accelerate the integration of more modern computing approaches, and forever change the computing landscape for decades to come.

*Venrock is an investor in Atom Computing.

