
Quantum computing’s status and near-term prospects

  • Alan Morrison 

Part I: Background and some indicators of how AI is helping


IBM’s Quantum System Two on display at the Q2B 2024 Silicon Valley conference in December (author’s photo)

I had the opportunity to attend the Quantum to Business (Q2B) 2024 Silicon Valley event in December 2024, courtesy of Allie Kuopus, one of the organizers. It was their eighth year organizing this event, consisting of three full days of information flow and interaction on a topic I had never explored at length. 

I was hoping to soak up enough of the fundamentals at the event, but when beginning to write, I realized I still didn’t understand the basics well enough. Physicist Richard Feynman famously said, “If you can’t explain something in simple terms, you don’t understand it.”

Hence this summary explainer. I just needed more clarity, and figured clarity is generally in short supply when it comes to such topics.

What is quantum computing?

Quantum mechanics is the study of natural phenomena at the atomic and subatomic particle levels. A quantum is the smallest discrete amount of any physical property, such as light. A photon, for instance, is a quantum of light, as a primer in TechTarget points out. (Data Science Central is a unit of Informa TechTarget.)

Quantum computers process quantum bits, or qubits, instead of binary bits. A binary bit represents either 1 or 0. “A qubit, however, can represent a 0, a 1, or any proportion of 0 and 1 in superposition of both states, with a certain probability of being a 0 and a certain probability of being a 1,” according to Microsoft’s primer on the topic.
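To make that concrete, here is a minimal sketch in plain Python (not tied to any particular quantum framework; the variable names are mine) of how a single qubit’s state is commonly modeled: two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1.

    import numpy as np

    # A qubit state |psi> = alpha|0> + beta|1> is a normalized pair of
    # complex amplitudes: |alpha|^2 + |beta|^2 = 1.
    alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # an equal superposition
    p0 = abs(alpha) ** 2                            # probability of measuring 0
    p1 = abs(beta) ** 2                             # probability of measuring 1
    print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")      # 0.50 each for this state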

When Einstein referred to “spooky action at a distance” in 1947, he was alluding to particle entanglement and the seemingly instantaneous influence that entangled particles appear to exert on one another in a quantum system.

The “superposition” of particles in a quantum system refers to how they can exist in multiple states at once. Physicists use a Bloch sphere (named after Felix Bloch, a 1952 Nobel laureate who came up with this method of representation) to visualize superposition within a qubit. This method uses vectors to represent individual quantum states.


Bloch sphere diagram (Bloch-sphere-diagram.svg, Wikimedia Commons)
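To illustrate how that vector representation works in practice, the sketch below (plain Python again; the bloch_angles function is hypothetical, not from any library) maps a qubit’s two amplitudes onto the Bloch sphere’s polar and azimuthal angles using the standard parameterization cos(θ/2)|0⟩ + e^(iφ)·sin(θ/2)|1⟩.

    import numpy as np

    def bloch_angles(alpha, beta):
        # Factor out the global phase so alpha becomes real and non-negative,
        # then read off the polar angle theta and the azimuthal angle phi.
        phase = np.exp(-1j * np.angle(alpha))
        a, b = alpha * phase, beta * phase
        theta = 2 * np.arccos(np.clip(abs(a), 0.0, 1.0))
        phi = np.angle(b) if abs(b) > 1e-12 else 0.0
        return theta, phi

    # The equal superposition (|0> + i|1>)/sqrt(2) sits on the equator of the sphere.
    theta, phi = bloch_angles(1 / np.sqrt(2), 1j / np.sqrt(2))
    print(np.degrees(theta), np.degrees(phi))   # ~90.0, ~90.0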

Superposition is why QC can use interconnected qubits to process data in what proponents claim is a far more massively parallel, exponentially accelerated method of computing than classical supercomputing.
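One way to see where the “exponentially accelerated” framing comes from: fully describing n entangled qubits classically requires 2^n complex amplitudes, so the classical bookkeeping blows up quickly. A back-of-the-envelope sketch (the byte counts are my assumption of 16 bytes per complex amplitude):

    # Classically simulating n qubits means tracking 2**n complex amplitudes.
    # At 16 bytes per amplitude (complex128), memory grows exponentially.
    for n in (10, 30, 50):
        amplitudes = 2 ** n
        gib = amplitudes * 16 / 2 ** 30
        print(f"{n} qubits -> {amplitudes:,} amplitudes (~{gib:,.2f} GiB)")
    # 30 qubits already needs ~16 GiB; 50 qubits needs roughly 16 million GiB.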

Unlike quantum mechanics, which has now been around for 100 years, the quantum computing field began in earnest nearly thirty years ago, as Google points out. Feynman, who envisioned the field, died in 1988.

But the big question to this day is still unanswered: Can QC really do things that classical computing can’t? Supercomputing, after all, hasn’t exactly stood still.

TechTarget boils down the main distinctions between classical computing and QC in a comparison table; see https://www.techtarget.com/searchdatacenter/tip/Classical-vs-quantum-computing-What-are-the-differences for the details.

The current QC challenge: Scaling error correction

What’s particularly tricky about QC is how to trigger and then accurately measure superposition behavior. To date, QCs have been error prone, because qubits are quite sensitive to noise.

In fact, one of the reasons the launch of Google Quantum AI group’s Willow chip in December 2024 (timed for the Q2B event) received so much media attention was the group’s error correction claim: “Willow can reduce errors exponentially as we scale up using more qubits. This cracks a key challenge in quantum error correction that the field has pursued for almost 30 years.”
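For intuition about what “errors reduced exponentially as qubits scale up” means, the standard surface-code heuristic (this is the textbook form, not Google’s published model; the physical error rate, threshold, and prefactor below are illustrative assumptions) says the logical error rate falls roughly as (p/p_th)^((d+1)/2) as the code distance d grows, where each increase in d consumes more physical qubits:

    # Illustrative surface-code-style scaling: once the physical error rate p
    # is below the threshold p_th, the logical error rate shrinks exponentially
    # with code distance d. All numbers here are assumptions for illustration.
    def logical_error_rate(d, p=0.001, p_th=0.01, prefactor=0.1):
        return prefactor * (p / p_th) ** ((d + 1) / 2)

    for d in (3, 5, 7, 9):
        print(f"distance {d}: ~{logical_error_rate(d):.0e} logical errors per cycle")
    # Each step from d to d+2 (requiring more physical qubits) cuts the rate ~10x here.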

Other major QC R&D and product teams at the Q2B event echoed the Google group’s desire to scale up the number of qubits in QC systems. There was much discussion of the need for thousands or ideally millions of qubits per system for QC to be able to tackle a wide range of use cases. Google’s Willow chip has 105 qubits.

Much discussion at Q2B focused on logical qubits. A logical qubit abstracts the behavior of a group of physical qubits to enable fault tolerance. Yuval Boger, chief commercial officer at QuEra, a QC provider with a 256-qubit processor available on the Amazon Braket QC service, pointed to Algorithmic Fault Tolerance, a promising technique for mitigating the impact of errors featured in a 2024 Harvard Quantum Optics Lab paper.

In an explainer on its website, QuEra describes the commonly agreed-upon rule of thumb: “Each logical qubit will require 1,000 physical qubits.” That ratio will be subject to change as QC methods evolve.
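Taking that rule of thumb at face value, the arithmetic behind the “thousands to millions of physical qubits” discussion is straightforward (a rough sketch; the 1,000:1 ratio is the only input, and it will shift as error correction improves):

    # Rough sizing under the commonly cited ~1,000 physical qubits per logical qubit.
    PHYSICAL_PER_LOGICAL = 1_000

    for logical in (100, 1_000, 4_000):
        print(f"{logical:>5} logical qubits -> ~{logical * PHYSICAL_PER_LOGICAL:,} physical qubits")
    # Compare with Willow's 105 physical qubits today, which is why scaling
    # dominated so much of the Q2B conversation.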

AI’s use in quantum computing: An error correction example

Elica Kyoseva, Director of Quantum Algorithm Engineering at NVIDIA, elaborated during her Q2B talk on the company’s pivotal role in quantum computing.

NVIDIA does not itself build quantum processing units (QPUs) or other quantum hardware. Instead, it provides the surrounding classical infrastructure, assuming a hybrid classical/quantum approach to accelerated supercomputing. The company is thus QPU agnostic and has many QC partners.

One of the key uses of AI that Kyoseva mentioned was classical-computing-enabled error correction. AI can, for example, discover error-correcting codes tailored to specific types of quantum hardware. Kyoseva made clear that the near-term use cases for hybrid classical/quantum computing involve small data but highly complex parameters.
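To illustrate that hybrid pattern in generic terms — a classical optimizer steering a quantum subroutine over a handful of sensitive parameters — here is a minimal variational-style sketch in plain Python. The quantum_expectation function is a stand-in stub of my own, not NVIDIA’s or any vendor’s API:

    import numpy as np

    def quantum_expectation(params):
        # Stand-in for dispatching a parameterized circuit to a QPU (or simulator)
        # and returning a measured expectation value; here it is a toy cost surface.
        return np.cos(params[0]) + 0.5 * np.sin(params[1])

    # Classical outer loop: finite-difference gradient descent over two parameters.
    params, lr, eps = np.array([0.1, 0.2]), 0.2, 1e-3
    for _ in range(50):
        grad = np.array([
            (quantum_expectation(params + eps * np.eye(2)[i]) -
             quantum_expectation(params - eps * np.eye(2)[i])) / (2 * eps)
            for i in range(2)
        ])
        params -= lr * grad   # classical update toward a lower "energy"

    print("optimized parameters:", params)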

The commercialization challenge over the near term, therefore, is in identifying and then exploiting the small number of use cases that can effectively harness the technology within its current limits. Much of QC providers’ revenue at present comes from large public and private R&D units that want to familiarize themselves with QC and get a development flow going. NVIDIA is well positioned in this respect with CUDA-Q, its platform for hybrid quantum-classical computing.

I’ll be able to expand more on how AI methods can help to make quantum computing more generally useful in Part II of this series. 
