
Quantum Computers, Coming to a Data Center Near You

Getting quantum and classical hardware to coexist takes deft engineering

5 min read

Edd Gent is a Contributing Editor for IEEE Spectrum.


Oxford Quantum Circuits' 32-qubit Toshiko system can be deployed on-site to data centers.

Oxford Quantum Circuits

Quantum processors are expected to massively outperform classical ones on certain problems, but for many computing tasks they offer little advantage. That’s leading to a growing focus on hybrid computing systems that could combine the best of both approaches. But getting such different technologies to work together smoothly is challenging. Quantum hardware struggles in the noisy environment of a data center, doesn’t fit naturally into existing software architectures, and there’s a technical language barrier between quantum and classical computing engineers.

Combining processors that excel at different tasks is a well-established idea. High-performance computing (HPC) frequently relies on a mixture of CPUs and special-purpose accelerators like graphics processing units (GPUs), and many kinds of computers feature dedicated co-processors to tackle problems like signal processing, networking, or encryption. It’s long been recognized that quantum computing is likely to follow a similar path, says Yuval Boger, chief commercial officer at quantum hardware maker QuEra Computing in Boston. That’s because, despite its strengths on certain intractable problems, the technology is too slow and unreliable for a lot of computing tasks.

“Had you asked me two years ago, I would have said you’d be crazy to have an on-premises quantum computer. But it turns out that’s what everyone wants.” —Yuval Boger, QuEra Computing

But while most co-processors are built using the same CMOS technology and operate in fundamentally similar ways, quantum processors rely on a completely different computing paradigm. There is also a dizzying array of hardware choices, including superconducting circuits, ion traps, and neutral atoms. Efforts to merge these technologies with classical computers have mainly been underway in the laboratories of quantum computing companies. But lately, says Boger, customers increasingly want to integrate quantum processors into their own data centers or HPC facilities, opening up new engineering challenges.

“Had you asked me two years ago, I would have said you’d be crazy to have an on-premises quantum computer,” he says. “But it turns out that’s what everyone wants.”

“It’s integrating a new beast into the data center.” —Yuval Boger, QuEra Computing

So far, most quantum hardware companies have provided cloud access to their devices. But in the last couple of years, major players have announced deals to install their machines at data centers, supercomputing centers and national laboratories. Reasons are varied, says Boger, including concerns around sending sensitive data over the cloud or a desire to kick-start local quantum ecosystems by giving researchers direct access to hardware. But a major consideration is that transmitting data back and forth over the Internet can introduce latencies that significantly slow down hybrid algorithms.
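To see why, consider a rough back-of-the-envelope estimate. In a hybrid algorithm, a classical optimizer and a quantum processor trade results thousands of times, and every exchange over the public Internet pays a network round trip. The figures in the sketch below are illustrative assumptions, not measurements of any particular system.

```python
# Illustrative back-of-the-envelope estimate of how network latency can
# dominate a hybrid quantum-classical loop. All numbers are assumptions
# chosen for illustration, not figures from any vendor.

iterations = 5_000            # classical-optimizer steps in a hybrid algorithm
circuit_time_s = 0.001        # time to run one batch of circuits on-device
cloud_round_trip_s = 0.100    # assumed wide-area network round trip
local_round_trip_s = 0.0005   # assumed round trip inside the same data center

cloud_total = iterations * (circuit_time_s + cloud_round_trip_s)
local_total = iterations * (circuit_time_s + local_round_trip_s)

print(f"Over the cloud: {cloud_total:7.1f} s")   # ~505 s, dominated by latency
print(f"On premises:    {local_total:7.1f} s")   # ~7.5 s, dominated by the device
```

Under those assumptions, the cloud version spends nearly all its time waiting on the network, which is exactly the overhead an on-premises installation removes.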

Hardware Hiccups

However, quantum computers are very different from the hardware these facilities normally deal with. “It’s integrating a new beast into the data center,” says Boger. “Both from a physical perspective and also from a software perspective it just takes work.”

The fragile quantum states these devices rely on pose a particular challenge, says Jonathan Burnett, technical director of quantum hardware at Oxford Quantum Circuits (OQC) in the United Kingdom. Early quantum computers were built in laboratories where it was possible to control things like vibration, noise, and electromagnetic interference. Data centers, in contrast, are full of loud cooling fans and rogue electromagnetic radiation from high-power electronics. “You’ve actually got, in many ways, a horrible environment for anything quantum,” says Burnett.

“Our runtime is maybe a second. That really doesn’t fit with this model of supercomputing resource management.” —Jamie Friel, Oxford Quantum Circuits

Quantum computers also look very different from the uniform server cabinets these facilities normally house. Most quantum computing technologies, including the superconducting circuits OQC uses, must operate at cryogenic temperatures. That means they need to be enclosed in large dilution refrigerators that require liquid helium to cool the device close to absolute zero (though QuEra’s neutral atom-based processors use laser cooling and don’t require dilution refrigerators).

Software Stumbling Blocks

Hardware isn’t the only problem. Creating orchestration systems to efficiently share computing workloads between quantum and classical devices requires a lot of software engineering. Integrating into HPC centers presents particular challenges, says Jamie Friel, compiler team manager at OQC. These facilities rely on workload managers to allocate resources, but the software is designed to deal with algorithms that require vast amounts of memory and compute for days at a time. “Our runtime is maybe a second,” he says. “That really doesn’t fit with this model of supercomputing resource management.”

“You have to become more serious about manufacturing, and support, and treating these systems as systems as opposed to science experiments.” —David Rivas, Rigetti

Friel says they’ve created a workaround by putting a software node in front of the quantum computer, which looks like an HPC resource to the workload manager. But behind this is a separate job queue, which schedules quantum computations.
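OQC hasn’t published the internals of that layer, but the pattern Friel describes can be sketched in a few lines of Python: a lightweight service presents itself to the workload manager as an ordinary compute resource, while an internal queue doles out the second-long quantum jobs behind it. Everything below, from the class names to the fake backend, is hypothetical and for illustration only.

```python
import queue
import threading
import time

# Hypothetical sketch of the pattern Friel describes: a software node that
# the HPC workload manager treats as a normal resource, with its own internal
# queue scheduling short quantum jobs behind it.

class QuantumProxyNode:
    def __init__(self, backend):
        self.backend = backend          # handle to the actual quantum computer
        self.jobs = queue.Queue()       # separate queue for second-long quantum runs
        threading.Thread(target=self._drain, daemon=True).start()

    def submit(self, circuit):
        """Called by the workload manager as if this were a classical node."""
        slot = {"done": threading.Event(), "result": None}
        self.jobs.put((circuit, slot))
        return slot

    def _drain(self):
        # Run quantum jobs one at a time; each takes on the order of a second.
        while True:
            circuit, slot = self.jobs.get()
            slot["result"] = self.backend.run(circuit)
            slot["done"].set()


class FakeBackend:
    """Stand-in for the real quantum hardware, used only for this sketch."""
    def run(self, circuit):
        time.sleep(1.0)                 # "our runtime is maybe a second"
        return f"counts for {circuit}"


node = QuantumProxyNode(FakeBackend())
slot = node.submit("bell_pair_circuit")
slot["done"].wait()
print(slot["result"])
```

The design choice is to leave the workload manager’s view unchanged: it schedules against the proxy like any other node, and the mismatch in timescales is absorbed by the proxy’s own queue.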

Adding to these engineering challenges is the fact that the physicists and computer scientists who work at quantum computing companies speak a very different language than data center engineers do. Burnett says this has required OQC to rethink much of its documentation, and how it thinks and talks about its devices.

This is the main challenge for quantum hardware providers as they try to integrate with the broader computing ecosystem, says David Rivas, chief technology officer of Rigetti, in Berkeley, Calif., which sells superconducting quantum computers. “You have to become more serious about manufacturing, and support, and treating these systems as systems as opposed to science experiments.”

Language Barrier

One unavoidable problem is that quantum computers are analog devices. That means their parameters tend to drift and they have to be regularly re-tuned, something data center engineers aren’t used to doing. Quantum computing companies also have to create operating systems and software tools that abstract away complicated analog hardware details, and present the hybrid system as a single coherent device.

“It can look at the very highest level much like writing classical computing environments,” Rivas says. “But this isn’t Linux running on a single processor or collection of cores. This is a distributed environment of communicating processes integrating these two disparate kinds of technologies.”
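One way to picture that abstraction, sketched here under assumed details rather than any vendor’s actual stack: the software layer keeps track of when the analog hardware was last tuned, transparently re-calibrates before running user jobs, and hands back post-processed results, so the caller sees one stable-looking device. All class names, numbers, and methods below are invented for illustration.

```python
import time

# Minimal sketch of a software layer that hides analog drift from users and
# presents quantum plus classical resources as one device. The calibration
# interval, class names, and methods are illustrative assumptions only.

CALIBRATION_INTERVAL_S = 3600        # assume parameters drift on ~hour timescales

class StubQPU:
    """Stand-in for real quantum hardware, used only for this sketch."""
    def calibrate(self):
        print("re-tuning analog parameters...")
    def run(self, circuit):
        return {"00": 490, "11": 510}   # fake measurement counts

class StubCPU:
    """Stand-in for classical post-processing."""
    def post_process(self, counts):
        total = sum(counts.values())
        return {k: v / total for k, v in counts.items()}

class HybridDevice:
    def __init__(self, qpu, cpu):
        self.qpu, self.cpu = qpu, cpu
        self._last_calibrated = 0.0

    def _ensure_calibrated(self):
        # Re-tune the analog hardware when its parameters have likely drifted,
        # without the user or the data center operator having to step in.
        if time.time() - self._last_calibrated > CALIBRATION_INTERVAL_S:
            self.qpu.calibrate()
            self._last_calibrated = time.time()

    def run(self, circuit):
        """The caller sees a single, stable-looking device."""
        self._ensure_calibrated()
        counts = self.qpu.run(circuit)
        return self.cpu.post_process(counts)

device = HybridDevice(StubQPU(), StubCPU())
print(device.run("bell_pair_circuit"))   # calibrates once, then runs
```

In a real system the calibration logic is far more involved, but the point of the layer is the same: the drift stays hidden below the interface.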

Ultimately, though, as quantum computers become more powerful, Rivas suspects customers will increasingly want lower-level access to squeeze out extra performance. And because each company’s quantum hardware is so idiosyncratic, providers will have to create bespoke tooling to support such customers.

Further into the future, Rivas suspects the integration between classical and quantum hardware will deepen considerably. Existing quantum computers are already essentially hybrid devices, as the quantum processor is hooked up to a classical control system. He thinks eventually this control system will effectively merge with the HPC system.

“The next logical step is to build a control system for a quantum computer that looks an awful lot like an HPC blade [server], and then insert that into a node,” says Rivas. “Now you have a quantum-enabled node for HPC.”

“We have to build quantum computing systems, not just qubits.” —David Rivas, Rigetti

This would allow the quantum processor to communicate with classical resources over very high-speed connections, potentially speeding up hybrid algorithms. But just as importantly, it would give the control system access to HPC resources. Rivas says that could be important for carrying out error correction on larger quantum systems, which will require heavy classical computations to be performed very quickly.

Integrating their devices into classical computing environments is also helping clarify things for quantum computing companies, says Rivas. Much of their focus has, quite rightly, been on designing the best qubits, and things like software architectures and connectivity have often been neglected. But that’s starting to change, he says.

“All this stuff that we’re talking about has to become part of the underlying DNA of the company,” he adds. “We have to build quantum computing systems, not just qubits.”
