IBM wants its quantum supercomputers running at 4,000-plus qubits by 2025
Forty years after it first began to dabble in quantum computing, IBM is ready to expand the technology out of the lab and into more practical applications, like supercomputing. The company has already hit a number of development milestones since it released its previous quantum roadmap in 2020, including the 127-qubit Eagle processor and the Qiskit Runtime API. IBM announced on Wednesday that it plans to further scale its quantum ambitions and has revised the 2020 roadmap with an even loftier goal: operating a 4,000-qubit system by 2025.
Before it sets about building the biggest quantum computer to date, IBM plans to release its 433-qubit Osprey chip later this year and migrate the Qiskit Runtime to the cloud in 2023, “bringing a serverless approach into the core quantum software stack,” per Wednesday’s release. Those products will be followed later that year by Condor, a quantum chip IBM is billing as “the world’s first universal quantum processor with over 1,000 qubits.”
This rapid jump in qubit count (the number of qubits packed into a processor) will enable users to run increasingly long quantum circuits, while processing speed, measured in CLOPS (circuit layer operations per second), climbs from a maximum of 2,900 to over 10,000. Then it’s just a simple matter of quadrupling that capacity in the span of less than 24 months.
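For a rough sense of what those CLOPS figures mean for throughput, here is a back-of-envelope sketch; the 60-layer circuit depth is an arbitrary illustrative assumption rather than an IBM figure, and real CLOPS benchmarking is more involved than this simple division.

```python
# Rough throughput estimate: CLOPS counts circuit layer operations per second,
# so dividing by an assumed circuit depth gives an approximate number of
# circuits executed per second.
ASSUMED_LAYERS_PER_CIRCUIT = 60  # illustrative depth only, not an IBM figure

for clops in (2_900, 10_000):
    circuits_per_second = clops / ASSUMED_LAYERS_PER_CIRCUIT
    print(f"{clops:>6} CLOPS ~ {circuits_per_second:.0f} circuits/second "
          f"at {ASSUMED_LAYERS_PER_CIRCUIT} layers each")
```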
To do so, IBM plans to first get sets of multiple processors to communicate with one another both in parallel and in series. This should help develop better error mitigation schemes and improve coordination between processors, both necessary components of tomorrow’s practical quantum computers. After that, IBM will design and deploy chip-level couplers, which “will closely connect multiple chips together to effectively form a single and larger processor,” according to the company, then build quantum communication links to connect those larger multi-chip processors into even bigger clusters, essentially daisy-chaining ever-larger clumps of processors together until they form a functional, modular 4,000-qubit computing platform.
“As quantum computing matures, we’re starting to see ourselves as more than quantum hardware,” IBM researcher Jay Gambetta wrote on Wednesday. “We’re building the next generation of computing. In order to benefit from our world-leading hardware, we need to develop the software and infrastructure capable of taking advantage of it.”
As such, IBM released a set of ready-made primitive programs earlier this year, “pre-built programs that allow developers easy access to the outputs of quantum computations without requiring intricate understanding of the hardware,” per the company. IBM intends to expand that program set in 2023, enabling developers to run them on parallelized quantum processors. “We also plan to enhance primitive performance with low-level compilation and post-processing methods, like introducing error suppression and mitigation tools,” Gambetta said. “These advanced primitives will allow algorithm developers to use Qiskit Runtime services as an API for incorporating quantum circuits and classical routines to build quantum workflows.”
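To make that concrete, here is a minimal sketch of what calling one of these primitives looks like from Python, assuming the qiskit-ibm-runtime package and a saved IBM Quantum account; the exact class names, options, and the backend name used here ("ibmq_qasm_simulator") have shifted between releases, so treat this as illustrative rather than a current reference.

```python
# Minimal sketch: run the Sampler primitive on a Bell-state circuit through
# Qiskit Runtime. Requires qiskit and qiskit-ibm-runtime plus saved credentials.
from qiskit import QuantumCircuit
from qiskit_ibm_runtime import QiskitRuntimeService, Session, Sampler

# Build a simple two-qubit Bell-state circuit with measurements.
bell = QuantumCircuit(2)
bell.h(0)
bell.cx(0, 1)
bell.measure_all()

service = QiskitRuntimeService()  # reads previously saved IBM Quantum credentials

# Open a session against a backend and hand the circuit to the Sampler primitive;
# the primitive hides hardware-level details and returns quasi-probabilities.
with Session(service=service, backend="ibmq_qasm_simulator") as session:
    sampler = Sampler(session=session)
    result = sampler.run(bell).result()
    print(result.quasi_dists[0])
```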
These workflows will take a given problem, break it down into smaller quantum and classical programs, chew through those processes in either parallel or series depending on which is more efficient, and then use an orchestration layer to “circuit stitch” all those various data streams back into a coherent result that classical computers can understand. IBM calls its proprietary stitching infrastructure Quantum Serverless and, per the new roadmap, will deploy the feature to its core quantum software stack in 2023.
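The pattern described above can be sketched in plain Python. The example below is not the actual Quantum Serverless API, which the roadmap only promises for 2023, but a hypothetical stand-in showing how quantum and classical sub-tasks could run concurrently before their partial results are stitched back together; every function name here is a placeholder.

```python
# Hypothetical sketch of the orchestration pattern: split a problem into
# quantum and classical sub-tasks, run them concurrently, then "stitch" the
# partial results into one answer. Not IBM's actual Quantum Serverless API.
from concurrent.futures import ThreadPoolExecutor


def run_quantum_subproblem(fragment: str) -> float:
    # Placeholder: a real workflow would submit a circuit fragment to a
    # Qiskit Runtime primitive and return, say, an expectation value.
    return 0.5


def run_classical_subproblem(fragment: str) -> float:
    # Placeholder: a classical pre- or post-processing step.
    return 1.5


def stitch(partial_results: list[float]) -> float:
    # Placeholder "circuit stitching": combine partial results into a single
    # value a classical computer can consume. Here it is just a sum.
    return sum(partial_results)


if __name__ == "__main__":
    fragments = ["fragment_a", "fragment_b"]
    with ThreadPoolExecutor() as pool:
        quantum_parts = list(pool.map(run_quantum_subproblem, fragments))
        classical_parts = list(pool.map(run_classical_subproblem, fragments))
    print(stitch(quantum_parts + classical_parts))
```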
“We think by next year, we’ll begin prototyping quantum software applications for users hoping to use Qiskit Runtime and Quantum Serverless to address specific use cases,” Gambetta said. “We’ll begin to define these services with our first test case — machine learning — working with partners to accelerate the path toward useful quantum software applications. By 2025, we think model developers will be able to explore quantum applications in machine learning, optimization, finance, natural sciences, and beyond.”
“For many years, CPU-centric supercomputers were society’s processing workhorse, with IBM serving as a key developer of these systems,” he continued. “In the last few years, we’ve seen the emergence of AI-centric supercomputers, where CPUs and GPUs work together in giant systems to tackle AI-heavy workloads. Now, IBM is ushering in the age of the quantum-centric supercomputer, where quantum resources — QPUs — will be woven together with CPUs and GPUs into a compute fabric. We think that the quantum-centric supercomputer will serve as an essential technology for those solving the toughest problems, those doing the most ground-breaking research, and those developing the most cutting-edge technology.”
Together, these hardware and software systems will become IBM Quantum System Two, with the first prototype scheduled to be operational at some point next year.