DARPA Quantum Benchmarking: Three Ways Quantum Transforms Business Outcomes in Unexpected Ways
Earlier this year, we shared some initial findings from Phase II of the DARPA (Defense Advanced Research Projects Agency) Quantum Benchmarking program. More specifically, we published three research pre-prints alongside our partners that are among the first attempts to pair data-driven utility estimates with rigorous quantum resource estimates. Such estimates are essential as the potential transformations from AI and quantum computing accelerate over the near term.
Quantum resource estimates are notoriously difficult to get right. That’s why collaborative research is especially important: it brings together stakeholders to establish benchmarks and shape thinking as the nascent quantum computing space accelerates.
Our DARPA Quantum Benchmarking research represents rigorous, scientific, and state-of-the-art estimates of both the economic value and resources required to run several high-utility use cases for quantum computing, each worth billions of dollars: manufacturing ammonia more efficiently, designing ships with less drag, and potentially finding a room-temperature superconductor.
In this article, we’ll explore each use case to highlight the potential economic value at stake and share our estimates for the resources required to unlock that value. The goal of the research is to help inform decision-makers in government, academia, and enterprise about where to invest their research dollars to maximize the impact of quantum computing.
While more work is needed to improve both quantum hardware and algorithms to make these use cases a reality, this is an exciting foundation for the transformational potential of enterprise quantum computing and AI.
Industrial chemical manufacturing consumes a significant amount of energy and raw materials. One prominent example is ammonia, which is widely used in fertilizers, pharmaceuticals, and explosives. Ammonia production is a colossal industry: around 170 million metric tons are produced annually, valued at over $200 billion USD in 2023.
Fault-tolerant quantum computers could significantly reduce the time required for the calculations involved in catalyst discovery, an advantage that alone could be worth hundreds of thousands of dollars per reaction.
To create ammonia, atmospheric nitrogen must be transformed, or “fixed”, into its usable form. Today, this is typically done using the Haber-Bosch process, which reacts nitrogen with hydrogen to produce ammonia (N₂ + 3H₂ → 2NH₃). This energy-intensive process relies on natural gas, which causes the price of ammonia to fluctuate wildly with the price of natural gas. The Haber-Bosch process is also deeply inefficient: it is responsible for approximately 1.8% of global carbon emissions while achieving only a 10-20% yield.
Thus, given the global importance of ammonia, there is a pressing need for a more efficient approach to produce ammonia more sustainably and at a lower cost. To this end, the research team explored how quantum computing could support the creation of more effective catalysts for nitrogen fixation.
Specifically, the research looked at homogeneous catalysts, which offer several advantages over the heterogeneous catalysts typically used for producing ammonia. For example, homogeneous catalysts work at much lower temperatures than the extremes required for Haber-Bosch, which reduces production costs and increases overall efficiency.
In terms of economic utility, the research found that solving just one reaction with a classical computer can cost as much as $200,000 in compute alone, a cost that could potentially be reduced significantly using a quantum computer. Beyond compute costs, there are billions of dollars at stake in producing ammonia and other chemicals more efficiently at scale with these new catalysts.
Furthermore, the research estimated that a quantum computer would take 139,000 QPU-hours to solve such a reaction, compared to 400,000 CPU-hours for equivalent state-of-the-art classical methods, and many anticipated improvements to quantum methods should bring that compute time down even further.
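To put those compute-time figures in perspective, here is a back-of-the-envelope comparison. The hourly rates below are illustrative assumptions, not figures from the paper (though note that $200,000 for 400,000 CPU-hours implies roughly $0.50 per CPU-hour):

```python
# Rough cost comparison for a single catalyst reaction, using the compute-time
# estimates above. Hourly rates are assumptions for illustration only.
cpu_hours, qpu_hours = 400_000, 139_000

cpu_rate = 0.50  # assumed $/CPU-hour, consistent with the ~$200,000 figure above
qpu_rate = 1.00  # assumed $/QPU-hour for a future fault-tolerant machine

print(f"classical: ${cpu_hours * cpu_rate:,.0f}")  # classical: $200,000
print(f"quantum:   ${qpu_hours * qpu_rate:,.0f}")  # hinges entirely on QPU pricing
```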
The results point to a promising future for fault-tolerant quantum computers to accelerate the discovery of homogeneous catalysts. While the research focused on nitrogen fixation, the techniques explored could also be applied to discover other industrially relevant homogeneous catalysts, potentially unlocking even greater cost savings.
For a more detailed breakdown of the research, created in collaboration with the University of Toronto, view the full paper here.
Across many industries, traditional design and engineering workflows are being upgraded to simulation-driven processes. One example is designing the massive cargo ships that form the bedrock of global trade.
An estimated $1B in annual value is at stake in using quantum computers to accelerate computational fluid dynamics (CFD) across all industries, with $10-100M of that annual value in ship design specifically. However, more research is needed to make quantum computing practical for this use case.
Computational fluid dynamics (CFD) is used to simulate fluid flows, like water or air, to estimate the drag force on a vehicle. Designing ships to minimize drag could significantly reduce fuel costs, and thus shipping costs. Given the complexity of simulating turbulent flow, however, CFD calculations are notorious for their high compute costs: simulating these flows exactly would take an impractically long time, so approximations are typically used, at the expense of accuracy.
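For intuition, the quantity these simulations ultimately feed is the textbook drag equation, F_d = ½ρv²C_dA; CFD's expensive job is, in effect, estimating the drag behavior of a candidate hull. A minimal sketch with illustrative numbers that are not taken from the paper:

```python
# Back-of-the-envelope drag force from the standard drag equation,
# F_d = 0.5 * rho * v^2 * C_d * A. All inputs are illustrative.
def drag_force(rho: float, v: float, c_d: float, area: float) -> float:
    """Drag force in newtons."""
    return 0.5 * rho * v**2 * c_d * area

# Seawater density ~1025 kg/m^3, speed 10 m/s (~19 knots), with an assumed
# friction-drag coefficient and wetted surface area for a large cargo ship.
print(f"{drag_force(rho=1025.0, v=10.0, c_d=0.002, area=10_000.0):,.0f} N")
```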
Quantum computers, on the other hand, could simulate these turbulent flows efficiently. This would significantly reduce design workflow costs by lowering computational costs and reducing the need for experimental testing.
In the research paper sponsored by DARPA, the team found that the potential value at stake in accelerating CFD was between $10 million and $100 million for ship design alone, and around $1 billion for the total CFD market.
Can quantum computers actually be useful here?
Compared to areas like quantum chemistry, quantum computing for CFD is a relatively unexplored area. So, the research team developed a new quantum approach for estimating drag force. Unfortunately, the algorithm in its current form was found to require impractically large resources to run on a fault-tolerant quantum computer.
However, other applications in quantum chemistry have seen orders-of-magnitude improvements in quantum resource requirements as they matured, and with more research the same could be true for CFD.
In fact, the paper identifies several avenues for reducing these resource costs. The bottom line is that more research is needed to realize the potential value at stake for this quantum use case.
The full research paper, produced in collaboration with L3Harris, can be found here.
The Fermi-Hubbard model is one of the most studied models in condensed matter physics, used to describe the behavior of electrons in a lattice. It is especially important for understanding complex phenomena in strongly correlated electron systems, such as superconductors, magnets for MRI machines, and batteries, as well as for simulating the properties of new materials.
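For readers who want a concrete handle on the model, it is straightforward to construct with open-source tools. Here is a minimal sketch using OpenFermion, a convenient public implementation rather than necessarily the tooling used in the research:

```python
# Build a small 2D Fermi-Hubbard Hamiltonian and map it to qubits.
# Requires OpenFermion (pip install openfermion); values are illustrative.
import openfermion as of

# 2x2 lattice with hopping t = 1.0 and on-site repulsion U = 4.0
hamiltonian = of.fermi_hubbard(
    x_dimension=2, y_dimension=2, tunneling=1.0, coulomb=4.0, periodic=False
)

qubit_hamiltonian = of.jordan_wigner(hamiltonian)  # fermions -> qubits
print(of.count_qubits(qubit_hamiltonian))  # 8: two spin states per site, 4 sites
```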
If quantum computing could enable an efficient and exact solver for the Fermi-Hubbard model, it could potentially generate an estimated $22.1B in present day value. However, further advances in quantum algorithms and hardware are necessary to realize this potential value.
However, given the complexity of strongly correlated systems, “solving” the model with classical computers is challenging and imprecise. The Department of Energy estimates it spends $586 million per year on operating costs for high performance computing facilities, of which we estimate approximately 20% corresponds to modeling Fermi-Hubbard-type systems, or around $117 million annually.
If just 1% of these resources could be rededicated to other problems thanks to an efficient, fast, and exact Fermi-Hubbard solver, it could save an estimated $8 million, 0.5 GWh of energy, and 1.9 metric tons of CO2. By increasing research productivity, we estimate a high-accuracy solver could save an additional $4.4 million.
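The headline share calculation is simple to reconstruct; the savings, energy, and CO2 figures come from the paper's fuller model and are not derived here:

```python
# Reconstructing the DOE spending figures quoted above (dollars in millions).
doe_hpc_opex = 586.0        # annual DOE HPC operating costs, per the estimate above
fermi_hubbard_share = 0.20  # estimated share spent on Fermi-Hubbard-type modeling

fh_spend = doe_hpc_opex * fermi_hubbard_share
print(f"${fh_spend:,.1f}M per year on Fermi-Hubbard-type workloads")  # ~$117M
```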
Those economic impact estimates don’t even consider the potential second order impacts of a fast and accurate Fermi-Hubbard solver. Having such a solver could lead to more efficient and powerful MRI devices, more energy-dense batteries for electric vehicles and renewable energy storage, and perhaps most significantly of all, a room-temperature superconductor.
A room-temperature superconductor could dramatically reduce energy costs, eliminating the 5% of energy that is currently wasted as heat due to resistance in transmission lines (although the costs of installing superconducting cable would eat into this value). It would also be useful in enabling fusion power, maglev trains, and a wide range of other electrical applications.
The research estimated that if a fast and accurate Fermi-Hubbard solver led to the development of a room-temperature superconductor, it would be worth an average of $22.1 billion in present value.
Setting aside the enormous potential value at stake, we make the case in the research that the Fermi-Hubbard model could serve as a scalable benchmark of quantum computing performance. In doing so, we performed a first-of-its-kind detailed resource breakdown for the task of estimating experimentally relevant outputs for the Fermi-Hubbard model.
The resource estimates point to a clear need for improvement along multiple axes to make the promising use cases detailed here a reality with quantum computers. However, this is precisely why scalable, extensible benchmarks are needed.
To make the case that a particular computational tool is going to make a difference, we need a clear view of both its associated costs and its potential benefits. The Fermi-Hubbard model provides a clean platform to lay out this case.
This research was conducted in collaboration with North Carolina State University, Rigetti, Lockheed Martin, and the MIT Lincoln Laboratory. For more details, the full research paper can be found here.
Our first-of-its-kind utility estimates as part of the DARPA Quantum Benchmarking program have reaffirmed the transformative potential of quantum computing. With billions of dollars at stake across these three use cases alone, the upside potential is clear. As the evolution of the technology accelerates, we'll unearth new opportunities to optimize processes, reduce costs, and improve results across a wide variety of industries.
Of course, there’s still considerable research and innovation in both quantum hardware and algorithms that needs to happen to realize this value. The resource estimates we have conducted here will provide the research community with an invaluable baseline for improving the methodologies we have today.
The BenchQ resource estimation platform we used for some portions of this research is open-source and available to the public. More details on BenchQ can be found in our previous blog post about our work with DARPA. If you’d like to use BenchQ for your own quantum resource estimation research, simple instructions for installing the package can be found here.
In addition, we leveraged the Zapata AI Orquestra® platform to run large-scale quantum chemistry calculations and to organize and track our quantum resource estimation experiments. The open-source Python SDK for authoring such workflows can be found here.
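For a flavor of what such a workflow looks like, here is a minimal sketch in the style of the Orquestra Workflow SDK's public quickstart. The decorator and method names reflect the open-source SDK's documented API, but the task body is a hypothetical placeholder, so treat this as illustrative rather than a recipe:

```python
# A toy Orquestra workflow: fan out a (placeholder) resource-estimation task
# over several problem sizes and collect the tracked results.
import orquestra.sdk as sdk

@sdk.task
def estimate_resources(problem_size: int) -> dict:
    # Hypothetical stand-in for a real estimation step (e.g., a BenchQ call).
    return {"problem_size": problem_size, "logical_qubits": 4 * problem_size}

@sdk.workflow
def resource_estimation_sweep():
    # One task invocation per problem size; Orquestra records each run.
    return [estimate_resources(n) for n in (10, 20, 40)]

wf_run = resource_estimation_sweep().run("in_process")  # run locally
print(wf_run.get_results(wait=True))
```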