What Does It Mean to be Quantum Ready?
It may still be the early days of the quantum computing revolution, but it’s never too early to become quantum-ready. We talk a lot about quantum readiness at Zapata. We even trademarked the term “Quantum-Ready Applications®” (more on that later). But what does it actually mean for an organization to be quantum-ready?
Quantum readiness is the process of preparing for the widespread disruption that quantum computing will bring. Like AI and other transformational technologies, quantum won’t arrive with the flip of a switch, and adapting to quantum will arguably be more complex than adapting to AI. Preparing for quantum has both an offensive and a defensive side. On the offensive side, quantum-ready organizations are prepared to reap the competitive advantages and novel use cases that quantum will bring. On the defensive side, they are steeling themselves against the post-quantum cybersecurity threats of the near future.
Some may look at the capabilities of today’s quantum devices and decide they can afford to wait on quantum readiness until there’s a clear case of quantum advantage. We wouldn’t recommend it, though. In our first annual survey on enterprise quantum adoption, we found that 69% of global executives have already adopted or plan to adopt quantum computing in the near term. What’s more, 41% expect to achieve a competitive edge over their industry peers with quantum within the next two years. As 74% of respondents agreed, those who fail to adopt quantum now will fall behind.
So, what can your organization do today to become quantum-ready? In our view, there are four key components to quantum readiness: quantum-ready applications, quantum-ready IT architecture, quantum-ready security, and most importantly, quantum-ready people.
At Zapata we talk about quantum-ready applications all the time. Here’s a definition: a quantum-ready application is a deployment-ready application built to run on powerful quantum devices as soon as they become available. Crucially, a quantum-ready application does not require fully fault-tolerant quantum devices to be in production. In the near term, it may run on classical hardware or classical simulations of quantum hardware. At Zapata, we are also finding generative machine learning applications that can leverage near-term quantum devices. But regardless of the best solution today, a quantum-ready application should be forward-compatible with increasingly powerful quantum hardware as it matures.
You can think of a quantum-ready application like a car, with the quantum device analogous to a nitrous oxide tank. You can drive the car without the booster, but as soon as you swap in the nitrous tank, you gain an immediate power boost without needing to rebuild the entire car. By building quantum-ready applications, you’re building a nitrous-ready car without waiting for the nitrous system to become available.
The first step to building a quantum-ready application is to identify use cases where quantum can add value. Quantum will not solve every problem. Rather, it will provide a powerful advantage for specific problems that are intractable for classical computers to solve alone. These problems generally fall into the categories of complex optimization problems, machine learning problems, and simulations of quantum systems.
It may not be immediately apparent which use cases are worth pursuing, so working with external partners (like Zapata) that have been through the use case discovery process before can help identify the best use cases for your organization to pursue in the near-term versus the long-term. Business unit leaders who understand where the bottlenecks are should also be involved in use case discovery.
Once you’ve settled on some use cases with the potential for practical value to be gained from quantum computing, it’s time to implement prototypes, also known as pilot applications. Here, you’re putting all the pieces together and iteratively testing the application prototype on problem instances of practical value. After fine-tuning the application over many runs on real hardware, prototypes need to be scaled for implementation and deployment in a production environment for customer use.
Zapata’s Orquestra® software platform was explicitly designed for building quantum-ready applications, supporting the entire application lifecycle from research to development to deployment. The modular “plug-and-play” workflow framework makes it easy to swap and test different data sources, algorithms, and hardware backends as you fine tune the application on the path to deployment. You can run your applications on quantum-inspired and classical hardware today and swap in more powerful quantum devices tomorrow.
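The swap-in idea can be sketched in plain Python. This is a hypothetical illustration of the pattern, not Orquestra’s actual API: the application codes against a small backend interface, so a classical simulator can later be replaced by a quantum device without touching the rest of the workflow.

```python
from abc import ABC, abstractmethod
import random

class Backend(ABC):
    """Minimal interface the application codes against."""
    @abstractmethod
    def run(self, circuit: list, shots: int) -> dict:
        ...

class ClassicalSimulator(Backend):
    """Stand-in for a classical simulation of quantum hardware."""
    def run(self, circuit, shots):
        # Toy model: sample random bitstrings as "measurement" results.
        counts = {}
        for _ in range(shots):
            bits = "".join(random.choice("01") for _ in circuit)
            counts[bits] = counts.get(bits, 0) + 1
        return counts

def quantum_ready_app(backend: Backend) -> dict:
    """The application never names a device -- only the interface."""
    circuit = ["H", "CNOT", "MEASURE"]  # placeholder circuit description
    return backend.run(circuit, shots=100)

# Today: run on the classical simulator.
results = quantum_ready_app(ClassicalSimulator())
# Tomorrow: quantum_ready_app(SomeQuantumDevice()) -- no rebuild needed.
```

The design choice is the point of the analogy above: because `quantum_ready_app` depends only on the `Backend` interface, swapping in the "nitrous tank" is a one-line change.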
Orquestra also enables resource estimates, helping to answer key questions: When will quantum devices be mature enough for your application? What will the impact be for your business? How is it going to work? How much is it going to cost, and will you get an ROI? This last question is important, because a given quantum application might simply not be worth it (yet). We are currently under contract with DARPA (the R&D agency of the U.S. Department of Defense) to perform resource estimates that quantify the long-term utility of quantum computers using Orquestra.
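To make the flavor of such an estimate concrete, here is a deliberately crude back-of-envelope sketch. The formula and every number in it are illustrative assumptions, not outputs of Orquestra or the DARPA work:

```python
def estimate_runtime_and_cost(circuit_depth, gate_time_s, shots, price_per_hour):
    """Crude resource estimate: total runtime and cost for a sampling workload."""
    runtime_s = circuit_depth * gate_time_s * shots
    cost = runtime_s / 3600 * price_per_hour
    return runtime_s, cost

# Illustrative inputs: a 1,000-gate-deep circuit, 1 microsecond per gate,
# one million measurement shots, $500 per hour of device time.
runtime_s, cost = estimate_runtime_and_cost(
    circuit_depth=1_000, gate_time_s=1e-6, shots=1_000_000, price_per_hour=500.0
)
# runtime_s = 1000 seconds; cost is roughly $139 of device time.
```

If the projected cost dwarfs the expected business value, the honest answer may be "not yet": exactly the kind of ROI question a resource estimate is meant to settle before a project commits serious budget.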
Algorithms may be the first thing that comes to mind when you think of quantum-ready applications, but they’re just one part of the equation. To fully reap the advantages of quantum computing, you need to have the right computational architecture in place, too.
All the same bottlenecks and challenges found in classical machine learning problems will appear in quantum computing applications. Chief among these are basic data storage and data cleaning tasks. It might not be glamorous, but any potential speedup gained from quantum computing can easily be negated by slow and inefficient data flows. For some of our customers, we were able to unlock a computational speed-up just by streamlining key steps in their data pipeline.
Quantum computing won’t exist in a vacuum; it will always work in tandem with classical resources — public and private cloud, HPC, and edge computing. To bring back the car analogy, these classical resources can be thought of as the wheels, transmission, brakes, alternator, suspension, and all the rest of the parts other than the engine that make the car run. Building a quantum-ready architecture means having all the necessary classical components in your application workflow before you add in the quantum components and being prepared to integrate quantum devices with your existing IT stack.
You’ll also want to consider how to take the outputs of your quantum algorithms and turn them into insights that can readily drive business decisions. In most cases, this will involve creating some kind of dashboard or data visualization that end users can easily interpret to inform their decisions.
Finally, like any data analytics tool, quantum applications will require continuous monitoring and maintenance. You’ll want to empower the right people with the tools to track drifts in the models, monitor hardware health, and maintain the application over time. Quantum applications will require general application monitoring tools, but also specialty tools that work with the unique needs of quantum devices, such as pausing for recalibration or managing a queue.
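A minimal sketch of what such specialty monitoring logic might look like follows; the field names and thresholds are hypothetical, standing in for whatever health snapshot a real quantum backend exposes:

```python
def check_device(status: dict) -> str:
    """Decide what to do with pending jobs given a device health snapshot.

    `status` is a hypothetical dict reported by a quantum backend.
    """
    if status["calibrating"]:
        return "pause"    # hold jobs until recalibration finishes
    if status["queue_depth"] > 100:
        return "queue"    # back off; the device is saturated
    if status["readout_error"] > 0.05:
        return "alert"    # drift beyond tolerance: notify the team
    return "submit"
```

A scheduler could poll this check before each batch of jobs, which captures the two quantum-specific needs named above: pausing for recalibration and managing a queue.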
Even if you never build a quantum application, you will still need to deal with cybersecurity threats associated with the quantum future.
The most well-known threat is Shor’s algorithm, which, given access to a large enough fault-tolerant quantum computer, could decrypt a 2048-bit RSA encryption key in just eight hours, undermining the encryption that secures most of the world’s sensitive data and computer systems.
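Shor’s speedup comes from finding the period of modular exponentiation quickly; the number theory that turns a period into factors can be shown classically on a toy modulus. In the sketch below, the loop that finds the order `r` is the step a quantum computer performs exponentially faster at RSA scale:

```python
from math import gcd

def factor_via_period(N: int, a: int):
    """Factor N from the order of a mod N (a must be coprime to N).

    Finding the order r is the step Shor's algorithm accelerates;
    everything after it is ordinary number theory.
    """
    r, x = 1, a % N
    while x != 1:               # classically find the smallest r
        x = (x * a) % N         # with a^r = 1 (mod N)
        r += 1
    if r % 2 == 1:
        return None             # odd order: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None             # trivial square root: retry
    return gcd(y - 1, N), gcd(y + 1, N)

# Toy example: the order of 7 mod 15 is 4, which reveals the factors.
print(factor_via_period(15, 7))  # → (3, 5)
```

For a 2048-bit modulus this classical loop would run for longer than the age of the universe, which is precisely why the threat waits on fault-tolerant quantum hardware.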
Running Shor’s algorithm at that scale is expected to require a quantum computer with tens of millions of qubits, a machine unlikely to arrive within the next decade. That doesn’t mean there’s no urgency to prepare, however. For one, hackers are already stealing encrypted data with the goal of decrypting it later with a sufficiently powerful quantum computer. Secondly, Shor’s algorithm isn’t the only threat.
In 2018, Zapata researchers developed Variational Quantum Factoring (VQF), a heuristic algorithm that could in theory decrypt some 2048-bit RSA keys with only 6,000 high-quality physical qubits. Unlike Shor’s algorithm, heuristic algorithms like VQF do not come with mathematical proofs of their efficacy, but they can still break encryption in some cases, using near-term NISQ (noisy intermediate-scale quantum) devices. New heuristic factoring algorithms have since been invented. An algorithm proposed in one recent paper may even compromise a family of post-quantum algorithms currently being considered as potential new standards to replace RSA.
With this looming threat in mind, what can organizations do to develop a quantum-ready security posture? The first step would be assessment: identify your organization’s most sensitive data and applications so they can be prioritized for re-encryption once quantum-safe standards are developed. Build an inventory of metadata for applications using cryptography. You will also want to audit the encryption protocols currently in use across your entire digital infrastructure to assess your exposure to risk. Given the scale of this task, it will likely require years of work even if some automation is applied, so it’s best to start as soon as possible.
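Once such an inventory exists, even a simple triage pass over it is useful. The sketch below is hypothetical (the inventory rows and application names are invented for illustration), but it shows the core audit question: which entries depend on public-key schemes that a large fault-tolerant quantum computer would break?

```python
# Hypothetical inventory rows: (application, algorithm, key_bits).
INVENTORY = [
    ("payments-api", "RSA", 2048),
    ("internal-wiki", "AES", 256),
    ("vpn-gateway", "ECDSA", 256),
]

# Public-key schemes broken by a large fault-tolerant quantum computer.
QUANTUM_VULNERABLE = {"RSA", "ECDSA", "ECDH", "DH"}

def triage(inventory):
    """Flag entries to prioritize for migration to quantum-safe schemes."""
    return [(app, alg) for app, alg, _ in inventory if alg in QUANTUM_VULNERABLE]

print(triage(INVENTORY))  # → [('payments-api', 'RSA'), ('vpn-gateway', 'ECDSA')]
```

A real audit is vastly larger (certificates, libraries, protocols, partners), but the output is the same kind of prioritized migration list.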
The ultimate goal is crypto-agility. New encryption schemes may be compromised in the future, so you want to develop the capability to quickly switch the encryption schemes protecting your most sensitive assets. If you’re going to go through the arduous process of auditing and re-encrypting your entire digital landscape, you only want to do it once. Crypto-agility should become a core DevSecOps process across your organization.
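One way to make crypto-agility concrete in code is to route all encryption through a named-scheme registry, so swapping algorithms becomes a one-line configuration change rather than a codebase-wide hunt. This is a hypothetical sketch of the pattern, not a production design, and the XOR "cipher" is a placeholder, not real cryptography:

```python
def xor_transform(data: bytes, key: bytes) -> bytes:
    """Placeholder cipher (XOR is its own inverse). NOT real cryptography."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Registry of schemes; each entry is an (encrypt, decrypt) pair.
SCHEMES = {
    "placeholder-xor": (xor_transform, xor_transform),
    # "pqc-kem": (pqc_encrypt, pqc_decrypt),  # drop in once standardized
}

ACTIVE_SCHEME = "placeholder-xor"  # the single knob crypto-agility turns

def encrypt(data: bytes, key: bytes) -> bytes:
    return SCHEMES[ACTIVE_SCHEME][0](data, key)

def decrypt(data: bytes, key: bytes) -> bytes:
    return SCHEMES[ACTIVE_SCHEME][1](data, key)

ciphertext = encrypt(b"sensitive record", b"key")
```

Because callers only ever see `encrypt` and `decrypt`, re-encrypting the estate under a new quantum-safe scheme means changing `ACTIVE_SCHEME` and re-running migration, not rewriting every application.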
Remember, crypto-agility is an ongoing process, not a one-time achievement. For more on how you can prepare for the post-quantum cybersecurity landscape, visit our post-quantum cybersecurity resource page.
Quantum readiness starts and ends with people. To make the most of quantum computing, you’ll need to assemble a multi-disciplinary team that can build, maintain, and operate quantum applications. Of course, this will include software engineers, quantum scientists, and data scientists. But your quantum team should also include business leaders who can tie the insights gained from quantum applications to revenue.
Without buy-in from the highest levels of the organization, a quantum R&D project is likely to remain exactly that — an R&D project. That’s why it’s important to have an executive champion: somebody who understands the value to be gained with quantum computing, and who has the leverage across the organization to scale an application from a proof-of-concept into a practical business solution. Without executive commitment, would-be quantum applications are likely doomed to die in the lab.
Building your quantum team will no doubt include a combination of upskilling existing talent, hiring new quantum talent, and collaborating with external consultants, technology vendors, and professional services teams. Since quantum expertise isn’t exactly plentiful, many organizations will likely need to work with these external partners. For example, at Zapata we help customers stand up Quantum Centers of Excellence and accelerate their capabilities.
There is some urgency here: given the nascence of the field, the talent pool for quantum computing is relatively small, and it shrinks as more organizations hire from it. Our survey found that 51% of enterprises on the path to quantum adoption have already started building their quantum teams. If you wait too long, the most talented quantum scientists and engineers could be working for your competitor.
Once you have your quantum-ready team, give them the tools to collaborate. Orquestra is designed to accelerate collaboration between cross-functional teams, from scientists and developers to IT and domain experts. It’s where Zapata’s own scientists and engineers collaborate to build solutions with customers and partners. Using Orquestra’s modular workflow architecture, different groups can work in parallel on different parts of the application, from data ingest to models to dashboards.
In the end, quantum readiness all comes down to your people. All the components of quantum readiness that we have discussed here — teams, applications, architecture, and security — ultimately depend on the minds of your people and the tools those people use. Prioritize building your team and empowering them with the tools they need to succeed, and before you know it, you too can be quantum-ready.