8 Considerations For Your Quantum Toolset
1. Will it lock you in?
If a framework or tool claims to be the only framework or full-stack solution you will ever need, ask yourself, “Should I really be locked into this framework?” At Zapata, our view is that it is far too early to make that choice; you need a toolset that remains flexible over time. Doubling down on a single framework risks locking your team into that vendor’s roadmap and full-stack solution. Zapata team members post daily on our Slack about exciting new libraries and publications, and remaining compatible with the latest tools built by companies and the open source community will be important for years to come.
2. Is it compatible with the toolsets you already use?
Similar to question #1, make sure the framework you are evaluating is compatible with the other toolsets you have already adopted and built up. Compatibility minimizes switching costs like rewriting code and re-creating processes in a new format. It also means that teammates building circuits and code on different platforms can collaborate and share seamlessly. For example, a team member working in Qiskit and another working in Cirq should be able to collaborate on the same workflow while also incorporating additional quantum or domain-specific libraries, as in the sketch below.
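As one illustration of what that collaboration can look like, OpenQASM 2 can serve as an interchange format between the two frameworks. This is a minimal sketch, assuming a recent Qiskit release (with the qasm2 module) and Cirq with its contrib QASM importer available:

```python
# A minimal interoperability sketch using OpenQASM 2 as the interchange
# format. Version assumptions: Qiskit >= 1.0 (for the qasm2 module) and
# Cirq with its contrib QASM importer installed.
from qiskit import QuantumCircuit
from qiskit.qasm2 import dumps
from cirq.contrib.qasm_import import circuit_from_qasm

# Teammate A builds a Bell-state circuit in Qiskit...
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)

# ...and teammate B pulls it into Cirq via OpenQASM 2.
cirq_circuit = circuit_from_qasm(dumps(qc))
print(cirq_circuit)
```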
3. Will it advance as the technology advances?
As hardware and software continue to advance, will the framework advance with them? Could you end up stuck with code tightly coupled to the wrong hardware or software toolset? Workflows written today should not need to be rewritten; instead, they should let you swap in new elements as the technology matures. In other words, you should be able to “upgrade the machinery” as quantum devices perform better over time. Expect that the best tools and algorithms for your use case are yet to come, and aim to be poised to take advantage.
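One common way to keep that option open is to write workflow code against an abstraction layer rather than a specific device. The sketch below is hypothetical; none of these class or function names come from a real library, and it only illustrates the pattern:

```python
# A hypothetical abstraction layer; none of these names come from a real
# library. The workflow codes against the interface, so "upgrading the
# machinery" later means adding one new backend class, not a rewrite.
from abc import ABC, abstractmethod

class QuantumBackend(ABC):
    @abstractmethod
    def run(self, circuit, shots: int) -> dict:
        """Execute a circuit and return measurement counts."""

class LocalSimulator(QuantumBackend):
    def run(self, circuit, shots: int) -> dict:
        # Placeholder: delegate to whatever simulator you use today.
        return {"00": shots}

def run_workflow(backend: QuantumBackend, circuit) -> dict:
    # Workflow logic never names a vendor or device.
    return backend.run(circuit, shots=1024)

print(run_workflow(LocalSimulator(), circuit="bell"))
```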
4. Does it give you access to the best hardware, down to bare metal?
Make sure the quantum toolset you pick interfaces with the best hardware available today so you can experiment on top-of-the-line quantum machines. In addition, make sure you are allowed access down to “bare metal.” Zapata’s scientists often find the best performance comes from hardware-smart algorithms that leverage the strengths of a device’s specific architecture. High-level toolsets offer a great service by abstracting away complexity; the best, however, don’t force that abstraction on you when you need to get under the hood.
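For instance, even a high-level circuit can be compiled down toward a device’s native gate set when you want to reason closer to the metal. A minimal sketch, assuming Qiskit; the basis gates below are an assumption chosen to resemble a typical superconducting device:

```python
# A minimal sketch of getting closer to the metal with Qiskit's transpiler.
# The basis gates below are an assumption resembling many superconducting
# devices; substitute your target hardware's actual native set.
from qiskit import QuantumCircuit, transpile

qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)

# Compile the abstract circuit down to the device-native basis.
native = transpile(qc, basis_gates=["rz", "sx", "x", "cx"], optimization_level=3)
print(native)
```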
5. Can it run in your own environment?
The pursuit of real business problems means using real data and going beyond a small proof of concept. If the toolset can only be accessed from the cloud, will it comply with your internal security standards? If it can be hosted in your own environment (either on-premises or in your own cloud), you will have the control you need to meet your organization’s security and performance constraints.
6. Can it scale to your problem size?
Because quantum experiments can require significant computational resources, make sure your quantum toolset can scale to match your problem size. As problems grow beyond the “toy” stage, experiments can take hours or days to complete. Interactive tools like Jupyter notebooks, though fantastic for many data science applications, become burdensome in this regime. If you can’t easily run tasks asynchronously, in parallel, or on large-scale HPC hardware, expect lost time and resources for your team. A toolset that includes tools for parallelization and automatic scaling anticipates these growing pains.
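Even without a dedicated orchestration platform, the shape of the problem is easy to see. The hypothetical sketch below fans a grid of independent experiment configurations out across local processes; run_experiment is a stand-in for whatever executes one configuration:

```python
# A hypothetical sketch of fanning independent experiment configurations
# out across local processes; run_experiment stands in for whatever
# executes a single circuit configuration.
from concurrent.futures import ProcessPoolExecutor, as_completed

def run_experiment(params: dict) -> dict:
    # Placeholder: build a circuit from params, execute it, return results.
    return {"params": params, "value": 0.0}

param_grid = [{"depth": d, "seed": s} for d in range(1, 5) for s in range(10)]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        futures = [pool.submit(run_experiment, p) for p in param_grid]
        results = [f.result() for f in as_completed(futures)]
    print(f"completed {len(results)} runs")
```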
7. Will it help you manage your data?
Quantum experiments are notorious for generating large datasets. If your toolset doesn’t help you handle data storage and management, plan to find or engineer tools to store your data properly and support your analytics process. At Zapata, we’ve learned that getting experiments to run successfully is just the first step. Our customers appreciate Orquestra’s ability to develop and manage large quantum datasets. The conclusions in our recent paper on quantum estimation were supported by the execution of over 10,000 containers and over a million lines of resulting data.
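If you do end up rolling your own, the minimum bar is that every task writes a self-describing record so downstream analytics can aggregate runs. A hypothetical sketch; the field names here are assumptions, not a prescribed schema:

```python
# A hypothetical storage discipline: each task writes a self-describing
# JSON record. The field names here are assumptions, not a fixed schema.
import json
import time
import uuid
from pathlib import Path

def save_result(result: dict, outdir: Path = Path("results")) -> Path:
    outdir.mkdir(parents=True, exist_ok=True)
    record = {
        "id": str(uuid.uuid4()),   # unique key for later aggregation
        "timestamp": time.time(),  # when the task finished
        "result": result,          # the task's actual payload
    }
    path = outdir / f"{record['id']}.json"
    path.write_text(json.dumps(record))
    return path

save_result({"energy": -1.137, "shots": 8192})
```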
8. Will it help you pursue Quantum Advantage?
Quantum Advantage is the big payout: useful applications for quantum computers that outperform classical computers. The toolset you choose should be pushing toward that goal right along with you. Be sure it is built for a range of use cases and puts no limit on what you can pull in from other libraries and frameworks. Investigate whether there are proprietary or pre-built implementations of NISQ algorithms, which could provide a significant head start in attacking nearer-term use cases. For instance, our z-quantum-qcbm resource has a pre-built Quantum Circuit Born Machine (QCBM) model you can use as a springboard.
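To make the idea concrete, here is a toy sketch of the structure of a QCBM-style training loop. This is emphatically not the z-quantum-qcbm API: a classical softmax stands in for sampling a parameterized quantum circuit so the example runs anywhere, but the loop itself (estimate the model distribution, compare it against the data distribution, update the parameters) is the part that carries over:

```python
# A toy sketch of a QCBM-style training loop, NOT the z-quantum-qcbm API.
# A classical softmax stands in for sampling a parameterized circuit so
# the example runs anywhere; the loop structure is what carries over.
import numpy as np

target = np.array([0.5, 0.0, 0.0, 0.5])  # "data" distribution over 2 bits

def model_distribution(theta: np.ndarray) -> np.ndarray:
    # Stand-in for estimating the circuit's output distribution.
    return np.exp(theta) / np.exp(theta).sum()

def loss(theta: np.ndarray, eps: float = 1e-9) -> float:
    # KL divergence between the data and model distributions.
    q = model_distribution(theta)
    return float(np.sum(target * np.log((target + eps) / (q + eps))))

theta = np.random.default_rng(0).normal(size=4)
for _ in range(500):
    grad = np.zeros_like(theta)
    for i in range(len(theta)):  # finite-difference gradient estimate
        d = np.zeros_like(theta)
        d[i] = 1e-4
        grad[i] = (loss(theta + d) - loss(theta - d)) / 2e-4
    theta -= 0.5 * grad

print(model_distribution(theta))  # approaches the target distribution
```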
As mentioned, we built Orquestra to support our own customer work because we could not find a set of tools that was unified, scalable, and extensible enough. Here’s how Orquestra was purpose-built to evolve alongside the quantum hardware, software, and algorithm ecosystem.
One last time: these questions come from real Orquestra users, enterprise technology decision makers, and the feedback we’ve received from experts in the field. Orquestra is driven by the wants and needs of quantum computing researchers, which is why it is uniquely positioned to help companies accelerate their very own Quantum Revolution.
We’ve done our best to capture our customers’ most frequent questions, but we’re also curious to hear yours, so please reach out on Twitter or here on the blog; we’ll respond as we receive them.