As a quantum research scientist with a specialization in AI/ML, I am never far from an algorithm. In fact, algorithms play a huge part in what I do every day. This is especially true in the nascent – but growing – field of quantum computing.
This actually isn’t surprising, since quantum algorithms are the primary tools for solving mathematical problems on a quantum computer. What is less widely discussed about quantum algorithms, however, is their role in driving an entire computational workflow, one that ingests real data and produces the outputs and dashboards that enterprise decision-makers need.
Workflows, in the quantum/classical computing sense, are less widely written about and discussed than algorithms, but they are a big deal when it comes to making use of quantum computing today and preparing for the more powerful machines of the future. You could even call them critically important to quantum computing’s future usefulness. That’s why I think the industry needs a mindset change: from algorithms to workflows.
The classical computing piece of the workflow puzzle is, from what I’ve seen and heard, underrated in its significance. It’s important to keep in mind that quantum computing as it exists today is relatively powerful for certain use cases, such as optimization, ML and simulation problems, but it does not operate in a vacuum. In fact, it currently relies on a great deal of classical computing working alongside it, and likely always will.
As my colleague Christopher Savoie recently wrote on the topic of classical computing’s role in any quantum computing framework, “…quantum computing requires an immense amount of classical work that goes unnoticed even though its presence looms large, like the proverbial ‘elephant in the room’ that nobody acknowledges.”
There’s an analogy I often use to explain the dynamic between classical and quantum computing. Imagine a classical computer as a bus and a quantum computer as an airplane. Most of the time, the airplane behaves like a bus: loading and unloading passengers, driving to the tarmac, and so on. It only uses its “airplane” capabilities when it hits a bottleneck that a bus can’t solve (e.g., flying from New York to Paris).
In the same way, a quantum algorithm is, most of the time, a classical algorithm: loading and outputting data, processing numbers, performing calculations, and so on. It only invokes the “quantum” part when it reaches a bottleneck that a classical computer can’t solve (or would take many years to solve).
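To make the shape of this concrete, here is a minimal sketch of such a hybrid workflow in plain Python. Everything here is illustrative: the function names are hypothetical, and the “quantum” step is stubbed out with a classical stand-in, since in practice it would dispatch a circuit to quantum hardware or a simulator.

```python
# Minimal sketch of a hybrid quantum/classical workflow.
# The "quantum" step is a classical stand-in; in a real workflow it
# would submit a circuit to a quantum backend and return a measurement.

def classical_preprocess(raw_data):
    """Classical ETL: clean and normalize the input (the 'bus' work)."""
    biggest = max(raw_data)
    return [x / biggest for x in raw_data]

def quantum_subroutine(features):
    """Placeholder for the hard kernel a quantum computer would handle
    (the 'airplane' leg). Here we just compute a toy cost value."""
    return sum(f * f for f in features)

def classical_postprocess(result):
    """Classical reporting: turn the raw result into a decision-ready output."""
    return {"score": round(result, 4)}

def run_workflow(raw_data):
    features = classical_preprocess(raw_data)   # classical
    result = quantum_subroutine(features)       # the only 'quantum' hop
    return classical_postprocess(result)        # classical again

print(run_workflow([2.0, 4.0, 8.0]))  # prints {'score': 1.3125}
```

The point of the structure is that the quantum call is one hop in a pipeline that is otherwise entirely classical, which is exactly why the classical pieces deserve as much engineering attention as the algorithm itself.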
A successful shift from an algorithm mindset to a workflow mindset requires collaboration among many different stakeholders. I find that the most effective workflow architectures help individuals focus on their areas of expertise as part of a larger effort. For example, an algorithm expert, an ETL expert in supply chain management, and a domain expert in delivery logistics may all contribute to the creation of one quantum-enhanced optimization (QEO) solution. In fact, a group of us here at Zapata Computing worked on one such project in the consumer beverage space.
The reality is that the newest and most exciting quantum algorithm alone is not going to provide a computational speed-up on an architecture that is bogged down by data ingestion because the DSML infrastructure and ETL workflow are poorly constructed. A holistic, collaborative approach to integrating quantum computing into workflows will be increasingly important to organizations looking to win; otherwise, the classical overhead can undo any advantage the quantum steps produced.
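A rough back-of-the-envelope way to see why classical overhead matters so much is Amdahl’s law: if only a fraction of a workflow’s runtime is amenable to quantum acceleration, the end-to-end speedup is capped by everything else. The numbers below are purely hypothetical, chosen only to illustrate the shape of the effect.

```python
# Amdahl's-law style estimate: even a huge speedup on one kernel yields
# modest end-to-end gains if classical overhead (ETL, I/O) dominates.
# All numbers here are hypothetical.

def end_to_end_speedup(quantum_fraction, kernel_speedup):
    """Overall speedup when only `quantum_fraction` of total runtime
    is accelerated by a factor of `kernel_speedup` (Amdahl's law)."""
    return 1.0 / ((1.0 - quantum_fraction) + quantum_fraction / kernel_speedup)

# Suppose 30% of the workflow's runtime is the quantum-amenable kernel,
# and the quantum step makes that kernel 1000x faster. The remaining
# classical 70% caps the overall gain at well under 1.5x:
print(round(end_to_end_speedup(0.30, 1000.0), 2))  # prints 1.43
```

This is why tightening the ETL and data-ingestion stages can matter as much as, or more than, the quantum algorithm itself.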
I’m seeing signs of this mindset change more frequently in my conversations with colleagues and enterprise leaders. That’s a good sign, but I can’t emphasize enough how important it is. That’s why I wrote this post! Please share with anyone you know who has an interest or role in solving complex problems with quantum computing.