The Quantum Pod: Yudong Cao on Generative AI and Tensor Networks

In the latest episode of The Quantum Pod, Zapata CTO Yudong Cao takes the listener on a deep dive into enterprise use cases incorporating generative AI, large language models, synthetic data, tensor networks – and where quantum fits into all of this.

We’ve distilled a few of his answers here (lightly edited for clarity) for those with 5 minutes to spare. But we encourage everyone to listen to the pod to maximize the insights — and there are many! 

Q: Does generative AI have enterprise use cases? 

Yudong: Yes, generative AI does have a broad set of enterprise use cases. Large language models, which are the focus of significant buzz these days, can be used in enterprises for designing new products or performing internal language-based tasks, like reviewing documents or performing some of the functions of databases and expert systems.

But generative AI goes beyond language-based tasks. Of particular interest to many enterprises is the fact that you can generate synthetic data for training machine learning models. My colleagues and I at Zapata have been paying very close attention to this use of generative models – in both research and customer engagements. We're also exploring the question: can generative AI tackle hard combinatorial optimization problems, and problems where you need to make decisions while managing conflicting constraints? These are the kinds of problems enterprises face globally.

For example, say you want to maximize the number of shipments that go through a logistics network but, at the same time, you also want to minimize the cost. This includes minimizing the number of trucks that you use to ship things from one place to another and minimizing the warehouse inventory. These are all conflicting goals. And it's these sorts of conflicting constraints that really make these problems mathematically challenging.
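
As a toy illustration of how such conflicting goals can be folded into a single objective (the weights and the truck/inventory assumptions below are all made up for this sketch):

```python
import numpy as np

# Toy logistics objective: reward shipment volume, penalize truck usage
# and warehouse inventory. Every number here is made up for illustration.
def cost(x, shipment_value=10.0, truck_cost=4.0, inventory_cost=1.5):
    """x is a binary vector: x[i] = 1 if shipping route i is used."""
    shipments = np.sum(x)            # each active route carries one shipment
    trucks = np.ceil(shipments / 3)  # assume one truck can serve three routes
    inventory = 2.0 * shipments      # assume inventory scales with volume
    # The first term rewards volume; the others penalize it -- conflicting goals.
    return -shipment_value * shipments + truck_cost * trucks + inventory_cost * inventory

x = np.array([1, 0, 1, 1, 0])
print(cost(x))  # lower is better; brute force over all 2^n vectors is what makes this hard
```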

Computer scientists came up with the label "NP-hard" to generically describe how difficult these problems are. But FedEx still needs to deliver its packages, Amazon still needs to route its orders efficiently, and banks need to optimize their portfolios. So, in reality, we need to grapple with the complexity one way or another. And this is where I see generative AI playing a pivotal role.

The concrete work that Zapata has done in the past few years demonstrates that you can actually combine AI with more traditional forms of optimization and traditional algorithms to enhance their ability to handle these kinds of complexity.
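
A minimal sketch of what such a hybrid loop can look like. Here the "generative model" is reduced to simple per-bit frequencies over the best solutions seen so far; a real setup would use a deep or tensor-network generative model, and this is not Zapata's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def score(x):
    """Placeholder objective: any black-box cost over binary vectors."""
    return -np.sum(x) + 2.0 * np.ceil(np.sum(x) / 3)

# 1. Start from candidate solutions found by a traditional heuristic (here: random).
population = rng.integers(0, 2, size=(50, 10))

for step in range(20):
    # 2. "Train" the generative model on the elite (lowest-cost) solutions.
    elite = population[np.argsort([score(x) for x in population])[:10]]
    p = elite.mean(axis=0)               # per-bit sampling frequencies
    # 3. Sample new candidates from the model and merge them back in.
    samples = (rng.random((50, 10)) < p).astype(int)
    population = np.vstack([elite, samples])

best = min(population, key=score)
print(best, score(best))
```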

Q: Given that today’s quantum computers have limited capabilities, how long will it take for enterprises to actually take advantage of quantum generative modeling techniques?

Yudong: It’s true that quantum computers today are still limited in their computational power. But the good news for enterprises is that they don’t necessarily need to wait for fault-tolerant quantum computers in order to start seeing advantages today.  

The reason I say this is due to quantum-inspired methods. We have concrete research showing that tensor networks — a type of data structure that mimics the behavior of quantum systems and quantum computers, and can be implemented on existing classical hardware such as CPUs and GPUs — can unlock a lot of value for generative modeling.

Q: Can you provide some deeper context about tensor networks? 

Yudong: A tensor is just a high-dimensional array. If you can program and define an array, then you can give it however many dimensions you like. When you do that, you have created a tensor. Tensors are everywhere — you can create one with a line of code. They are already integrated in many of the large language models, for example.
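
For instance, in Python with NumPy, that line of code might be:

```python
import numpy as np

# A rank-4 tensor: just an array with four dimensions.
T = np.zeros((2, 3, 4, 5))
print(T.ndim, T.shape)  # 4 (2, 3, 4, 5)
```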

So, the tensor itself is just a high-dimensional data structure, nothing special. The hard part is figuring out a clever way of connecting tensors so that the resulting network is most effective at representing correlations between variables.
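
A minimal sketch of what "connecting tensors" means: two tensors joined through a shared bond index, contracted into a joint distribution over two binary variables. The bond dimension and values here are illustrative; this is the basic building block of a matrix product state:

```python
import numpy as np

rng = np.random.default_rng(0)
bond = 2  # the bond dimension controls how much correlation the network can carry

# Two tensors, one per binary variable, joined through a shared bond index k.
A = rng.random((2, bond))   # indices: (value of variable 1, bond)
B = rng.random((bond, 2))   # indices: (bond, value of variable 2)

# Contracting over the bond index produces an (unnormalized) joint table.
joint = np.einsum('ik,kj->ij', A, B)
joint /= joint.sum()
print(joint)  # P(x1, x2); larger bond dimensions capture richer correlations
```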

These techniques were pioneered by physicists to simulate very complex quantum systems and make progress on some of the fundamental problems in high energy physics. So, they're quite well known in the domain of quantum physics and quantum chemistry. But they're not so common in the domain of machine learning. The use of tensor networks for machine learning is a relatively recent trend.

Because there are such efficient linear algebra structures for representing complex correlations with tensor networks — as opposed to neural networks, which always rely on some sort of non-linearity to achieve their expressive power — I think there’s great potential for tensor networks to enhance existing deep learning solutions. This is especially true for those solutions that require very large neural networks, for instance, the sort of large language models that have gained a lot of popularity today.  

Everybody talks about using foundation models like GPT-3 and now GPT-4 to do things, but the flip side is that generative AI applications like ChatGPT are burning millions of dollars to keep up with demand, because every single query to those large language models costs orders of magnitude more than a single search engine query. That is the problem being swept under the rug. I'm curious to see what GPT-4, with its trillions of parameters, costs.

That said, there's a very sharp demand for making these networks smaller. Zapata has done a lot of work in this area, and we believe that tensor networks represent a promising path to substantially reduce the cost of those large language models. And by substantial, I mean orders of magnitude reductions in the sizes of the models, their energy footprints, their inference time, and the effort it takes to train them.
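
To make the compression idea concrete, here is a minimal sketch using a truncated SVD of a single weight matrix — the simplest special case of a tensor-network factorization. Sizes and rank are illustrative:

```python
import numpy as np

# Compress one dense layer with a truncated SVD. Real trained weights are
# often approximately low-rank; this random matrix is not, so the point
# here is only the parameter count, not reconstruction quality.
W = np.random.rand(1024, 1024)             # ~1M parameters
U, s, Vt = np.linalg.svd(W, full_matrices=False)

r = 64                                      # retained rank
W1 = U[:, :r] * s[:r]                       # shape (1024, r)
W2 = Vt[:r, :]                              # shape (r, 1024)

original, compressed = W.size, W1.size + W2.size
print(f"{original} -> {compressed} parameters ({original / compressed:.0f}x fewer)")
# At inference, x @ W1 @ W2 replaces x @ W, cutting both memory and FLOPs.
```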

Of course, the devil is in the details, and we can't say for sure until we've done the benchmarking. But we have seen enough evidence to be confident that tensor networks have the potential to dramatically improve current deep learning solutions.

That’s a primer on tensor networks. I also have an additional comment on how they connect to quantum computing. 

The key connector is linear algebraic structure, because tensor networks essentially share a lot of similarities with quantum circuits. And there’s a sense in which a quantum circuit can be represented by a tensor network. 

We've recently worked out a very close connection where you can take a matrix product state, which is a specific form of tensor network, and then map it onto a quantum circuit. The quantum computer then improves the expressibility, or the richness, of the model. In other words, it's like a relay race where you start with tensor networks on a classical computer, train the model to a certain point, pass the baton to the quantum computer, and then the quantum computer trains the model even further.
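
To make the classical leg of that relay concrete, here is a minimal sketch of how a matrix product state assigns an amplitude to a bitstring by multiplying matrices along a chain (shapes and values are illustrative, and the circuit mapping itself is beyond this snippet):

```python
import numpy as np

rng = np.random.default_rng(1)
n, bond = 4, 3  # 4 binary variables, bond dimension 3

# One rank-3 tensor per site: (left bond, physical value, right bond).
# Boundary sites have trivial (size-1) outer bonds.
cores = [rng.standard_normal((1 if i == 0 else bond, 2,
                              1 if i == n - 1 else bond)) for i in range(n)]

def amplitude(bits):
    """Contract the MPS along the chain for one configuration."""
    m = cores[0][:, bits[0], :]
    for i in range(1, n):
        m = m @ cores[i][:, bits[i], :]
    return m[0, 0]

print(amplitude([0, 1, 1, 0]))
# Born-rule probabilities |amplitude|^2 make this a generative model; the
# same tensors can then seed a quantum circuit that is trained further.
```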

The result is that you can capture richer correlations than you could with classical computers alone. So, we see a very smooth transition from classical devices to quantum. And, in the future, as quantum hardware improves, we certainly look forward to additional enhancements over existing linear algebra structures in machine learning.

———————————–

For more insights on generative AI, optimization, AI training, Yudong’s home office set-up (not voice-activated – yet) and more, check out the podcast. 

To learn more about how your organization can take advantage of generative AI, tensor networks, quantum and other cutting-edge technology, contact us.