Co-Authors:
Chris Ballance, Kaitlin Gili, Mohamed Hibat-Allah
With the rise of quantum computers as potential competitors to classical computers, a new research area called quantum machine learning (QML) has emerged. This field aims to perform machine learning (ML) tasks on quantum computers, much as neural networks perform them on classical computers. Since its infancy, rapid research developments have yielded evidence supporting quantum advantage over classical methods, along with significant progress toward applying hybrid quantum-classical approaches to industrial-scale datasets.
Despite this progress, very little work has been done to understand generalization in quantum circuits, particularly for the task of generative modeling. The goal of generative modeling is to find a suitable ML model that captures the distribution underlying a dataset from a limited number of samples, so that it can generate new, valid, and high-quality samples.
Recently, Gili et al., 2022 developed a framework for quantifying generalization with a well-defined approach that puts classical, quantum-inspired, and quantum generative models on equal footing. To address the knowledge gap in this research area, we used this framework in the first attempt to quantify the generalization capabilities of quantum-circuit-based generative models.
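To make this concrete, here is a minimal sketch of validity- and novelty-style generalization checks in the spirit of that framework. The function names, the metric names, and the toy cardinality constraint are illustrative assumptions for this post, not the paper's exact definitions or API:

```python
from typing import Iterable

def is_valid(bitstring: str, k: int) -> bool:
    """Validity under a toy cardinality constraint: exactly k ones (assumption)."""
    return bitstring.count("1") == k

def generalization_metrics(train_set: Iterable[str], generated: Iterable[str], k: int) -> dict:
    train = set(train_set)
    generated = list(generated)

    valid = [s for s in generated if is_valid(s, k)]
    novel = [s for s in valid if s not in train]  # valid AND unseen during training

    return {
        # Fraction of generated samples that satisfy the constraint at all.
        "validity": len(valid) / len(generated),
        # Fraction of valid samples that are new, i.e., evidence of generalization
        # rather than memorization of the training set.
        "novelty": len(novel) / max(len(valid), 1),
        # Number of distinct unseen-but-valid patterns discovered.
        "unique_novel": len(set(novel)),
    }

# Toy example: 4-bit strings constrained to cardinality k = 2.
train = ["0011", "0101"]
samples = ["0011", "0110", "1010", "1100", "1110"]
print(generalization_metrics(train, samples, k=2))
# -> {'validity': 0.8, 'novelty': 0.75, 'unique_novel': 3}
```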
In this work [2], we first demonstrate that Quantum Circuit Born Machines (QCBMs) [3] can learn valid patterns and generate novel samples. Remarkably, with as little as 30% of the data from a cardinality-constrained target distribution, the QCBM already reaches excellent generalization performance on 12-qubit models. We show a systematic improvement in performance as the number of layers in our QCBMs increases, providing evidence of the interplay between the expressivity and generalization capabilities of quantum generative models. Furthermore, we demonstrate that QCBMs can generalize and generate samples of higher quality than those in the training set. The latter property can be very helpful when solving combinatorial optimization problems, where the goal is to minimize a given cost function. Applying these quantum generative models to combinatorial optimization is plausible within the Generator-Enhanced Optimization framework proposed in Ref. [4]. As Gili et al., 2022 demonstrated a quantum-inspired advantage over some classical models in terms of generalization, we foresee that the generalization capabilities of QCBMs could be an asset in the race toward practical quantum advantage in generative modeling and in combinatorial optimization.
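For readers who want to see what training such a model looks like, below is a minimal QCBM sketch using PennyLane (assumed available). The hardware-efficient layered ansatz, the KL-divergence loss, and the 4-qubit toy target are assumptions made for illustration; they are not the circuit, loss, or problem sizes used in the paper:

```python
# Minimal QCBM sketch: a layered parameterized circuit whose measurement
# probabilities (the Born distribution) are trained toward a target
# distribution. Ansatz and loss are illustrative assumptions.
import pennylane as qml
from pennylane import numpy as np

n_qubits, n_layers = 4, 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qcbm(params):
    # Each layer: parameterized single-qubit rotations + a linear CNOT chain.
    for layer in range(n_layers):
        for w in range(n_qubits):
            qml.RY(params[layer, w, 0], wires=w)
            qml.RZ(params[layer, w, 1], wires=w)
        for w in range(n_qubits - 1):
            qml.CNOT(wires=[w, w + 1])
    # Born distribution over all 2^n bitstrings.
    return qml.probs(wires=range(n_qubits))

def kl_loss(params, target):
    # KL(target || model); small epsilons for numerical stability.
    probs = qcbm(params)
    return np.sum(target * np.log(target / (probs + 1e-10) + 1e-10))

# Toy target: uniform over the cardinality-2 subset of 4-bit strings,
# mimicking a cardinality-constrained distribution at small scale.
target = np.zeros(2 ** n_qubits)
for i in range(2 ** n_qubits):
    if bin(i).count("1") == 2:
        target[i] = 1.0
target /= target.sum()

params = np.array(
    np.random.uniform(0, 2 * np.pi, (n_layers, n_qubits, 2)),
    requires_grad=True,
)
opt = qml.GradientDescentOptimizer(stepsize=0.2)
for step in range(50):
    params = opt.step(lambda p: kl_loss(p, target), params)
print("final KL:", kl_loss(params, target))
```

Adding layers to the ansatz increases the circuit's expressivity, which is the knob behind the layer-wise improvement in generalization reported above; samples drawn from the trained circuit can then be scored with validity/novelty checks like the sketch earlier in this post.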
References: