Paper Title

To quantum or not to quantum: towards algorithm selection in near-term quantum optimization

Authors

Charles Moussa, Henri Calandra, Vedran Dunjko

Abstract

The Quantum Approximate Optimization Algorithm (QAOA) constitutes one of the often mentioned candidates expected to yield a quantum boost in the era of near-term quantum computing. In practice, quantum optimization will have to compete with cheaper classical heuristic methods, which have the advantage of decades of empirical domain-specific enhancements. Consequently, to achieve optimal performance we will face the issue of algorithm selection, well-studied in practical computing. Here we introduce this problem to the quantum optimization domain. Specifically, we study the problem of detecting those problem instances where QAOA is most likely to yield an advantage over a conventional algorithm. As our case study, we compare QAOA against the well-understood approximation algorithm of Goemans and Williamson (GW) on the Max-Cut problem. As exactly predicting the performance of algorithms can be intractable, we utilize machine learning to identify when to resort to the quantum algorithm. We achieve cross-validated accuracy well over 96%, which would yield a substantial practical advantage. In the process, we highlight a number of features of instances rendering them better suited for QAOA. While we work with simulated idealised algorithms, the flexibility of the ML methods we employ provides confidence that our methods will be equally applicable to broader classes of classical heuristics, and to QAOA running on real-world noisy devices.
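For context on the classical baseline, the following is a minimal sketch of the Goemans-Williamson algorithm for Max-Cut: solve the semidefinite relaxation, then round with a random hyperplane. It assumes unweighted graphs for simplicity and that cvxpy (with its bundled SDP-capable solver) and networkx are available; it is not the paper's implementation, just an illustration of what GW does.

```python
import cvxpy as cp
import networkx as nx
import numpy as np

def goemans_williamson_cut(g: nx.Graph, seed: int = 0):
    """One round of GW: solve the Max-Cut SDP relaxation, then apply
    random-hyperplane rounding (a 0.878-approximation in expectation)."""
    nodes = list(g.nodes())
    idx = {v: i for i, v in enumerate(nodes)}
    n = len(nodes)
    # SDP relaxation: maximize sum over edges (u, v) of (1 - X[u, v]) / 2
    # over PSD matrices X with unit diagonal.
    X = cp.Variable((n, n), PSD=True)
    objective = cp.Maximize(
        sum((1 - X[idx[u], idx[v]]) / 2 for u, v in g.edges())
    )
    cp.Problem(objective, [cp.diag(X) == 1]).solve()
    # Factor X ~= V V^T; row i of V is the embedding vector for node i.
    w, U = np.linalg.eigh(X.value)
    V = U * np.sqrt(np.clip(w, 0, None))
    # Random hyperplane: the sign of each vector's projection picks a side.
    r = np.random.default_rng(seed).standard_normal(n)
    side = (V @ r) >= 0
    cut_value = sum(1 for u, v in g.edges() if side[idx[u]] != side[idx[v]])
    return cut_value, {v: bool(side[idx[v]]) for v in nodes}

# Example on a small random graph:
g = nx.gnp_random_graph(12, 0.5, seed=1)
print(goemans_williamson_cut(g)[0])
```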

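The abstract's algorithm-selection idea — predict from cheap instance features whether QAOA or GW should be run — can be sketched as a standard supervised-learning pipeline. Everything below is an illustrative assumption rather than the paper's exact setup: the feature set (density, degree statistics, clustering) is hand-picked, and the labels are placeholders, whereas in the paper labels come from actually running QAOA and GW on each instance and the reported cross-validated accuracy exceeds 96%.

```python
import networkx as nx
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def instance_features(g: nx.Graph) -> list:
    """Cheap structural features of a Max-Cut instance (illustrative choice)."""
    degrees = [d for _, d in g.degree()]
    return [
        g.number_of_nodes(),
        g.number_of_edges(),
        nx.density(g),
        float(np.mean(degrees)),
        float(np.std(degrees)),
        nx.transitivity(g),  # global clustering coefficient
    ]

# Hypothetical dataset: random graphs labelled 1 if QAOA beat GW on that
# instance, 0 otherwise. Here the labels are random placeholders; in the
# actual pipeline they would be obtained by running both algorithms.
rng = np.random.default_rng(0)
graphs = [nx.gnp_random_graph(20, rng.uniform(0.1, 0.9), seed=int(s))
          for s in rng.integers(0, 10**6, size=200)]
X = np.array([instance_features(g) for g in graphs])
y = rng.integers(0, 2, size=len(graphs))  # placeholder labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f}")
```

With real labels, inspecting the trained classifier's feature importances is one way to surface which instance features favour QAOA, in the spirit of the features highlighted in the paper.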