Paper Title
On Representing Linear Programs by Graph Neural Networks
Paper Authors
Paper Abstract
Learning to optimize is a rapidly growing area that aims to solve optimization problems or improve existing optimization algorithms using machine learning (ML). In particular, the graph neural network (GNN) is considered a suitable ML model for optimization problems whose variables and constraints are permutation-invariant, for example, the linear program (LP). While the literature has reported encouraging numerical results, this paper establishes the theoretical foundation for applying GNNs to solve LPs. Given any size limit on LPs, we construct a GNN that maps different LPs to different outputs. We show that properly built GNNs can reliably predict feasibility, boundedness, and an optimal solution for each LP in a broad class. Our proofs are based on the recently discovered connections between the Weisfeiler-Lehman isomorphism test and the GNN. To validate our results, we train a simple GNN and present its accuracy in mapping LPs to their feasibility and solutions.
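To make the LP-to-graph idea concrete, the sketch below encodes an LP as a bipartite constraint-variable graph and runs a tiny permutation-invariant message-passing network over it. It is a minimal illustration with untrained random weights in plain NumPy: the names (lp_to_bipartite_graph, TinyLPGNN), the node features, the layer sizes, and the feasibility readout are assumptions for illustration only, not the authors' actual architecture or training setup.

```python
import numpy as np

def lp_to_bipartite_graph(A, b, c):
    """Encode an LP  min c^T x  s.t.  A x <= b  as a bipartite graph.

    Constraint node i gets feature b_i; variable node j gets feature c_j;
    edge (i, j) carries weight A_ij. (A common LP-as-graph encoding;
    the paper's exact feature choices may differ.)
    """
    con_feat = b.reshape(-1, 1)   # one scalar feature per constraint node
    var_feat = c.reshape(-1, 1)   # one scalar feature per variable node
    edge_weight = A               # dense m x n weighted adjacency
    return con_feat, var_feat, edge_weight


class TinyLPGNN:
    """Two rounds of message passing plus a graph-level readout.

    Weights are random (untrained); this only illustrates the
    permutation-invariant architecture, not a trained predictor.
    """

    def __init__(self, dim=8, seed=0):
        rng = np.random.default_rng(seed)
        self.W_con_in = rng.normal(size=(1, dim))
        self.W_var_in = rng.normal(size=(1, dim))
        self.W_con = rng.normal(size=(2 * dim, dim))
        self.W_var = rng.normal(size=(2 * dim, dim))
        self.W_out = rng.normal(size=(2 * dim, 1))

    def forward(self, con_feat, var_feat, edge_weight):
        relu = lambda z: np.maximum(z, 0.0)
        h_con = relu(con_feat @ self.W_con_in)   # (m, dim)
        h_var = relu(var_feat @ self.W_var_in)   # (n, dim)
        for _ in range(2):
            # constraints aggregate A-weighted messages from variables, and vice versa
            msg_to_con = edge_weight @ h_var      # (m, dim)
            msg_to_var = edge_weight.T @ h_con    # (n, dim)
            h_con = relu(np.hstack([h_con, msg_to_con]) @ self.W_con)
            h_var = relu(np.hstack([h_var, msg_to_var]) @ self.W_var)
        # permutation-invariant readout: mean-pool both node sets, then a linear head
        pooled = np.hstack([h_con.mean(axis=0), h_var.mean(axis=0)])
        logit = float(pooled @ self.W_out)
        return 1.0 / (1.0 + np.exp(-logit))       # e.g. a predicted feasibility score


if __name__ == "__main__":
    # Toy LP: min x1 + x2  s.t.  x1 + 2*x2 <= 4,  3*x1 + x2 <= 6
    A = np.array([[1.0, 2.0], [3.0, 1.0]])
    b = np.array([4.0, 6.0])
    c = np.array([1.0, 1.0])
    gnn = TinyLPGNN()
    print("predicted feasibility score:", gnn.forward(*lp_to_bipartite_graph(A, b, c)))
```

Because the readout mean-pools node embeddings and messages are aggregated symmetrically over neighbors, relabeling the LP's variables or constraints leaves the output unchanged, which is the permutation-invariance property the abstract refers to.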