Paper Title
Scalable algorithms for physics-informed neural and graph networks
Paper Authors
Paper Abstract
Physics-informed machine learning (PIML) has emerged as a promising new approach for simulating complex physical and biological systems governed by complex multiscale processes for which some data are also available. In some instances, the objective is to discover part of the hidden physics from the available data, and PIML has been shown to be particularly effective for such problems, where conventional methods may fail. Unlike commercial machine learning, where training deep neural networks requires big data, in PIML big data are not available. Instead, we can train such networks from additional information obtained by employing the physical laws and evaluating them at random points in the space-time domain. Such physics-informed machine learning integrates multimodality and multifidelity data with mathematical models, and implements them using neural networks or graph networks. Here, we review some of the prevailing trends in embedding physics into machine learning, using physics-informed neural networks (PINNs) based primarily on feed-forward neural networks and automatic differentiation. For more complex systems, or systems of systems, and for unstructured data, graph neural networks (GNNs) present some distinct advantages, and here we review how physics-informed learning can be accomplished with GNNs based on graph exterior calculus to construct differential operators; we refer to these architectures as physics-informed graph networks (PIGNs). We present representative examples for both forward and inverse problems and discuss what advances are needed to scale up PINNs, PIGNs and, more broadly, GNNs for large-scale engineering problems.
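To make the two main ideas in the abstract concrete, here is a minimal PINN sketch in JAX, assuming (as an illustration only, not the authors' implementation) a 1D viscous Burgers-type residual u_t + u u_x - nu u_xx = 0, a small feed-forward network, and hypothetical arrays of collocation and data points. It shows how automatic differentiation evaluates the physical law at random space-time points and how that residual is added to a data misfit in the training loss.

```python
# Minimal PINN sketch (illustrative assumption, not the paper's code):
# a feed-forward net u(t, x) trained against a PDE residual computed
# by automatic differentiation at random collocation points.
import jax
import jax.numpy as jnp

def init_params(key, layers=(2, 32, 32, 1)):
    # Simple dense-layer parameter initialization.
    params = []
    for m, n in zip(layers[:-1], layers[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (m, n)) * jnp.sqrt(2.0 / m)
        params.append((w, jnp.zeros(n)))
    return params

def net(params, t, x):
    # Feed-forward network with tanh activations; returns a scalar u(t, x).
    h = jnp.array([t, x])
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return (h @ w + b)[0]

def residual(params, t, x, nu=0.01 / jnp.pi):
    # PDE residual u_t + u*u_x - nu*u_xx via automatic differentiation.
    u = net(params, t, x)
    u_t = jax.grad(net, argnums=1)(params, t, x)
    u_x = jax.grad(net, argnums=2)(params, t, x)
    u_xx = jax.grad(jax.grad(net, argnums=2), argnums=2)(params, t, x)
    return u_t + u * u_x - nu * u_xx

def loss(params, t_col, x_col, t_dat, x_dat, u_dat):
    # Physics loss at collocation points plus a data/boundary misfit.
    r = jax.vmap(residual, in_axes=(None, 0, 0))(params, t_col, x_col)
    u_pred = jax.vmap(net, in_axes=(None, 0, 0))(params, t_dat, x_dat)
    return jnp.mean(r**2) + jnp.mean((u_pred - u_dat)**2)

# Training would minimize `loss` with any gradient-based optimizer, e.g.
# grads = jax.grad(loss)(params, t_col, x_col, t_dat, x_dat, u_dat)
```

For the graph side, a companion sketch (again only an assumed illustration of graph exterior calculus, not the paper's PIGN architecture) shows how the signed node-edge incidence matrix acts as a discrete gradient and how a weighted graph Laplacian, as the discrete divergence of that gradient, can stand in for autodiff-based differential operators on unstructured data; the toy graph and edge weights below are hypothetical.

```python
# Sketch of discrete differential operators from graph exterior calculus
# (illustrative assumption): D is the signed node-edge incidence matrix
# (discrete gradient) and L = D^T W D is a weighted graph Laplacian.
import jax.numpy as jnp

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]   # small example graph
n_nodes = 4

D = jnp.zeros((len(edges), n_nodes))
for k, (i, j) in enumerate(edges):
    D = D.at[k, i].set(-1.0)
    D = D.at[k, j].set(1.0)

W = jnp.eye(len(edges))                    # edge weights (could be learned)
L = D.T @ W @ D                            # graph Laplacian = div(grad)

u_nodes = jnp.array([0.0, 1.0, 2.0, 3.0])  # a field defined on graph nodes
grad_u = D @ u_nodes                       # edge-wise differences (discrete gradient)
lap_u = L @ u_nodes                        # discrete Laplacian of u at the nodes
```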