Paper Title

Weak deep priors for seismic imaging

Authors

Ali Siahkoohi, Gabrio Rizzuti, Felix J. Herrmann

Abstract

Incorporating prior knowledge on model unknowns of interest is essential when dealing with ill-posed inverse problems due to the nonuniqueness of the solution and data noise. Unfortunately, it is not trivial to fully describe our priors in a convenient and analytical way. Parameterizing the unknowns with a convolutional neural network (CNN), and assuming an uninformative Gaussian prior on its weights, leads to a variational prior on the output space that favors "natural" images and excludes noisy artifacts, as long as overfitting is prevented. This is the so-called deep-prior approach. In seismic imaging, however, evaluating the forward operator is computationally expensive, and training a randomly initialized CNN becomes infeasible. We propose, instead, a weak version of deep priors, which consists of relaxing the requirement that reflectivity models must lie in the network range, and letting the unknowns deviate from the network output according to a Gaussian distribution. Finally, we jointly solve for the reflectivity model and CNN weights. The chief advantage of this approach is that the updates for the CNN weights do not involve the modeling operator, and become relatively cheap. Our synthetic numerical experiments demonstrate that the weak deep prior is more robust with respect to noise than conventional least-squares imaging approaches, with roughly twice the computational cost of reverse-time migration, which is the affordable computational budget in large-scale imaging problems.
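The joint objective sketched in the abstract can be illustrated with a minimal numpy toy problem: minimize 0.5‖Jx − d‖² + (λ/2)‖x − g(w)‖² by alternating gradient steps on the reflectivity x and the network weights w. Here J is a random matrix standing in for the expensive linearized modeling operator, and g(w) is a trivial linear map of a fixed latent vector standing in for the CNN; all sizes, step sizes, and the value of λ are illustrative assumptions, not choices from the paper. The point of the sketch is structural: only the x-update touches J, while the w-update merely fits g(w) to the current x and is cheap.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (hypothetical sizes): J plays the role of the expensive
# linearized modeling operator, d the observed data, and g(W) = W @ z a
# trivial "network" (a linear map of a fixed latent z) standing in for
# the CNN of the deep-prior approach.
n, m, k = 50, 80, 10
J = rng.standard_normal((m, n))                  # toy forward operator
x_true = rng.standard_normal(n)                  # toy reflectivity
d = J @ x_true + 0.1 * rng.standard_normal(m)    # noisy observed data
z = rng.standard_normal(k)                       # fixed latent network input

lam = 1.0            # strength of the Gaussian coupling x ~ N(g(w), (1/lam) I)
lr_x, lr_w = 1e-3, 1e-3

x = np.zeros(n)
W = np.zeros((n, k))                             # "network" weights

def objective(x, W):
    data_misfit = 0.5 * np.sum((J @ x - d) ** 2)
    prior_term = 0.5 * lam * np.sum((x - W @ z) ** 2)
    return data_misfit + prior_term

obj_start = objective(x, W)
for _ in range(500):
    # x-update: the only step that applies the modeling operator J
    grad_x = J.T @ (J @ x - d) + lam * (x - W @ z)
    x -= lr_x * grad_x
    # w-update: no J involved -- just fit the network output to x,
    # which is why these updates are relatively cheap
    grad_W = lam * np.outer(W @ z - x, z)
    W -= lr_w * grad_W
obj_end = objective(x, W)
```

In this toy setting the joint objective decreases from `obj_start` to `obj_end`; in the actual imaging problem the per-iteration cost is dominated by the x-update, since evaluating J and its adjoint requires wave-equation solves while the w-update does not.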
