Paper Title


Bayesian Learning for Neural Networks: an algorithmic survey

Authors

Martin Magris, Alexandros Iosifidis

Abstract


The last decade witnessed a growing interest in Bayesian learning. Yet, the technicality of the topic and the multitude of ingredients involved therein, besides the complexity of turning theory into practical implementations, limit the use of the Bayesian learning paradigm, preventing its widespread adoption across different fields and applications. This self-contained survey introduces readers to the principles and algorithms of Bayesian Learning for Neural Networks, approaching the topic from an accessible, practical-algorithmic perspective. After a general introduction to Bayesian Neural Networks, we discuss and present both standard and recent approaches for Bayesian inference, with an emphasis on solutions relying on Variational Inference and the use of natural gradients. We also discuss the use of manifold optimization as a state-of-the-art approach to Bayesian learning. We examine the characteristic properties of all the discussed methods, and provide pseudocode for their implementation, paying attention to practical aspects such as the computation of the gradients.
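To give a concrete flavor of the Variational Inference approach the abstract highlights, the following is a minimal, self-contained sketch (not the paper's own algorithm) of mean-field VI on a toy one-weight model, using the reparameterization trick and plain stochastic gradient ascent on a one-sample ELBO estimate. All names (`mu`, `rho`, `lr`, the data-generating values) are illustrative assumptions.

```python
import numpy as np

# Toy mean-field variational inference: y = w * x + noise,
# prior w ~ N(0, 1), variational posterior q(w) = N(mu, sigma^2).
# We maximize a one-sample reparameterized ELBO estimate by SGD.
# This is an illustrative sketch, not the method from the survey.

rng = np.random.default_rng(0)

# Synthetic data with (assumed) true weight 2.0 and noise std 0.5
n, w_true, noise_std = 200, 2.0, 0.5
x = rng.normal(size=n)
y = w_true * x + noise_std * rng.normal(size=n)
s2 = noise_std ** 2  # likelihood variance, assumed known

mu, rho = 0.0, 0.0   # variational parameters; sigma = exp(rho) > 0
lr, steps = 1e-3, 2000

for _ in range(steps):
    sigma = np.exp(rho)
    eps = rng.normal()
    w = mu + sigma * eps  # reparameterization trick: sample w ~ q(w)

    # Gradient of the log-likelihood sum_i log N(y_i | w x_i, s2) w.r.t. w
    dloglik_dw = np.sum((y - w * x) * x) / s2

    # Closed-form gradients of KL( q(w) || N(0, 1) )
    dkl_dmu = mu
    dkl_drho = sigma ** 2 - 1.0

    # Ascend the one-sample ELBO gradient estimate
    mu += lr * (dloglik_dw - dkl_dmu)
    rho += lr * (dloglik_dw * eps * sigma - dkl_drho)

print(f"posterior mean ~ {mu:.3f}, posterior std ~ {np.exp(rho):.3f}")
```

With enough data the variational mean concentrates near the true weight and the variational standard deviation shrinks toward the (here analytically small) posterior spread; the natural-gradient and manifold-optimization methods surveyed in the paper refine exactly this kind of update.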
