Title


Density-embedding layers: a general framework for adaptive receptive fields

Authors

Francesco Cicala, Luca Bortolussi

Abstract


The effectiveness and performance of artificial neural networks, particularly for visual tasks, depend in crucial ways on the receptive field of neurons. The receptive field itself depends on the interplay between several architectural aspects, including sparsity, pooling, and activation functions. In recent literature there are several ad hoc proposals trying to make receptive fields more flexible and adaptive to data. For instance, different parameterizations of convolutional and pooling layers have been proposed to increase their adaptivity. In this paper, we propose the novel theoretical framework of density-embedding layers, generalizing the transformation represented by a neuron. Specifically, the affine transformation applied to the input is replaced by a scalar product of the input, suitably represented as a piecewise constant function, with a density function associated with the neuron. This density is shown to describe directly the receptive field of the neuron. Crucially, by suitably representing such a density as a linear combination of a parametric family of functions, we can efficiently train the densities by means of any automatic differentiation system, making them adaptable to the problem at hand and computationally efficient to evaluate. This framework captures and generalizes recent methods, allowing a fine-tuning of the receptive field. In the paper, we define some novel layers and we experimentally validate them on the classic MNIST dataset.
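To make the core idea concrete, the following is a minimal sketch (not the authors' implementation) of a single density-embedding neuron in NumPy. The input vector is viewed as a piecewise constant function on [0, 1], and the neuron's output is its scalar product with a density built as a linear combination of a parametric family of basis functions; the Gaussian basis, the function names, and all parameter values here are illustrative assumptions.

```python
import numpy as np

def gaussian_basis(t, centers, width):
    # phi_k(t): Gaussian bumps, an assumed choice for the parametric family
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

def density_neuron(x, w, centers, width):
    """Output of one density-embedding neuron (illustrative sketch).

    The input x (length n) is treated as a piecewise constant function
    on [0, 1] with pieces of width 1/n. The neuron returns the scalar
    product of that function with the density d(t) = sum_k w_k phi_k(t),
    approximated by sampling d at the midpoint of each piece. The
    coefficients w are the trainable parameters.
    """
    n = x.shape[0]
    t = (np.arange(n) + 0.5) / n                 # midpoints of the n pieces
    d = gaussian_basis(t, centers, width) @ w    # density sampled at midpoints
    return (x * d).sum() / n                     # midpoint rule for the integral

# Hypothetical usage: an 8-dimensional input and a density made of 3 bumps.
x = np.array([1.0, 0.0, 2.0, 0.0, 0.0, 1.0, 0.0, 3.0])
w = np.array([0.5, 1.0, -0.3])                   # trainable coefficients
y = density_neuron(x, w, centers=np.array([0.2, 0.5, 0.8]), width=0.15)
```

Since the output is linear in the trainable coefficients `w` and differentiable in the basis parameters, the same computation written in an autodiff framework can be trained end to end, which is the computational point the abstract makes.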
