Paper Title

Convolutional Networks with Dense Connectivity

Paper Authors

Gao Huang, Zhuang Liu, Geoff Pleiss, Laurens van der Maaten, Kilian Q. Weinberger

Abstract

Recent work has shown that convolutional networks can be substantially deeper, more accurate, and efficient to train if they contain shorter connections between layers close to the input and those close to the output. In this paper, we embrace this observation and introduce the Dense Convolutional Network (DenseNet), which connects each layer to every other layer in a feed-forward fashion. Whereas traditional convolutional networks with L layers have L connections - one between each layer and its subsequent layer - our network has L(L+1)/2 direct connections. For each layer, the feature-maps of all preceding layers are used as inputs, and its own feature-maps are used as inputs into all subsequent layers. DenseNets have several compelling advantages: they alleviate the vanishing-gradient problem, encourage feature reuse, and substantially improve parameter efficiency. We evaluate our proposed architecture on four highly competitive object recognition benchmark tasks (CIFAR-10, CIFAR-100, SVHN, and ImageNet). DenseNets obtain significant improvements over the state-of-the-art on most of them, whilst requiring fewer parameters and less computation to achieve high performance.
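To make the connectivity pattern concrete, here is a minimal PyTorch sketch of a single dense block; the BN-ReLU-Conv(3x3) composite function and the growth rate of 12 are illustrative assumptions, not the paper's exact configuration. Each layer receives the channel-wise concatenation [x_0, x_1, ..., x_{l-1}] of all preceding feature-maps, which is also why L layers yield L(L+1)/2 direct connections: layer l has l incoming connections, and 1 + 2 + ... + L = L(L+1)/2.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Sketch of a dense block: every layer sees all preceding feature-maps.

    Hypothetical hyperparameters for illustration only (growth rate,
    kernel size, BN-ReLU-Conv ordering are assumptions).
    """

    def __init__(self, num_layers: int, in_channels: int, growth_rate: int):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            # Layer i consumes the input plus all i earlier outputs,
            # so its input width grows by growth_rate per preceding layer.
            channels = in_channels + i * growth_rate
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate,
                          kernel_size=3, padding=1, bias=False),
            ))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            # x_l = H_l([x_0, x_1, ..., x_{l-1}]): concatenate all
            # preceding feature-maps along the channel dimension.
            out = layer(torch.cat(features, dim=1))
            features.append(out)
        return torch.cat(features, dim=1)

block = DenseBlock(num_layers=4, in_channels=16, growth_rate=12)
y = block(torch.randn(1, 16, 32, 32))
print(y.shape)  # torch.Size([1, 64, 32, 32]): 16 + 4 * 12 channels
```

Because each layer adds only growth_rate new channels while reusing every earlier feature-map, the block stays parameter-efficient: in this example the representation grows from 16 to 16 + 4 x 12 = 64 channels.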
