Paper Title

Hypernetwork-Based Augmentation

Authors

Chih-Yang Chen, Che-Han Chang

Abstract

Data augmentation is an effective technique to improve the generalization of deep neural networks. Recently, AutoAugment proposed a well-designed search space and a search algorithm that automatically finds augmentation policies in a data-driven manner. However, AutoAugment is computationally intensive. In this paper, we propose an efficient gradient-based search algorithm, called Hypernetwork-Based Augmentation (HBA), which simultaneously learns model parameters and augmentation hyperparameters in a single training run. Our HBA uses a hypernetwork to approximate a population-based training algorithm, which enables us to tune augmentation hyperparameters by gradient descent. In addition, we introduce a weight sharing strategy that simplifies our hypernetwork architecture and speeds up our search algorithm. We conduct experiments on CIFAR-10, CIFAR-100, SVHN, and ImageNet. Our results show that HBA is competitive with state-of-the-art methods in terms of both search speed and accuracy.
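
To illustrate the gradient mechanism the abstract describes, here is a minimal PyTorch sketch, not the authors' implementation: a hypernetwork maps an augmentation hyperparameter to a weight offset for a toy linear model, so the hyperparameter receives gradients through the hypernetwork even though the augmentation op itself is treated as non-differentiable. All names here (HyperLinear, aug_magnitude) and the tiny architecture are illustrative assumptions.

```python
# Sketch of hypernetwork-based hyperparameter tuning (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class HyperLinear(nn.Module):
    """Linear layer whose weight is modulated by a hypernetwork output."""
    def __init__(self, in_dim, out_dim, hparam_dim):
        super().__init__()
        self.base = nn.Linear(in_dim, out_dim)  # shared base weights
        # Hypernetwork: augmentation hyperparameters -> weight offset.
        self.hyper = nn.Linear(hparam_dim, out_dim * in_dim, bias=False)
        self.in_dim, self.out_dim = in_dim, out_dim

    def forward(self, x, hparams):
        delta = self.hyper(hparams).view(self.out_dim, self.in_dim)
        return F.linear(x, self.base.weight + delta, self.base.bias)

# Augmentation hyperparameter to be tuned by gradient descent
# (e.g., a noise magnitude).
aug_magnitude = nn.Parameter(torch.tensor([0.1]))
model = HyperLinear(in_dim=32, out_dim=10, hparam_dim=1)
opt = torch.optim.SGD(list(model.parameters()) + [aug_magnitude], lr=1e-2)

x = torch.randn(8, 32)            # toy batch of feature vectors
y = torch.randint(0, 10, (8,))    # toy labels

# Apply the augmentation with a detached magnitude: the op itself need not
# be differentiable, because the gradient w.r.t. aug_magnitude flows through
# the hypernetwork path instead.
x_aug = x + aug_magnitude.detach() * torch.randn_like(x)
loss = F.cross_entropy(model(x_aug, aug_magnitude), y)

opt.zero_grad()
loss.backward()   # one backward pass updates weights AND the hyperparameter
opt.step()
```

In the paper's actual setting, the hypernetwork approximates population-based training over augmentation policies in an AutoAugment-style search space, and weight sharing keeps the hypernetwork small; the sketch above only shows how a hypernetwork lets both model parameters and augmentation hyperparameters be updated in a single training run.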
