Paper Title

AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks

Paper Authors

Yonggan Fu, Wuyang Chen, Haotao Wang, Haoran Li, Yingyan Celine Lin, Zhangyang Wang

Paper Abstract

The compression of Generative Adversarial Networks (GANs) has recently drawn attention, due to the increasing demand for deploying GANs on mobile devices for numerous applications such as image translation, enhancement, and editing. However, compared with the substantial efforts devoted to compressing other deep models, research on compressing GANs (usually the generators) remains in its infancy. Existing GAN compression algorithms are limited to handling specific GAN architectures and losses. Inspired by the recent success of AutoML in deep compression, we introduce AutoML to GAN compression and develop the AutoGAN-Distiller (AGD) framework. Starting from a specifically designed efficient search space, AGD performs end-to-end discovery of new efficient generators given the target computational resource constraints. The search is guided by the original GAN model via knowledge distillation, thereby fulfilling the compression. AGD is fully automatic, standalone (i.e., it needs no trained discriminators), and generically applicable to various GAN models. We evaluate AGD on two representative GAN tasks: image translation and super-resolution. Without bells and whistles, AGD yields remarkably lightweight yet highly competitive compressed models that largely outperform existing alternatives. Our code and pretrained models are available at https://github.com/TAMU-VITA/AGD.
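
The abstract's core recipe, training a compact "student" generator to mimic a frozen pretrained generator rather than relying on a trained discriminator, can be illustrated with a short PyTorch sketch. Everything below (TinyGenerator, the L1-only distillation loss, the two fixed widths) is a simplified assumption for illustration only, not the actual AGD search space, loss, or API; the real framework searches candidate architectures under a FLOPs budget via a differentiable search procedure.

    # Hypothetical sketch of distillation-guided generator compression.
    # Names and losses here are illustrative assumptions, not the AGD code.
    import torch
    import torch.nn as nn

    class TinyGenerator(nn.Module):
        """Stand-in generator whose channel width is a searchable choice."""
        def __init__(self, width: int):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(3, width, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(width, 3, 3, padding=1),
            )
        def forward(self, x):
            return self.body(x)

    # The pretrained "teacher" GAN generator is frozen; no discriminator needed.
    teacher = TinyGenerator(width=64).eval()
    for p in teacher.parameters():
        p.requires_grad_(False)

    # One compact candidate; in AGD the width/operator choices would be
    # searched subject to a computational (e.g., FLOPs) constraint.
    student = TinyGenerator(width=16)
    opt = torch.optim.Adam(student.parameters(), lr=1e-4)

    for step in range(10):  # toy training loop on random inputs
        x = torch.rand(2, 3, 32, 32)
        with torch.no_grad():
            t = teacher(x)
        # Pixel-wise L1 distillation; the real loss may add perceptual terms.
        loss = nn.functional.l1_loss(student(x), t)
        opt.zero_grad(); loss.backward(); opt.step()

Because the supervision signal comes entirely from the frozen teacher's outputs, this setup is "standalone" in the abstract's sense: compression proceeds without access to the original training losses or discriminators.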
