Paper Title

Accelerating Diffusion Sampling with Classifier-based Feature Distillation

Paper Authors

Wujie Sun, Defang Chen, Can Wang, Deshi Ye, Yan Feng, Chun Chen

Abstract

Although diffusion model has shown great potential for generating higher quality images than GANs, slow sampling speed hinders its wide application in practice. Progressive distillation is thus proposed for fast sampling by progressively aligning output images of $N$-step teacher sampler with $N/2$-step student sampler. In this paper, we argue that this distillation-based accelerating method can be further improved, especially for few-step samplers, with our proposed \textbf{C}lassifier-based \textbf{F}eature \textbf{D}istillation (CFD). Instead of aligning output images, we distill teacher's sharpened feature distribution into the student with a dataset-independent classifier, making the student focus on those important features to improve performance. We also introduce a dataset-oriented loss to further optimize the model. Experiments on CIFAR-10 show the superiority of our method in achieving high quality and fast sampling. Code is provided at \url{https://github.com/zju-SWJ/RCFD}.
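The core idea of distilling the teacher's sharpened feature distribution through a classifier can be illustrated with a standard knowledge-distillation loss: the classifier maps intermediate features of teacher and student to logits, and the student is trained to match the teacher's temperature-softened class distribution. This is a minimal dependency-free sketch; the classifier itself, the function names, and the temperature value are illustrative assumptions following common distillation practice, not the paper's exact formulation.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: T > 1 flattens the distribution,
    # T < 1 sharpens it (as in the "sharpened feature distribution").
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def feature_distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) over classifier outputs of intermediate features.

    Hypothetical sketch: `teacher_logits`/`student_logits` stand for the
    outputs of a dataset-independent classifier applied to each model's
    features; the real method's loss and temperature may differ.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

When teacher and student produce identical logits the loss is zero; any mismatch in the softened distributions yields a positive penalty, pushing the student toward the features the classifier considers important.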
