Paper Title

Lifelong Generative Learning via Knowledge Reconstruction

Authors

Huang, Libo, An, Zhulin, Zhi, Xiang, Xu, Yongjun

Abstract

Generative models often incur catastrophic forgetting when they are used to sequentially learn multiple tasks, i.e., in lifelong generative learning. Although there have been some endeavors to tackle this problem, they suffer from high time consumption or error accumulation. In this work, we develop an efficient and effective lifelong generative model based on the variational autoencoder (VAE). Unlike generative adversarial networks, the VAE enjoys high efficiency in the training process, providing natural benefits when resources are scarce. We derive a lifelong generative model by extending the intrinsic reconstruction capability of the VAE to historical knowledge retention. Further, we devise a feedback strategy on the reconstructed data to alleviate error accumulation. Experiments on lifelong generation tasks over MNIST, FashionMNIST, and SVHN verify the efficacy of our approach, with results comparable to the SOTA.
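The core idea described above can be illustrated with a minimal sketch: a VAE reconstructs samples of previous tasks, and those reconstructions are mixed into the training data for the current task so that old knowledge is retained. The toy encoder/decoder below uses fixed random linear weights purely to show the data flow; it is a hypothetical illustration, not the paper's actual model or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy VAE: one linear encoder and decoder with fixed
# random weights (no training), just to demonstrate the pipeline.
D, Z = 8, 2  # data dimension, latent dimension
W_enc = rng.normal(scale=0.1, size=(D, 2 * Z))  # outputs [mu, log_var]
W_dec = rng.normal(scale=0.1, size=(Z, D))

def encode(x):
    h = x @ W_enc
    return h[:, :Z], h[:, Z:]          # mu, log_var

def reparameterize(mu, log_var):
    # Standard reparameterization trick: z = mu + sigma * eps
    eps = rng.normal(size=mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def decode(z):
    return z @ W_dec

def reconstruct(x):
    mu, log_var = encode(x)
    return decode(reparameterize(mu, log_var))

# Reconstruction-based knowledge retention for task t:
# reconstruct "pseudo-old" samples and append them to the new data.
old_data = rng.normal(size=(16, D))    # stands in for task t-1 data
new_data = rng.normal(size=(32, D))    # task t data
replay = reconstruct(old_data)         # reconstructed old-task samples
train_batch = np.concatenate([replay, new_data], axis=0)
print(train_batch.shape)               # (48, 8)
```

In the actual method, the VAE would be trained on this mixed batch at each task, and the paper's feedback strategy would additionally filter or correct the reconstructed samples to limit error accumulation.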
