Paper Title

DC-cycleGAN: Bidirectional CT-to-MR Synthesis from Unpaired Data

Paper Authors

Jiayuan Wang, Q. M. Jonathan Wu, Farhad Pourpanah

Paper Abstract

Magnetic resonance (MR) and computed tomography (CT) images are two typical types of medical images that provide mutually complementary information for accurate clinical diagnosis and treatment. However, obtaining both images may be limited by considerations such as cost, radiation dose, and missing modalities. Recently, medical image synthesis has attracted growing research interest as a way to cope with this limitation. In this paper, we propose a bidirectional learning model, denoted as dual contrast cycleGAN (DC-cycleGAN), to synthesize medical images from unpaired data. Specifically, a dual contrast loss is introduced into the discriminators to indirectly build constraints between real source and synthetic images by using samples from the source domain as negative samples, forcing the synthetic images to lie far from the source domain. In addition, cross-entropy and the structural similarity index (SSIM) are integrated into DC-cycleGAN so that both the luminance and structure of samples are considered when synthesizing images. The experimental results indicate that DC-cycleGAN produces promising results compared with other cycleGAN-based medical image synthesis methods such as cycleGAN, RegGAN, DualGAN, and NiceGAN. The code will be available at https://github.com/JiayuanWang-JW/DC-cycleGAN.
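To make the two loss ideas described in the abstract concrete, below is a minimal PyTorch-style sketch, not taken from the authors' repository. The function names `dual_contrast_loss`, `ssim_loss`, and `luminance_structure_loss`, the simplified single-window SSIM, and the weighting `alpha` are all assumptions for illustration; the exact formulation and hyperparameters in DC-cycleGAN may differ.

```python
# Hypothetical sketch of the loss ideas in the abstract (not the authors' code).
import torch
import torch.nn.functional as F


def dual_contrast_loss(d_real_target, d_fake, d_real_source):
    """Discriminator loss sketch: real target-domain images are positives,
    while both synthetic images and real *source*-domain images are treated
    as negatives, which indirectly pushes synthetic images away from the
    source domain. Inputs are raw discriminator logits."""
    positive = F.binary_cross_entropy_with_logits(
        d_real_target, torch.ones_like(d_real_target))
    negative_fake = F.binary_cross_entropy_with_logits(
        d_fake, torch.zeros_like(d_fake))
    negative_source = F.binary_cross_entropy_with_logits(
        d_real_source, torch.zeros_like(d_real_source))
    return positive + negative_fake + negative_source


def ssim_loss(x, y, c1=0.01 ** 2, c2=0.03 ** 2):
    """Simplified global (single-window) SSIM between image batches assumed
    to lie in [0, 1], returned as 1 - SSIM so that lower is better."""
    mu_x, mu_y = x.mean(dim=(1, 2, 3)), y.mean(dim=(1, 2, 3))
    var_x, var_y = x.var(dim=(1, 2, 3)), y.var(dim=(1, 2, 3))
    cov = ((x - mu_x[:, None, None, None])
           * (y - mu_y[:, None, None, None])).mean(dim=(1, 2, 3))
    ssim = ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / \
           ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
    return (1 - ssim).mean()


def luminance_structure_loss(fake, real, alpha=0.5):
    """Reconstruction-style term sketch: pixel-wise binary cross-entropy
    (luminance) combined with the SSIM term (structure). Both inputs are
    assumed to be intensities in [0, 1]; `alpha` is an assumed weight."""
    bce = F.binary_cross_entropy(fake.clamp(1e-6, 1 - 1e-6), real)
    return alpha * bce + (1 - alpha) * ssim_loss(fake, real)
```

The key design point is that, unlike a standard adversarial loss, the dual contrast term also feeds real source-domain images to the discriminator as negatives, giving it an explicit signal to penalize synthetic outputs that still resemble the source modality.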
