Title

Semi-Supervised Learning of Mutually Accelerated MRI Synthesis without Fully-Sampled Ground Truths

Authors

Mahmut Yurt, Salman Ul Hassan Dar, Muzaffer Özbey, Berk Tınaz, Kader Karlı Oğuz, Tolga Çukur

Abstract

Learning-based synthetic multi-contrast MRI commonly involves deep models trained using high-quality images of source and target contrasts, regardless of whether source and target domain samples are paired or unpaired. This results in undesirable reliance on fully-sampled acquisitions of all MRI contrasts, which might prove impractical due to limitations on scan costs and time. Here, we propose a novel semi-supervised deep generative model that instead learns to recover high-quality target images directly from accelerated acquisitions of source and target contrasts. To achieve this, the proposed model introduces novel multi-coil tensor losses in image, k-space and adversarial domains. These selective losses are based only on acquired k-space samples, and randomized sampling masks are used across subjects to capture relationships among acquired and non-acquired k-space regions. Comprehensive experiments on multi-contrast neuroimaging datasets demonstrate that our semi-supervised approach yields equivalent performance to gold-standard fully-supervised models, while outperforming a cascaded approach that learns to synthesize based on reconstructions of undersampled data. Therefore, the proposed approach holds great promise to improve the feasibility and utility of accelerated MRI acquisitions mutually undersampled across both contrast sets and k-space.
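The selective losses described above penalize the model only at acquired k-space locations, so no fully-sampled ground truth is needed. A minimal single-coil NumPy sketch of this idea is shown below; the function name, the L1 loss form, and the 2D single-coil setup are illustrative assumptions, not the paper's exact multi-coil tensor losses.

```python
import numpy as np

def selective_kspace_loss(pred_img, acquired_kspace, mask):
    """Illustrative masked k-space L1 loss (assumption, not the paper's
    exact formulation): compare the FFT of a predicted image against the
    acquired k-space samples only where the undersampling mask is 1, so
    non-acquired k-space entries contribute nothing to the loss."""
    pred_kspace = np.fft.fft2(pred_img)            # predicted image -> k-space
    diff = (pred_kspace - acquired_kspace) * mask  # zero out non-acquired entries
    return np.abs(diff).sum() / max(mask.sum(), 1)

# Toy example: random 8x8 "image" and a mask retaining about half the samples
rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8))
full_k = np.fft.fft2(img)
mask = (rng.random((8, 8)) < 0.5).astype(float)
acq_k = full_k * mask  # simulated accelerated (undersampled) acquisition

# The loss is zero when the prediction agrees with every acquired sample,
# regardless of what the model produces at non-acquired locations.
print(selective_kspace_loss(img, acq_k, mask))  # → 0.0
```

Randomizing `mask` across training subjects, as the abstract describes, exposes the model to many different acquired/non-acquired patterns, which is what lets it learn relationships between the two regions.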
