Paper Title

Attention-Guided Version of 2D UNet for Automatic Brain Tumor Segmentation

Authors

Mehrdad Noori, Ali Bahri, Karim Mohammadi

Abstract

Gliomas are the most common and aggressive brain tumors, and in their highest grade they lead to a very short life expectancy. Treatment assessment is therefore a key stage in enhancing patients' quality of life. Recently, deep convolutional neural networks (DCNNs) have achieved remarkable performance in brain tumor segmentation, but the task remains difficult owing to the highly varying intensity and appearance of gliomas. Most existing methods, especially UNet-based networks, integrate low-level and high-level features in a naive way, which may confuse the model. Moreover, most approaches employ 3D architectures to benefit from the 3D contextual information of input images; these architectures contain more parameters and incur higher computational complexity than 2D architectures. On the other hand, using 2D models means forgoing that 3D contextual information. To address these issues, we design a low-parameter network based on 2D UNet in which we employ two techniques. The first is an attention mechanism, applied after the concatenation of low-level and high-level features; it prevents confusion in the model by adaptively weighting each channel. The second is Multi-View Fusion, which lets us benefit from the 3D contextual information of input images despite using a 2D model. Experimental results demonstrate that our method performs favorably against the 2017 and 2018 state-of-the-art methods.
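The abstract gives no implementation details, so the following is a minimal NumPy sketch of the two ideas as commonly realized: a squeeze-and-excitation-style channel gate applied to concatenated feature maps, and multi-view fusion by averaging the probability volumes predicted slice-by-slice along each of the three axes. All function names, shapes, and the reduction ratio are illustrative assumptions, not the authors' code.

```python
import numpy as np

def channel_attention(features, w1, w2):
    """SE-style channel gate for (C, H, W) feature maps (e.g. the
    concatenation of skip and decoder channels in a UNet).
    w1: (C // r, C) squeeze weights, w2: (C, C // r) excite weights,
    with r a reduction ratio -- both assumed, learned in practice."""
    squeeze = features.mean(axis=(1, 2))          # global average pool -> (C,)
    hidden = np.maximum(0.0, w1 @ squeeze)        # ReLU bottleneck
    gate = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))   # per-channel sigmoid weight
    return features * gate[:, None, None]         # adaptively rescale channels

def multi_view_fusion(volume, predict_2d):
    """Run a 2D per-slice predictor along each axis of a (D, H, W)
    volume (axial / coronal / sagittal views) and average the three
    resulting probability volumes."""
    fused = []
    for axis in range(3):
        slices = np.moveaxis(volume, axis, 0)              # slice along this axis
        pred = np.stack([predict_2d(s) for s in slices])   # per-slice probabilities
        fused.append(np.moveaxis(pred, 0, axis))           # restore original layout
    return np.mean(fused, axis=0)
```

A 2D model sees no inter-slice context, but because the three passes slice the volume along different axes, each voxel's fused probability draws on all three orthogonal planes through it, which is how the paper recovers 3D context at far lower parameter cost than a 3D network.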
