Paper Title
Transformer Lesion Tracker

Paper Authors

Wen Tang, Han Kang, Haoyue Zhang, Pengxin Yu, Corey W. Arnold, Rongguo Zhang

Paper Abstract
Evaluating lesion progression and treatment response via longitudinal lesion tracking plays a critical role in clinical practice. Automated approaches for this task are motivated by prohibitive labor costs and time consumption when lesion matching is done manually. Previous methods typically lack the integration of local and global information. In this work, we propose a transformer-based approach, termed Transformer Lesion Tracker (TLT). Specifically, we design a Cross Attention-based Transformer (CAT) to capture and combine both global and local information to enhance feature extraction. We also develop a Registration-based Anatomical Attention Module (RAAM) to introduce anatomical information to CAT so that it can focus on useful feature knowledge. A Sparse Selection Strategy (SSS) is presented for selecting features and reducing memory footprint in Transformer training. In addition, we use a global regression to further improve model performance. We conduct experiments on a public dataset to show the superiority of our method and find that our model performance has improved the average Euclidean center error by at least 14.3% (6mm vs. 7mm) compared with the state-of-the-art (SOTA). Code is available at https://github.com/TangWen920812/TLT.
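The core of the proposed CAT module is cross-attention: query features from one source attend over key/value features from another, letting local lesion features aggregate global context. As a rough illustration only (not the authors' implementation, which is available at the repository above), a minimal single-head, projection-free cross-attention step might look like the following NumPy sketch; the shapes and the use of the inputs directly as queries, keys, and values are simplifying assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values):
    # queries: (n_q, d) local tokens; keys_values: (n_kv, d) global tokens.
    # Sketch only: real transformer blocks apply learned Q/K/V projections.
    d = queries.shape[-1]
    scores = queries @ keys_values.T / np.sqrt(d)   # (n_q, n_kv)
    weights = softmax(scores, axis=-1)              # rows sum to 1
    return weights @ keys_values                    # (n_q, d)

# toy example: 2 local query tokens attend over 5 global tokens
rng = np.random.default_rng(0)
q = rng.standard_normal((2, 8))
kv = rng.standard_normal((5, 8))
out = cross_attention(q, kv)
print(out.shape)  # (2, 8)
```

Each output row is a convex combination of the global tokens, which is how cross-attention injects global information into local features; the paper's RAAM additionally biases this attention with registration-derived anatomical cues.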