Paper Title


SURF-SVM Based Identification and Classification of Gastrointestinal Diseases in Wireless Capsule Endoscopy

Authors

Vanshika Vats, Pooja Goel, Amodini Agarwal, Nidhi Goel

Abstract


Endoscopy provides a major contribution to the diagnosis of Gastrointestinal Tract (GIT) diseases. With colon endoscopy having certain limitations, Wireless Capsule Endoscopy (WCE) is gradually taking over in terms of ease and efficiency. WCE is performed with a miniature optical endoscope that is swallowed by the patient and wirelessly transmits colour images during its journey through the GIT inside the patient's body. These images are used to implement an effective and computationally efficient approach that aims to automatically detect normal and abnormal tissues in the GIT, thus helping to reduce the manual workload of the reviewers. The algorithm further aims to classify the diseased tissues into the various GIT diseases commonly known to affect the tract. In this manuscript, the descriptor used for the detection of interest points is Speeded Up Robust Features (SURF), which uses the colour information contained in the images after conversion to the CIELAB colour space for better identification. The features extracted at the interest points are then used to train and test a Support Vector Machine (SVM), so that it automatically classifies the images as normal or abnormal and further detects the specific abnormality. The SVM, with a few tuned parameters, achieves a high accuracy of 94.58% when classifying normal versus abnormal images and an accuracy of 82.91% on the multi-class task. The present work is an improvement on previously reported analyses, which were limited to bi-class classification using this approach.
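The classification stage described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: random 64-dimensional vectors stand in for real SURF descriptors (64 is the standard SURF descriptor length), the labels and class separation are synthetic, and the RBF kernel and SVM parameters are assumptions, since the abstract does not state the authors' configuration.

```python
# Sketch of the SURF-feature + SVM classification stage from the abstract.
# In the real pipeline, each WCE image is converted to the CIELAB colour
# space, SURF interest points are detected, and the descriptors extracted at
# those points are pooled into one feature vector per image. Here, synthetic
# 64-dimensional vectors stand in for those pooled SURF descriptors.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic per-image feature vectors (placeholder for pooled SURF features).
n_images, n_features = 400, 64
X = rng.normal(size=(n_images, n_features))

# Bi-class labels: 0 = normal tissue, 1 = abnormal tissue.
# Shift the abnormal samples so the two synthetic classes are separable.
y = np.arange(n_images) % 2
X[y == 1] += 1.5

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# RBF-kernel SVM (kernel and hyperparameters are illustrative assumptions).
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)

acc = accuracy_score(y_test, clf.predict(X_test))
print(f"bi-class accuracy on synthetic data: {acc:.2f}")
```

The multi-class stage (assigning abnormal images to specific GIT diseases) follows the same pattern with integer disease labels in `y`; `SVC` handles multi-class classification internally via a one-vs-one scheme.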
