Paper Title
CoV-TI-Net: Transferred Initialization with Modified End Layer for COVID-19 Diagnosis
Paper Authors
Paper Abstract
This paper proposes transferred initialization with modified fully connected layers for COVID-19 diagnosis. Convolutional neural networks (CNNs) have achieved remarkable results in image classification. However, training a high-performing model is a very complicated and time-consuming process because of the complexity of image recognition applications. On the other hand, transfer learning is a relatively new learning method that has been employed in many sectors to achieve good performance with fewer computations. In this research, the PyTorch pre-trained models (VGG19_bn and WideResNet-101) are applied to the MNIST dataset for the first time as initialization, with modified fully connected layers. The employed PyTorch pre-trained models were previously trained on ImageNet. The proposed model was developed and verified in a Kaggle notebook, and it reached an outstanding accuracy of 99.77% without requiring a huge computational time during network training. We also applied the same methodology to the SIIM-FISABIO-RSNA COVID-19 Detection dataset and achieved 80.01% accuracy. In contrast, previous methods require a huge computational time during training to reach a high-performing model. Code is available at the following link: github.com/dipuk0506/SpinalNet