Paper Title
EEG-TCNet: An Accurate Temporal Convolutional Network for Embedded Motor-Imagery Brain-Machine Interfaces
Paper Authors
Paper Abstract
In recent years, deep learning (DL) has contributed significantly to the improvement of motor-imagery brain-machine interfaces (MI-BMIs) based on electroencephalography (EEG). While achieving high classification accuracy, DL models have also grown in size, requiring a vast amount of memory and computational resources. This poses a major challenge to an embedded BMI solution that guarantees user privacy, reduced latency, and low power consumption by processing the data locally. In this paper, we propose EEG-TCNet, a novel temporal convolutional network (TCN) that achieves outstanding accuracy while requiring few trainable parameters. Its low memory footprint and low computational complexity for inference make it suitable for embedded classification on resource-limited devices at the edge. Experimental results on the BCI Competition IV-2a dataset show that EEG-TCNet achieves 77.35% classification accuracy in 4-class MI. By finding the optimal network hyperparameters per subject, we further improve the accuracy to 83.84%. Finally, we demonstrate the versatility of EEG-TCNet on the Mother of All BCI Benchmarks (MOABB), a large-scale test benchmark containing 12 different EEG datasets with MI experiments. The results indicate that EEG-TCNet successfully generalizes beyond one single dataset, outperforming the current state-of-the-art (SoA) on MOABB by a meta-effect of 0.25.
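The abstract does not spell out EEG-TCNet's internals, but the defining operation of any TCN is the dilated causal 1-D convolution, whose receptive field grows exponentially with depth while parameter count stays small. The sketch below is illustrative only, assuming a single channel and a hand-rolled NumPy implementation (the function name `causal_dilated_conv1d` is hypothetical, not from the paper); it shows why the output at time t never depends on future samples.

```python
import numpy as np

def causal_dilated_conv1d(x, w, dilation=1):
    """Causal dilated 1-D convolution over a single channel.

    x: input signal, shape (T,)
    w: kernel weights, shape (K,)
    y[t] depends only on x[t], x[t-d], ..., x[t-(K-1)*d],
    so no future samples leak into the present (causality).
    Stacking L such layers with dilations 1, 2, 4, ... gives a
    receptive field of 1 + (K-1)*(2**L - 1) per stack, which is
    how a TCN covers long EEG windows with few parameters.
    """
    T, K = len(x), len(w)
    y = np.zeros(T)
    for t in range(T):
        for k in range(K):
            idx = t - k * dilation  # look back k*dilation steps
            if idx >= 0:            # samples before t=0 are treated as zero
                y[t] += w[k] * x[idx]
    return y
```

For example, a kernel `w = [0, 1]` with dilation 2 simply delays the signal by two samples, making the causal, strictly backward-looking behavior easy to verify by hand.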