Paper Title
Towards Length-Versatile and Noise-Robust Radio Frequency Fingerprint Identification
Paper Authors
Paper Abstract
Radio frequency fingerprint identification (RFFI) can classify wireless devices by analyzing the signal distortions caused by intrinsic hardware impairments. State-of-the-art neural networks have been adopted for RFFI. However, many neural networks, e.g., the multilayer perceptron (MLP) and convolutional neural network (CNN), require fixed-size input data. In addition, many IoT devices operate in low signal-to-noise ratio (SNR) scenarios, but RFFI performance in such scenarios is rarely investigated. In this paper, we analyze why MLP- and CNN-based RFFI systems are constrained by the input size. To overcome this, we propose four neural networks that can process signals of variable length, namely a flatten-free CNN, a long short-term memory (LSTM) network, a gated recurrent unit (GRU) network, and a transformer. We adopt data augmentation during training, which significantly improves the model's robustness to noise. We compare two augmentation schemes, namely offline and online augmentation, and the results show that the online one performs better. During inference, a multi-packet inference approach is further leveraged to improve the classification accuracy in low-SNR scenarios. We take LoRa as a case study and evaluate the system by classifying 10 commercial off-the-shelf LoRa devices under various SNR conditions. Online augmentation boosts the low-SNR classification accuracy by up to 50%, and the multi-packet inference approach further increases the accuracy by over 20%.
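The two techniques named in the abstract can be illustrated in a minimal sketch. This is not the authors' implementation: the function names (`awgn`, `online_augment`, `multi_packet_inference`) and the SNR range are illustrative assumptions. Online augmentation here means drawing a fresh random SNR and adding white Gaussian noise to each training signal every time it is sampled, and multi-packet inference means averaging per-packet softmax outputs before taking the argmax.

```python
import numpy as np

rng = np.random.default_rng(0)

def awgn(iq, snr_db, rng):
    """Add complex white Gaussian noise to an IQ signal at the given SNR (dB)."""
    sig_power = np.mean(np.abs(iq) ** 2)
    noise_power = sig_power / (10 ** (snr_db / 10))
    noise = np.sqrt(noise_power / 2) * (
        rng.standard_normal(iq.shape) + 1j * rng.standard_normal(iq.shape)
    )
    return iq + noise

def online_augment(batch, snr_range=(0.0, 40.0), rng=rng):
    """Online augmentation: a fresh random SNR per signal, per epoch.

    Works on variable-length signals, since noise is shaped per sample.
    The (0, 40) dB range is an assumption, not the paper's setting.
    """
    return [awgn(x, rng.uniform(*snr_range), rng) for x in batch]

def multi_packet_inference(prob_list):
    """Merge per-packet softmax vectors by averaging, then pick the class."""
    return int(np.argmax(np.mean(np.stack(prob_list), axis=0)))
```

For example, three packets whose individual softmax outputs disagree can still yield a confident merged decision, because averaging suppresses per-packet noise in the predictions.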