Paper Title
RTFN: A Robust Temporal Feature Network for Time Series Classification
Paper Authors
Abstract
Time series data usually contain both local and global patterns. Most existing feature networks pay more attention to local features than to the relationships among them. The latter, however, are also important yet more difficult to explore, and obtaining sufficient representations through a feature network remains challenging. To this end, we propose a novel robust temporal feature network (RTFN) for feature extraction in time series classification, consisting of a temporal feature network (TFN) and an LSTM-based attention network (LSTMaN). TFN is a residual structure with multiple convolutional layers. It functions as a local-feature extraction network that mines sufficient local features from the data. LSTMaN is composed of two identical layers in which attention and long short-term memory (LSTM) networks are hybridized. This network acts as a relation extraction network, discovering the intrinsic relationships among the features extracted at different positions in the sequential data. In our experiments, we embed RTFN into a supervised structure as a feature extractor and into an unsupervised structure as an encoder. The results show that the RTFN-based structures achieve excellent supervised and unsupervised performance on a large number of UCR2018 and UEA2018 datasets.
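To make the two-stage design concrete, the following is a minimal, stdlib-only sketch of the idea the abstract describes: a residual convolutional pass extracts local features (the TFN role), and a scaled dot-product self-attention pass mixes features across positions to capture their relationships (the relation-extraction role of LSTMaN; the LSTM recurrence is omitted here for brevity). All function names, kernel values, and dimensions are illustrative assumptions, not the authors' actual architecture.

```python
import math
import random

def residual_conv(x, kernel):
    """Same-padded 1D convolution with a skip connection:
    a toy stand-in for TFN's residual local-feature extraction."""
    pad = len(kernel) // 2
    xp = [0.0] * pad + x + [0.0] * pad
    out = []
    for i in range(len(x)):
        c = sum(xp[i + j] * kernel[j] for j in range(len(kernel)))
        out.append(x[i] + max(c, 0.0))  # ReLU on the conv path, then skip add
    return out

def self_attention(h):
    """Scaled dot-product self-attention over positions: each output mixes
    features from all positions, modeling relations among local features."""
    t, d = len(h), len(h[0])
    scores = [[sum(h[i][k] * h[j][k] for k in range(d)) / math.sqrt(d)
               for j in range(t)] for i in range(t)]
    out = []
    for row in scores:
        m = max(row)                          # stabilize the softmax
        e = [math.exp(s - m) for s in row]
        z = sum(e)
        w = [v / z for v in e]                # attention weights, sum to 1
        out.append([sum(w[j] * h[j][k] for j in range(t)) for k in range(d)])
    return out

random.seed(0)
x = [random.gauss(0, 1) for _ in range(32)]          # toy univariate series
local = residual_conv(x, kernel=[0.25, 0.5, 0.25])   # local features
h = [[local[i], local[i - 1]] for i in range(32)]    # toy 2-dim embedding
relational = self_attention(h)                       # relation-aware features
print(len(local), len(relational), len(relational[0]))  # 32 32 2
```

In the paper's actual design the relation extractor hybridizes attention with LSTM hidden states rather than attending over raw conv features, and both stages are stacked and trained end to end; this sketch only shows why the two stages capture complementary (local vs. relational) information.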