Paper Title

Correlation recurrent units: A novel neural architecture for improving the predictive performance of time-series data

Authors

Sunghyun Sim, Dohee Kim, Hyerim Bae

Abstract

The time-series forecasting (TSF) problem is a traditional problem in the field of artificial intelligence. Models such as the Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), and Gated Recurrent Unit (GRU) have contributed to improving the predictive accuracy of TSF. Furthermore, model structures have been proposed that combine time-series decomposition methods, such as seasonal-trend decomposition using Loess (STL), to further improve predictive accuracy. However, because this approach trains an independent model for each component, it cannot learn the relationships between time-series components. In this study, we propose a new neural architecture called the correlation recurrent unit (CRU), which can perform time-series decomposition within a neural cell and learn the correlations (autocorrelation and cross-correlation) between the decomposition components. The proposed neural architecture was evaluated through comparative experiments against previous studies using five univariate time-series datasets and four multivariate time-series datasets. The results showed that both long- and short-term predictive performance improved by more than 10%. The experimental results show that the proposed CRU is an excellent method for TSF problems compared with other neural architectures.
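The core idea in the abstract — decompose a series into trend, seasonal, and residual components, then learn the correlations between them rather than modeling each in isolation — can be illustrated with a minimal pure-Python sketch. This is not the authors' CRU implementation: it substitutes a naive centered-moving-average decomposition for STL and a plain Pearson correlation for the learned correlation structure, purely to make the decomposition-plus-correlation idea concrete.

```python
import math

def decompose(series, period):
    """Naive additive decomposition (a simplified stand-in for STL):
    trend = centered moving average, seasonal = per-phase mean of the
    detrended series, residual = whatever is left over."""
    n = len(series)
    half = period // 2
    # Centered moving average for the trend; the window is clipped at the edges.
    trend = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        trend.append(sum(series[lo:hi]) / (hi - lo))
    detrended = [x - t for x, t in zip(series, trend)]
    # Seasonal component: average the detrended values at each phase.
    phase_mean = [sum(detrended[p::period]) / len(detrended[p::period])
                  for p in range(period)]
    seasonal = [phase_mean[i % period] for i in range(n)]
    residual = [d - s for d, s in zip(detrended, seasonal)]
    return trend, seasonal, residual

def pearson(a, b):
    """Pearson correlation between two components -- a fixed statistic here,
    whereas the CRU learns such relationships jointly inside the cell."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

# Synthetic series: linear upward trend plus a period-4 seasonal pattern.
series = [0.5 * t + [2, 0, -2, 0][t % 4] for t in range(40)]
trend, seasonal, residual = decompose(series, period=4)
```

The point of the sketch is the contrast the abstract draws: a pipeline like this computes the components first and would then feed each to a separate forecaster, so any dependence between, say, trend and seasonal strength is lost; the CRU instead performs the decomposition inside the recurrent cell so those correlations become learnable parameters.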
