Paper Title

Traffic-Twitter Transformer: A Nature Language Processing-joined Framework For Network-wide Traffic Forecasting

Paper Authors

Meng-Ju Tsai, Zhiyong Cui, Hao Yang, Cole Kopca, Sophie Tien, Yinhai Wang

Paper Abstract

With accurate and timely traffic forecasting, impacted traffic conditions can be predicted in advance to guide agencies and residents in responding appropriately to changes in traffic patterns. However, existing work on traffic forecasting mainly relied on historical traffic patterns and was confined to short-term prediction, for instance under one hour. To better manage future roadway capacity and accommodate social and human impacts, it is crucial to propose a flexible and comprehensive framework to predict physical-aware long-term traffic conditions for public users and transportation agencies. In this paper, the gap of robust long-term traffic forecasting was bridged by taking social media features into consideration. A correlation study and a linear regression model were first implemented to evaluate the significance of the correlation between two time-series data, traffic intensity and Twitter data intensity. The two time series were then fed into our proposed social-aware framework, Traffic-Twitter Transformer, which integrates natural language representations into time-series records for long-term traffic prediction. Experimental results in the Greater Seattle Area showed that our proposed model outperformed baseline models on all evaluation metrics. This NLP-joined, social-aware framework can become a valuable tool for network-wide traffic prediction and management for traffic agencies.
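As a rough illustration of the preliminary analysis described in the abstract, the minimal Python sketch below computes the Pearson correlation and fits a simple linear regression between a traffic-intensity series and a Twitter-intensity series. The synthetic hourly data and variable names are illustrative assumptions, not the authors' actual dataset or code.

```python
# Minimal sketch of the correlation study and linear regression step:
# compare hourly traffic intensity with hourly Twitter post intensity.
# All data here are synthetic and for illustration only.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical hourly series over one week (168 hours) with a daily cycle.
hours = np.arange(168)
traffic_intensity = 500 + 200 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 30, 168)
twitter_intensity = 50 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 10, 168)

# Significance of the correlation between the two time series.
r, p_value = pearsonr(traffic_intensity, twitter_intensity)
print(f"Pearson r = {r:.3f}, p-value = {p_value:.3g}")

# Simple linear regression of traffic intensity on Twitter intensity,
# mirroring the abstract's second preliminary check.
X = twitter_intensity.reshape(-1, 1)
reg = LinearRegression().fit(X, traffic_intensity)
print(f"slope = {reg.coef_[0]:.3f}, intercept = {reg.intercept_:.1f}, "
      f"R^2 = {reg.score(X, traffic_intensity):.3f}")
```

A strong, statistically significant correlation in such a check would motivate feeding both series into the Traffic-Twitter Transformer for long-term forecasting, as the abstract describes.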
