Traffic Flow Prediction Based on the Attention-Conv1D-2Bi-LSTM Model

Authors: ZHANG Yu; LIU De-bin; DAI Zhi-min; YANG Zi-lan (Information College, Lijiang Cultural Tourism College, Lijiang 674199, Yunnan, China; Center for Data Science, Peking University, Beijing 100871, China; College of Mathematical Sciences, Peking University, Beijing 100871, China; College of Foundation, Xi'an University of Technology, Xi'an 710021, Shaanxi, China)

Affiliations: [1] Information College, Lijiang Cultural Tourism College, Lijiang 674199, Yunnan, China; [2] Center for Data Science, Peking University, Beijing 100871, China; [3] College of Mathematical Sciences, Peking University, Beijing 100871, China; [4] College of Foundation, Xi'an University of Technology, Xi'an 710021, Shaanxi, China

Source: Computer Simulation (《计算机仿真》), 2025, No. 2, pp. 181-186 (6 pages)

Funding: Key Program of the National Natural Science Foundation of China (11831002); Scientific Research Fund of the Yunnan Provincial Department of Education (2019J0245, 2022J1217)

Abstract: In intelligent transportation, real-time and accurate traffic flow prediction is crucial for citizens' travel and government management. To address the poor performance of existing intelligent traffic prediction, this paper proposes a traffic flow prediction model that combines a one-dimensional convolutional module and a two-layer bidirectional long short-term memory (Bi-LSTM) network with an attention mechanism. The model uses the one-dimensional convolutional module and the two-layer Bi-LSTM module to extract the spatio-temporal features of traffic flow and the periodic features of its temporal dependencies, while the attention mechanism weighs the influence of traffic flow at different time steps. Experimental results show that the proposed model outperforms the comparison models, indicating that it improves the accuracy of traffic flow prediction to a certain extent.
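The abstract only outlines the architecture (a Conv1D feature extractor, two stacked Bi-LSTM layers, and attention over time steps). Below is a minimal sketch of that kind of model in Keras; the window length, filter counts, hidden sizes, and the additive-attention formulation are illustrative assumptions, not the paper's actual settings.

# Minimal sketch of an Attention-Conv1D-2Bi-LSTM regressor for traffic flow.
# All hyperparameters and the additive attention form are assumptions for
# illustration; they are not taken from the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

TIME_STEPS = 12   # assumed: predict from the previous 12 flow observations
N_FEATURES = 1    # assumed: univariate flow series

inputs = layers.Input(shape=(TIME_STEPS, N_FEATURES))

# Conv1D module: extracts local patterns from the flow sequence.
x = layers.Conv1D(filters=64, kernel_size=3, padding="same", activation="relu")(inputs)

# Two stacked bidirectional LSTM layers: forward/backward temporal dependencies.
x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)
x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)

# Attention over time steps: score each hidden state, softmax into weights,
# and sum the weighted states into a single context vector.
scores = layers.Dense(1, activation="tanh")(x)        # (batch, T, 1)
weights = layers.Softmax(axis=1)(scores)              # weights over the T steps
context = layers.Lambda(
    lambda t: tf.reduce_sum(t[0] * t[1], axis=1)      # (batch, 128)
)([x, weights])

outputs = layers.Dense(1)(context)                    # next-interval flow estimate

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
model.summary()

Training would then follow the usual sliding-window setup: model.fit(X, y, ...) with X shaped (samples, TIME_STEPS, N_FEATURES) and y holding the flow value of the interval to predict.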

Keywords: attention mechanism; one-dimensional convolution module; recurrent neural network; traffic prediction model

CLC Number: TP391.9 [Automation and Computer Technology - Computer Application Technology]

 
