Authors: Guo Yingshi[1], Zhang Ruibin, Chen Yuanhua[2], Li Tianming[2], Jiang Chunyan[2]
Affiliations: [1] Chang'an University, Xi'an 710064; [2] Guilin University of Aerospace Technology, Guilin 541004
Source: Automobile Technology (《汽车技术》), 2022, No. 3, pp. 21-27 (7 pages)
Funding: National Key R&D Program of China (2019YFB1600500); National Natural Science Foundation of China (51775053, 51908054); Natural Science Foundation of Guangxi (2020GXNSFAA159071); Basic Ability Enhancement Program for Young and Middle-aged Teachers of Guangxi Universities (2019KY0819, 2020KY21014, 2021KY0795); Special Project of the Guangxi Association for Science and Technology for Young Scientific and Technological Workers (桂科协[2020]ZC-30).
Abstract: To address the problem that traditional algorithms cannot meet the trajectory-prediction requirements of autonomous vehicles for surrounding moving vehicles in complex traffic scenes, this article proposes a vehicle trajectory prediction method based on the latent features of observation data and a Bi-directional Long Short-Term Memory (BiLSTM) network. First, a one-dimensional convolutional neural network (1DCNN) is used to extract latent features from the vehicle motion-state observation data acquired by sensors; then, the feature vectors constructed as sequences with spatiotemporal relationships serve as the input to the BiLSTM network; finally, the constructed 1DCNN-BiLSTM model is trained on vehicle motion data to form the expected input-output mapping and thereby predict the vehicle's trajectory. Test results show that 1DCNN-BiLSTM processes sequence data more accurately and effectively than traditional methods and is highly robust in vehicle trajectory prediction.
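To make the pipeline described in the abstract concrete, below is a minimal PyTorch sketch of a 1DCNN-BiLSTM trajectory predictor: a 1D convolution extracts local latent features along the time axis of the observation sequence, a bidirectional LSTM reads the resulting feature sequence, and a linear head maps it to future positions. All layer sizes, feature counts, and the prediction horizon are illustrative assumptions, not values taken from the paper.

```python
import torch
import torch.nn as nn

# Sketch only; layer sizes and the (x, y) output format are assumptions.
# input:  observation windows, shape (batch, seq_len, n_features),
#         e.g. per-frame vehicle state (position, speed, heading, ...)
# output: predicted future positions, shape (batch, horizon, 2)
class CNN1DBiLSTM(nn.Module):
    def __init__(self, n_features=6, conv_channels=32, hidden=64, horizon=10):
        super().__init__()
        # 1D convolution over the time axis extracts latent local features
        # from the raw observation sequence.
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, conv_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Bidirectional LSTM consumes the feature sequence in both directions.
        self.bilstm = nn.LSTM(conv_channels, hidden, batch_first=True,
                              bidirectional=True)
        # Map the final hidden state to a flat (horizon x 2) trajectory.
        self.head = nn.Linear(2 * hidden, horizon * 2)
        self.horizon = horizon

    def forward(self, x):            # x: (batch, seq_len, n_features)
        x = x.transpose(1, 2)        # -> (batch, n_features, seq_len)
        x = self.conv(x)             # -> (batch, conv_channels, seq_len)
        x = x.transpose(1, 2)        # -> (batch, seq_len, conv_channels)
        out, _ = self.bilstm(x)      # -> (batch, seq_len, 2 * hidden)
        last = out[:, -1, :]         # summary of the whole window
        return self.head(last).view(-1, self.horizon, 2)

# Usage: predict 10 future (x, y) points from a 20-frame observation window.
model = CNN1DBiLSTM()
obs = torch.randn(8, 20, 6)          # dummy batch of sensor sequences
pred = model(obs)                    # shape: (8, 10, 2)
```

Training such a model against recorded trajectories (e.g. with an MSE loss on the predicted positions) would realize the input-output mapping the abstract describes.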