Authors: WANG Zexue (王泽学); WAN Qidong (万启东); QIN Yangmei (秦杨梅); FAN Senqing (樊森清); XIAO Zeyi (肖泽仪) (School of Chemical Engineering, Sichuan University, Chengdu 610065, China)
Source: Electronic Science and Technology (《电子科技》), 2020, No. 9, pp. 44-49 (6 pages)
Funding: Sichuan Province Work Safety Science and Technology Project (Scaqjgstp 2016011).
Abstract: To address the problem that pedestrians are vulnerable to vehicle impact and lack active protection, this study proposes an intelligent wearable device, including a radar module, to protect pedestrians from vehicle collisions. On this basis, a safety warning algorithm based on fuzzy comprehensive evaluation is proposed. From the pedestrian's perspective, radar-detected vehicle data, local road traffic conditions, weather, pedestrian status, and other influencing factors are taken as evaluation indicators. To improve the accuracy and adaptability of the algorithm, a method based on a BP neural network and multi-agent reinforcement learning is proposed to assign dynamic weights to each indicator of the fuzzy comprehensive evaluation. Simulation results show that the warning accuracy of the proposed algorithm is more than 55% higher than that of fixed-weight methods such as AHP, and its learning efficiency is nearly 28 times that of single-agent reinforcement learning, indicating that the intelligent wearable device can effectively predict and warn of vehicle impacts on pedestrians.
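To make the evaluation step concrete, below is a minimal sketch of the standard fuzzy-comprehensive-evaluation synthesis the abstract refers to: a weight vector over the indicators is combined with a membership matrix over risk levels, and the level with the largest composite membership is reported. All indicator names, membership values, and weights here are hypothetical illustrations, not the paper's actual data; in the paper the weights would come dynamically from the BP network / multi-agent RL component rather than being fixed.

```python
import numpy as np

def fuzzy_evaluate(weights, membership):
    """Weighted fuzzy synthesis B = w . R, then pick the max-membership level.

    weights    : importance of each indicator (normalized internally)
    membership : rows = indicators, columns = membership degrees over
                 the risk levels (safe, caution, danger)
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # normalize so weights sum to 1
    b = w @ np.asarray(membership, float) # composite membership vector B
    levels = ["safe", "caution", "danger"]
    return levels[int(np.argmax(b))], b   # max-membership decision rule

# Four example indicators matching those named in the abstract;
# the numbers are invented for illustration only.
R = [
    [0.1, 0.3, 0.6],  # radar-derived vehicle threat
    [0.5, 0.3, 0.2],  # local road traffic condition
    [0.7, 0.2, 0.1],  # weather
    [0.4, 0.4, 0.2],  # pedestrian state
]
w = [0.5, 0.2, 0.1, 0.2]  # stand-in for the dynamically learned weights

level, b = fuzzy_evaluate(w, R)
print(level, b.round(3))  # prints: danger [0.3  0.31 0.39]
```

Because each row of R sums to 1 and the weights are normalized, the composite vector B also sums to 1, so it reads directly as a distribution over risk levels; the learned-weight scheme in the paper would change w per situation while this synthesis step stays the same.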
Keywords: multi-agent reinforcement learning; dangerous-vehicle early warning; active protection; intelligent wearable device; warning algorithm; fuzzy comprehensive evaluation
Classification codes: TN957.524 [Electronics & Telecommunications — Signal and Information Processing]; TP212.9 [Electronics & Telecommunications — Information and Communication Engineering]