Authors: Zhao Yongtao (赵永涛) [1], Chen Qingkui (陈庆奎) [1,2], Fang Yuling (方玉玲) [2], Zhao Deyu (赵德玉) [2], Ji Lina (姬丽娜) [1]
Affiliations: [1] School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China; [2] School of Management, University of Shanghai for Science and Technology, Shanghai 200093, China
Source: Journal of Computer Applications, 2017, No. 1, pp. 134-137, 144 (5 pages)
Funding: National Natural Science Foundation of China (61572325, 60970012); Shanghai Key Science and Technology Research Project (14511107902); Shanghai Engineering Center Construction Project (GCZX14014); Shanghai First-Class Discipline Construction Project (XTKX2012)
Abstract: To improve driving safety, a computer vision-based method for inter-vehicle distance estimation and overtaking detection was proposed. First, a vehicle shadow detection method was applied to locate the vehicles ahead, and an inter-vehicle distance estimation function was built from the distance between the shadow position and the vision center of the frame. Second, estimation equations for the non-threatening background optical flow were built; by judging the measured optical flow against these equations, abnormal objects could be separated from normally moving ones, so that overtaking events during driving could be recognized. Based on the estimated distance and the detected overtaking events, the driver can be warned in time of potential safety hazards. The experimental results show that the proposed method estimates inter-vehicle distance and detects overtaking events fairly accurately. Finally, an NVIDIA GeForce GTX 680 GPU (Graphics Processing Unit) was used to accelerate the algorithm on the CUDA (Compute Unified Device Architecture) platform, achieving a processing speed of 48.9 ms per frame, which basically meets the demands of real-time processing.
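The abstract does not give the paper's actual distance estimation function, but the described idea (mapping the image-row offset between the detected shadow and the vision center to a road distance) matches the standard flat-road pinhole-camera model. The sketch below is a minimal illustration under that assumption; the function name, the calibration parameters (focal length in pixels, camera mounting height), and the flat-ground assumption are all hypothetical and not taken from the paper.

```python
def estimate_distance(shadow_row: float, center_row: float,
                      focal_px: float, cam_height_m: float) -> float:
    """Hypothetical flat-road pinhole model: a point on the road surface
    imaged `dv` pixel rows below the vision center (horizon) lies at
    ground distance f * h / dv from the camera.

    shadow_row   -- image row of the detected vehicle shadow (pixels)
    center_row   -- row of the vision center / horizon line (pixels)
    focal_px     -- camera focal length expressed in pixels
    cam_height_m -- camera mounting height above the road (meters)
    """
    dv = shadow_row - center_row
    if dv <= 0:
        # A shadow at or above the horizon cannot be on the road plane.
        raise ValueError("shadow must lie below the vision center")
    return focal_px * cam_height_m / dv


# Example: shadow 240 rows below the horizon, f = 800 px, camera at 1.2 m
# gives an estimated gap of 800 * 1.2 / 240 = 4.0 m to the lead vehicle.
print(estimate_distance(600.0, 360.0, 800.0, 1.2))
```

Note that the estimated distance grows quickly as the shadow row approaches the horizon, so small detection errors matter most for far vehicles; a real system would calibrate `focal_px` and `center_row` per camera.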
Keywords: driving safety; inter-vehicle distance estimation; overtaking detection; driving monitoring; safety warning; Compute Unified Device Architecture (CUDA)
Classification: TP751.1 [Automation and Computer Technology — Detection Technology and Automatic Equipment]