Source: Journal of Xi'an Jiaotong University (《西安交通大学学报》), 2009, Issue 4, pp. 116-120 (5 pages)
Funding: National Natural Science Foundation of China project (50522204)
Abstract: To quantify the otherwise unmeasurable collision-avoidance behaviour of vehicles in traffic conflicts, a video analysis method based on full information matching (FIM) is proposed. Its core is to obtain the precise position of the designated vehicle by image processing and, from that position, to calculate transient information such as velocity, acceleration, course angle and front-wheel steering angle, from which the driver's speed-change and steering actions can be estimated. All measurements are taken without the driver's awareness. Full information matching combined with sub-pixel estimation eliminates the position offsets caused by contour edge changes due to sunlight shadows and compensates for the camera's limited resolution, so that vehicle position is resolved to a precision of 0.1 pixel. A calibrated perspective transformation removes the influence of the shooting angle on the results. In a field experiment, speed-change and steering tests were carried out on the subject vehicle; the driver actions obtained by analysing the high-altitude imagery agree completely with the manually recorded actions, and slight steering actions that were not recorded manually but that the driver confirmed making were also detected.
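The abstract outlines a pipeline of template matching with sub-pixel refinement, a calibrated perspective transform to ground coordinates, and finite differencing to recover speed, acceleration and course angle. The following minimal Python/OpenCV sketch illustrates that pipeline under stated assumptions; it is not the authors' FIM algorithm, and the function names, reference points and parabolic peak refinement are illustrative assumptions only.

import cv2
import numpy as np

def subpixel_peak(resp, x, y):
    """Refine an integer correlation peak to sub-pixel accuracy with a
    1-D parabolic fit along each axis (a common approximation, not the
    paper's estimator)."""
    def refine(m1, c, p1):
        denom = m1 - 2.0 * c + p1
        return 0.0 if denom == 0 else 0.5 * (m1 - p1) / denom
    dx = refine(resp[y, x - 1], resp[y, x], resp[y, x + 1])
    dy = refine(resp[y - 1, x], resp[y, x], resp[y + 1, x])
    return x + dx, y + dy

def locate_vehicle(frame_gray, template_gray):
    """Locate the vehicle template by normalized cross-correlation and
    return its sub-pixel position in image coordinates."""
    resp = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, _, _, (x, y) = cv2.minMaxLoc(resp)
    return subpixel_peak(resp, x, y)

# Calibrated perspective (homography) mapping from image pixels to ground
# coordinates, estimated from four surveyed road markings (values assumed).
img_pts = np.float32([[100, 200], [540, 210], [520, 460], [80, 450]])  # pixels
gnd_pts = np.float32([[0, 0], [20, 0], [20, 10], [0, 10]])             # metres
H = cv2.getPerspectiveTransform(img_pts, gnd_pts)

def to_ground(pt):
    """Map one image point to ground coordinates with the homography."""
    p = cv2.perspectiveTransform(np.float32([[pt]]), H)
    return p[0, 0]

def kinematics(positions, dt):
    """From ground positions sampled every dt seconds, finite differences
    give speed, acceleration and course angle of the vehicle."""
    pos = np.asarray(positions)
    vel = np.diff(pos, axis=0) / dt
    speed = np.linalg.norm(vel, axis=1)
    accel = np.diff(speed) / dt
    course = np.degrees(np.arctan2(vel[:, 1], vel[:, 0]))
    return speed, accel, course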
Classification code: U491 [Transportation Engineering - Transportation Planning and Management]