Authors: HE Weiye; JIN Guang (Faculty of Electrical Engineering and Computer Science, Ningbo University, Ningbo 315211, China)
Affiliation: [1] Faculty of Information Science and Engineering, Ningbo University, Ningbo 315211, Zhejiang, China
Source: Journal of Ningbo University (Natural Science and Engineering Edition), 2021, No. 6, pp. 55-60.
Funding: Ningbo Natural Science Foundation (202003N4085).
Abstract: The rapid development of computer vision has made it possible to use vision techniques to assist the navigation of unmanned surface vessels (USV). During cruising, obtaining the heading of the hull is an indispensable basis for navigation control. Feature matching is an important part of USV-related vision technology and a key step in functions such as target recognition and positioning. The basic step in obtaining the motion posture of a USV is to effectively extract and match features between consecutive image frames. To address the slow speed and low accuracy of static image feature extraction in water environments, the authors propose a navigation attitude angle extraction (NAAE) method. First, the image is preprocessed and features are extracted from the effective region. Second, an initial feature matching strategy based on descriptor similarity is designed. Third, the feature matching pairs are filtered to optimize the model parameters. Finally, the navigation attitude angle is calculated from the rotation matrix between consecutive frames. Experiments show that this method can effectively extract the navigational attitude angle of the USV.
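The pipeline's final step recovers the heading change from a rotation matrix relating matched keypoints in consecutive frames. The paper does not specify its solver, so the following is only an illustrative sketch: a standard 2D Kabsch/Procrustes fit, which estimates the rotation between two already-matched, already-filtered point sets and reads the attitude (yaw) angle off the resulting matrix. The function name and the use of NumPy are assumptions for illustration.

```python
import numpy as np

def estimate_rotation_angle(prev_pts, curr_pts):
    """Estimate the 2D rotation angle (radians) between two matched
    keypoint sets (rows are (x, y) points) via the Kabsch method.
    This stands in for the paper's rotation-matrix step; the actual
    NAAE solver is not given in the abstract."""
    P = np.asarray(prev_pts, dtype=float)
    Q = np.asarray(curr_pts, dtype=float)
    # Center both point sets to factor out the translation component.
    P = P - P.mean(axis=0)
    Q = Q - Q.mean(axis=0)
    # SVD of the cross-covariance yields the least-squares rotation.
    H = P.T @ Q
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T  # guard against reflections
    # For R = [[cos t, -sin t], [sin t, cos t]], the heading change is t.
    return np.arctan2(R[1, 0], R[0, 0])
```

Applied per frame pair, the returned angle can be accumulated into the vessel's heading; in practice it would follow the abstract's earlier steps (descriptor-similarity matching and match filtering) so that only inlier pairs enter the fit.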
Classification: TP399 (Automation and Computer Technology: Computer Application Technology)