Authors: LIU Xiaofei [1,2]; LI Mingjie (School of Information & Intelligence Engineering, Sanya University, Sanya, Hainan 572000, China; Academician Workstation of Guoliang Chen, Sanya, Hainan 572000, China)
Affiliations: [1] School of Information and Intelligence Engineering, Sanya University, Sanya, Hainan 572000, China; [2] Academician Workstation of Guoliang Chen, Sanya, Hainan 572000, China
Source: Laser Journal, 2023, No. 5, pp. 242-246 (5 pages)
Funding: Hainan Provincial Natural Science Foundation (No. 622RC734; No. 621RC1077)
Abstract: To improve the accuracy of low-altitude UAV recognition of fishing-vessel targets, a fishing-vessel target recognition method based on laser vision sensing is proposed. Threshold segmentation and corner localization are used to calibrate the azimuth feature points in the laser vision sensing image of the fishing-vessel target. The position, velocity, acceleration, and motion-trajectory information of the target are extracted, and a background-difference detection model of the laser vision sensing image is established. The distance between adjacent target centroids is calculated through frame-by-frame dynamic detection and clustering of the difference images. Based on parameter estimation and pixel gray-value detection, combined with target azimuth estimation, localization and recognition of the fishing-vessel target in the laser vision sensing image are realized. The simulation results show that the proposed method can effectively identify fishing-vessel targets, with a target-azimuth recognition accuracy above 90.95%, improving the accuracy of low-altitude UAV recognition of fishing vessels.
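The background-difference and centroid-distance steps summarized in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the frame data, threshold value, and helper functions below are hypothetical, chosen only to show how a difference image is thresholded into a foreground mask and how the distance between two target centroids is computed.

```python
# Hypothetical sketch of background-difference detection with threshold
# segmentation, followed by the distance between two target centroids.
import numpy as np

def background_difference(frame, background, threshold):
    """Binary foreground mask: pixels where |frame - background| > threshold."""
    diff = np.abs(frame.astype(np.int32) - background.astype(np.int32))
    return (diff > threshold).astype(np.uint8)

def centroid(mask):
    """Centroid (row, col) of the nonzero pixels in a binary mask."""
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Toy 8x8 "frames": a static background and a frame with two bright targets.
background = np.zeros((8, 8), dtype=np.uint8)
frame = background.copy()
frame[1:3, 1:3] = 200   # target A
frame[5:7, 5:7] = 180   # target B

mask = background_difference(frame, background, threshold=50)

# Here the two targets are separated by hand into quadrants; a real pipeline
# would cluster the difference image into connected components first.
c_a = centroid(mask[:4, :4])
c_b_local = centroid(mask[4:, 4:])
c_b = (c_b_local[0] + 4, c_b_local[1] + 4)  # back to global coordinates

dist = np.hypot(c_a[0] - c_b[0], c_a[1] - c_b[1])
print(round(float(dist), 2))  # Euclidean distance between the two centroids
```

In the method described above, this inter-centroid distance would be computed between adjacent targets found by differential-image clustering rather than by the fixed quadrant split used in this toy example.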
Classification: TN911 [Electronics and Telecommunications: Communication and Information Systems]