Authors: Xiong Qiangqiang [1]; Zhao Xu [2]
Affiliations: [1] School of Electronics and Information, Nanchang Institute of Technology, Nanchang 330013, Jiangxi, China; [2] School of Mechanical and Electrical Engineering, Hainan Vocational University of Science and Technology, Haikou 571126, Hainan, China
Source: Applied Laser (《应用激光》), 2023, No. 5, pp. 94-98 (5 pages)
Funding: Ministry of Education Basic Course Teaching Reform Project for Emerging Engineering Education (E2127); Ministry of Education Industry-University Collaborative Education Project (202101018022); Science and Technology Research Project of the Jiangxi Provincial Department of Education (GJJ2202718); Jiangxi Higher Education Teaching Reform Research Project (JXJG-21-25-7).
Abstract: To further improve the accuracy of depth measurements made by UAVs in outdoor scenes, a laser-sensor-based visual depth estimation method for binocular UAVs in outdoor scenes is proposed. The echo signal equation is used to decompose the UAV's echo pulse signal, resolving the superimposed-echo problem that arises when the laser sensor receives the return signal. Based on the laser imaging principle of the sensor, small-target detection imaging is performed on the UAV's echo signal, which solves the difficulty of extracting small targets whose waveforms are covered by those of adjacent targets. A deconvolutional neural network is then used to reconstruct the image network, and the scope of the Skip connections is redefined to seamlessly stitch the extracted image features, thereby realizing visual depth estimation of outdoor scenes for the binocular UAV. Experimental results show that when this method is used to estimate outdoor scene depth, the measured depth values are close to the standard measurement curve, the proportion of valid disparity pixels remains above 50%, and the depth estimation evaluation metrics are all better than those of the comparison methods.
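The record itself contains no code, but the final step of the abstract rests on the standard binocular disparity-to-depth relation Z = f * B / d, and the results quote a valid-disparity-pixel ratio above 50%. The Python sketch below illustrates those two quantities only; the focal length, baseline, disparity threshold, and synthetic disparity map are illustrative assumptions, not values or code from the paper.

# Hedged sketch: standard binocular disparity-to-depth conversion and the
# valid-disparity-pixel ratio mentioned in the results. All parameter values
# are assumptions for illustration, not taken from the paper.
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m, min_disp=1e-6):
    """Per-pixel depth Z = f * B / d from a disparity map given in pixels."""
    d = np.maximum(disparity_px, min_disp)   # guard against division by zero
    return focal_px * baseline_m / d         # depth in metres

def valid_disparity_ratio(disparity_px, min_disp=0.5):
    """Fraction of pixels with a usable disparity estimate."""
    return float(np.mean(disparity_px > min_disp))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    disp = rng.uniform(0.0, 40.0, size=(480, 640))   # synthetic disparity map
    depth = disparity_to_depth(disp, focal_px=700.0, baseline_m=0.12)
    print("median depth (m):", np.median(depth))
    print("valid-disparity ratio:", valid_disparity_ratio(disp))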
Keywords: laser sensor; binocular UAV; outdoor scene; visual depth estimation; laser imaging
Classification: TN249 [Electronics and Telecommunications - Physical Electronics]