Authors: LIU Tong, SONG Jia-le, ZHANG Zi-lin, SHU Han-da (North China Research Institute of Electro-Optics, Beijing 100015, China)
Source: Laser & Infrared, 2024, No. 2, pp. 274-280 (7 pages)
Abstract: With the rapid development of unmanned aerial system technology, distributed airborne image stitching has become a prominent research area. To address the large parallax and complex spatial geometric transformations encountered in distributed airborne image stitching, this paper proposes an improved algorithm based on the APAP image stitching algorithm. The algorithm employs deformation processing, smooth extrapolation of the linearized homography to a global transformation, and mesh division, which effectively eliminates blurred ghosting, reduces projection distortion at the image edges, and improves running efficiency. In experiments across multiple scenes, the improved algorithm yields smaller alignment errors and better image quality metrics, including root mean square error, peak signal-to-noise ratio, structural similarity, and image entropy. For large-scale stitching, it combines 154 images into a 10 k × 10 k high-resolution panoramic image in 138 s. The improved algorithm therefore has substantial practical value for distributed airborne image stitching applications.
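The mesh-division and locally weighted homography steps described above follow the as-projective-as-possible (moving DLT) idea: each mesh cell gets its own homography, estimated from all correspondences but with weights that favor nearby points. The sketch below is a minimal Python illustration of that idea, not the paper's implementation; the grid size, sigma, gamma, and the synthetic correspondences are illustrative assumptions.

```python
# Minimal sketch of moving-DLT (APAP-style) per-cell homography estimation,
# assuming matched point pairs are already available (e.g. from SIFT + RANSAC).
import numpy as np

def dlt_rows(src, dst):
    """Build the 2N x 9 direct linear transform matrix for point pairs."""
    x, y = src[:, 0], src[:, 1]
    u, v = dst[:, 0], dst[:, 1]
    z = np.zeros_like(x)
    o = np.ones_like(x)
    row1 = np.stack([z, z, z, -x, -y, -o, v * x, v * y, v], axis=1)
    row2 = np.stack([x, y, o, z, z, z, -u * x, -u * y, -u], axis=1)
    return np.concatenate([row1, row2], axis=0)

def moving_dlt_homography(src, dst, center, sigma=50.0, gamma=0.02):
    """Weighted DLT: correspondences near `center` dominate the local homography."""
    w = np.exp(-np.sum((src - center) ** 2, axis=1) / sigma ** 2)
    w = np.maximum(w, gamma)                 # gamma keeps a global fallback weight
    A = dlt_rows(src, dst)
    W = np.concatenate([w, w])[:, None]      # each pair contributes two rows of A
    _, _, vt = np.linalg.svd(W * A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def gridwise_homographies(src, dst, width, height, cells=8):
    """One local homography per mesh cell, as in mesh-division warping."""
    xs = np.linspace(0, width, cells + 1)
    ys = np.linspace(0, height, cells + 1)
    Hs = {}
    for i in range(cells):
        for j in range(cells):
            center = np.array([(xs[i] + xs[i + 1]) / 2,
                               (ys[j] + ys[j + 1]) / 2])
            Hs[(i, j)] = moving_dlt_homography(src, dst, center)
    return Hs

# Toy usage with synthetic correspondences (a known global homography plus noise).
rng = np.random.default_rng(0)
src = rng.uniform(0, 640, size=(200, 2))
H_true = np.array([[1.02, 0.01, 5.0], [0.00, 0.98, -3.0], [1e-5, 0.0, 1.0]])
proj = np.c_[src, np.ones(len(src))] @ H_true.T
dst = proj[:, :2] / proj[:, 2:3] + rng.normal(0, 0.5, size=(200, 2))
local_Hs = gridwise_homographies(src, dst, width=640, height=480)
print(len(local_Hs), "local homographies estimated")
```

Because each cell's warp is dominated by nearby correspondences while gamma preserves a global tendency, the mesh-based warp can follow local parallax that a single global homography cannot, which is what suppresses blurred ghosting in the overlap region.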