Authors: CHEN Xuyang [1]; HE Yuyao [1]; ZONG Ruiliang [2]; LI Baoqi [1]; ZHAO Yaohua
Affiliations: [1] School of Marine Science and Technology, Northwestern Polytechnical University, Xi'an 710072, China; [2] School of Electronic and Information, Northwestern Polytechnical University, Xi'an 710072, China
Source: Journal of Northwestern Polytechnical University, 2019, No. 3, pp. 471-478 (8 pages)
Funding: Supported by the National Natural Science Foundation of China (Grant No. 61271143)
Abstract: To address the image distortion caused by light refraction during underwater imaging, and the conversion error that existing image conversion algorithms incur by neglecting the secondary refraction of light, an underwater image conversion algorithm based on a light refraction model is proposed. The algorithm first obtains the pixel information of the underwater image, then computes, via a mapping relationship, the corresponding coordinates of each pixel in the equivalent air image, and finally obtains the equivalent air image through image interpolation. Experimental results show that, compared with existing image conversion algorithms, the proposed algorithm reduces the mean conversion error in the u direction from 2.2895 to 1.2133 (a decrease of 47.01%) and in the v direction from 3.2525 to 1.5263 (a decrease of 53.07%). Meanwhile, the mean ranging error is reduced from 58.83 mm to 28.88 mm, a decrease of 50.91%.
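The core idea of the abstract, mapping each underwater pixel to its coordinates in an equivalent air image via a refraction model, can be sketched for the simplest case of a single flat water-air interface using Snell's law. This is a minimal illustration, not the paper's algorithm: the paper additionally models the secondary refraction (e.g. through the housing port), and all function names, intrinsic parameters (fx, fy, cx, cy), and refractive-index values below are illustrative assumptions.

```python
import math

# Illustrative refractive indices; the paper's model also accounts for the
# secondary refraction at the housing port, which is omitted in this sketch.
N_WATER = 1.333
N_AIR = 1.0

def underwater_to_air_pixel(u, v, fx, fy, cx, cy):
    """Map a pixel (u, v) of the underwater image to its coordinates in the
    equivalent air image, assuming a flat-port, single-refraction model with
    pinhole intrinsics (fx, fy, cx, cy).

    The camera observes the ray after refraction; Snell's law
    n_air * sin(theta_true) = n_water is NOT the relation used here -- the
    correct single-interface relation is
        n_water * sin(theta_true) = n_air * sin(theta_cam)
    rearranged so the true (in-air-equivalent) angle is smaller than the
    observed one, which shifts off-center pixels toward the principal point.
    """
    # Back-project the pixel to normalized image coordinates.
    x = (u - cx) / fx
    y = (v - cy) / fy
    r = math.hypot(x, y)              # radial distance from principal point
    if r == 0.0:
        return (float(cx), float(cy))  # the optical axis is unrefracted
    theta_cam = math.atan(r)          # observed ray angle inside the camera
    # Snell's law: recover the unrefracted (equivalent-air) ray angle.
    theta_true = math.asin(N_AIR / N_WATER * math.sin(theta_cam))
    # Rescale the radial component; the azimuth is unchanged.
    scale = math.tan(theta_true) / r
    return (cx + fx * x * scale, cy + fy * y * scale)
```

In this sketch the equivalent air image is obtained by evaluating the mapping at every pixel and resampling; the abstract's interpolation step corresponds to filling the output grid from these scattered mapped coordinates.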
Classification Code: TP391 [Automation and Computer Technology: Computer Application Technology]