Authors: CHENG Shuang; CHEN Lei (Haibin College, Beijing Jiaotong University, Huanghua, Hebei 061199, China)
Source: Laser Journal (激光杂志), 2020, Issue 9, pp. 160-164 (5 pages)
Funding: Youth Fund for Science and Technology Research in Higher Education Institutions of Hebei Province (No. QN2017401); Hebei Province Higher Education Teaching Reform Research and Practice Project (No. 2018GJJG649).
Abstract: To optimize the quality of optical interference fringes, a quality-improvement method based on an improved Retinex algorithm, built on the traditional Retinex algorithm, is proposed. First, a denoising method based on HSSIM and a residual-ratio threshold removes the noise in the optical interference fringe image. Then, for the denoised image, an enhancement model based on the multi-scale Retinex algorithm achieves multi-scale enhancement of fringe quality. Experimental results show that, after processing with the proposed method, noise points near the optical interference fringes are removed, contrast is increased, and fringe quality is improved. Compared with other methods, the fringe image enhanced by this method has the highest signal-to-noise ratio, the smallest fringe distortion, and the best denoising performance; the mean fringe clarity reaches 99.57% and fringe completeness reaches 0.98, indicating that the method causes minimal loss to the fringes and has excellent application performance.
Keywords: improved Retinex algorithm; optical interference fringes; quality improvement; noise interference
Classification: TN929 [Electronics and Telecommunications - Communication and Information Systems]
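Note: the abstract above refers to a multi-scale Retinex (MSR) enhancement step applied to the denoised fringe image. For reference only, the following is a minimal sketch of the generic MSR formulation (a weighted sum of log-ratios between the image and Gaussian-blurred illumination estimates at several scales). The scale values, weights, and normalization are illustrative assumptions, not the parameters or the exact enhancement model reported in the paper.

```python
# Minimal sketch of generic multi-scale Retinex (MSR) enhancement.
# Scales and weights are illustrative assumptions, not the paper's settings.
import numpy as np
from scipy.ndimage import gaussian_filter

def multi_scale_retinex(image, sigmas=(15, 80, 250), weights=None):
    """Apply MSR to a single-channel fringe image given as a float array in [0, 1]."""
    img = image.astype(np.float64) + 1e-6            # avoid log(0)
    if weights is None:
        weights = np.full(len(sigmas), 1.0 / len(sigmas))
    msr = np.zeros_like(img)
    for sigma, w in zip(sigmas, weights):
        # Single-scale Retinex: log(image) minus log of Gaussian-estimated illumination
        illumination = gaussian_filter(img, sigma) + 1e-6
        msr += w * (np.log(img) - np.log(illumination))
    # Rescale the result to [0, 1] for display or further measurement
    msr = (msr - msr.min()) / (msr.max() - msr.min() + 1e-12)
    return msr
```

In a pipeline like the one the abstract describes, a denoised fringe image would be passed through such a function before contrast or clarity is measured.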