Authors: YU Xiangwei; XUE Dongjian[1,2]; CHEN Fengjiao (College of Earth Sciences, Chengdu University of Technology, Chengdu 610059, China; Key Lab of Information Technology & Application of Land and Resources, Chengdu University of Technology, Chengdu 610059, China)
Affiliations: [1] College of Earth Sciences, Chengdu University of Technology, Chengdu 610059, China; [2] Key Laboratory of Geoscience Spatial Information Technology, Ministry of Land and Resources, Chengdu University of Technology, Chengdu 610059, China
Source: Remote Sensing Information (《遥感信息》), 2019, No. 5, pp. 120-125 (6 pages)
Funding: National Key R&D Program of China (2018YFC0706003-3); Key Project of the Education Department of Sichuan Province (16ZA0100); Open Fund of the Key Laboratory of Geoscience Spatial Information Technology, Ministry of Land and Resources (KLGSIT2013-06)
Abstract: A new filtering method is proposed to address the problems that measured SAR images are widely submerged by noise and that traditional filters tend to blur edges. On the multi-scale wavelet components of the image, thresholds based on Bayesian theory, set differently for different coefficients and directions, yield the denoised components; these are then fused, with weights, with the wavelet components corresponding to image edges and other structures extracted by multi-scale edge detection, and the result is reconstructed as the output. Comparative experiments on real SAR imagery evaluate the different filtering methods quantitatively using the image mean, the equivalent number of looks, the edge preservation index, the signal-to-noise ratio, and the pixel gray-level curves of characteristic ground objects. The results show that the proposed method suppresses speckle noise in SAR images well and preserves edges effectively.
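The abstract describes the pipeline only at a high level. Below is a minimal Python sketch of one plausible reading of it, assuming a PyWavelets decomposition, a BayesShrink-style threshold computed separately for each detail subband, and a gradient-magnitude edge map used as the fusion weight; the function names, parameters, and fusion rule are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch: wavelet decomposition, per-subband Bayesian-style soft
# thresholding, edge-weighted fusion of original and denoised coefficients,
# and reconstruction. Not the authors' code; names and weights are assumed.
import numpy as np
import pywt
from scipy import ndimage


def bayes_threshold(subband, sigma_noise):
    """BayesShrink-style threshold for one detail subband."""
    sigma_x = np.sqrt(max(np.var(subband) - sigma_noise ** 2, 0.0))
    if sigma_x == 0.0:
        return np.max(np.abs(subband))          # suppress the whole subband
    return sigma_noise ** 2 / sigma_x


def edge_weight(image, shape, sigma=1.0):
    """Normalized gradient-magnitude edge map sampled down to a subband's shape."""
    grad = ndimage.gaussian_gradient_magnitude(image, sigma=sigma)
    grad = grad / (grad.max() + 1e-12)
    rows = np.linspace(0, image.shape[0] - 1, shape[0]).astype(int)
    cols = np.linspace(0, image.shape[1] - 1, shape[1]).astype(int)
    return grad[np.ix_(rows, cols)]


def despeckle(image, wavelet="db4", level=3):
    """Threshold detail coefficients, but lean toward the originals near edges."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Robust noise estimate from the finest diagonal subband (median rule).
    sigma_noise = np.median(np.abs(coeffs[-1][-1])) / 0.6745

    new_coeffs = [coeffs[0]]                    # approximation kept unchanged
    for cH, cV, cD in coeffs[1:]:
        fused = []
        for sub in (cH, cV, cD):
            thr = bayes_threshold(sub, sigma_noise)
            denoised = pywt.threshold(sub, thr, mode="soft")
            w = edge_weight(image, sub.shape)   # weight toward original at edges
            fused.append(w * sub + (1.0 - w) * denoised)
        new_coeffs.append(tuple(fused))
    return pywt.waverec2(new_coeffs, wavelet)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.zeros((128, 128))
    clean[32:96, 32:96] = 1.0
    speckled = clean * rng.gamma(4.0, 0.25, clean.shape)  # unit-mean multiplicative speckle
    print(despeckle(speckled).shape)

In this reading, the edge map stands in for the structure components extracted by multi-scale edge detection in the abstract; the paper may instead extract edges per wavelet scale and use different fusion weights.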
Keywords: SAR image; wavelet transform; Bayesian threshold; edge detection; image denoising
Classification: TP751 [Automation and Computer Technology - Detection Technology and Automation Devices]