Authors: MEI Li-ye; GUO Xiao-peng; ZHANG Jun-hua; GUO Zheng-hong; XIAO Jia (School of Information Science and Engineering, Yunnan University, Kunming 650500, China)
Source: Journal of Yunnan University (Natural Sciences Edition), 2019, No. 1, pp. 18-27 (10 pages)
Funding: National Natural Science Foundation of China (61361010)
Abstract: Traditional fusion methods rely on hand-crafted features and fusion criteria to fulfill the fusion task, and thus fail to exploit other potentially useful information in the source images. To address this defect, we propose a deep learning method based on spatial pyramid pooling (SPP). First, we design a Siamese dual-channel convolutional network and replace the pooling layer with SPP to learn the features of multi-focus images. Then, to train the network effectively, we synthesize a large-scale multi-focus image dataset with ground truth using a Gaussian filter. Given a pair of multi-focus images as input, the trained model generates a score map indicating the focus property of the source images. Moreover, to further enhance the fusion result, we segment the score map into a binary mask, which is refined with morphological operations. Finally, the fused image is obtained by pixel-wise multiplication between the source images and the refined binary mask. Experimental results show that the average quantitative score achieved by the proposed method on the test images is improved by 0.78%.
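The post-processing pipeline described in the abstract (binarize the score map, refine the mask morphologically, then combine the sources by pixel-wise multiplication) can be sketched as below. This is an illustrative reconstruction, not the authors' code: the 3x3 majority filter stands in for the paper's unspecified morphological refinement, and all names and the threshold value are assumptions.

```python
import numpy as np

def fuse_multi_focus(src_a, src_b, score_map, thresh=0.5):
    """Fuse two multi-focus source images using a network-produced score map.

    src_a, src_b : float arrays of equal shape (the two source images)
    score_map    : per-pixel focus scores in [0, 1]; high = src_a in focus
    thresh       : binarization threshold (assumed value, not from the paper)
    """
    # Step 1: segment the score map into a binary mask.
    mask = (score_map > thresh).astype(float)

    # Step 2: morphological-style refinement. A 3x3 majority vote removes
    # isolated misclassified pixels (stand-in for the paper's method).
    padded = np.pad(mask, 1, mode="edge")
    h, w = mask.shape
    votes = sum(padded[i:i + h, j:j + w]
                for i in range(3) for j in range(3))
    mask = (votes >= 5).astype(float)

    # Step 3: pixel-wise multiplication selects in-focus regions
    # from each source and sums them into the fused image.
    return mask * src_a + (1.0 - mask) * src_b
```

A mask value of 1 copies the pixel from the first source and 0 from the second, so the two multiplications partition the fused image between the in-focus regions of each input.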
Keywords: multi-focus image fusion; convolutional neural network; pyramid pooling; morphology; deep learning
Classification: TP391.41 [Automation and Computer Technology: Computer Application Technology]