Author(s): Zhao Heng; An Weisheng [1]; Tian Huaiwen [1] (School of Mechanical Engineering, Southwest Jiaotong University, Chengdu 610031, China)
Affiliation: [1] School of Mechanical Engineering, Southwest Jiaotong University
Source: Application Research of Computers (《计算机应用研究》), 2019, No. 6, pp. 1911-1916 (6 pages)
Funding: Science and Technology Support Program of Sichuan Province (2016GZ0194)
Abstract: Existing algorithms for salient object detection in images with complex backgrounds tend to highlight background regions by mistake. To suppress the background and extract the foreground more accurately, this paper proposes a saliency detection algorithm that combines sparse reconstruction with energy optimization. First, the input image is segmented into superpixels to remove unnecessary detail. Then, the superpixels on the image boundary are selected as background templates and used as a sparse dictionary; the reconstruction error of each superpixel against this dictionary serves as its initial saliency value. Finally, a new energy function is constructed to optimize the initial saliency values, and the foreground of the optimized result is enhanced to generate the final saliency map. The proposed algorithm was compared with 10 other algorithms on the MSRA10K and ECSSD1000 datasets, which provide ground-truth images; its PR curves, precision (P), and F-measure (F) outperform those of the other 10 algorithms. The experimental results show that, for salient object detection in complex-background images, the proposed algorithm is more robust than existing methods, suppresses the background effectively, and extracts salient objects more precisely.
Classification Code: TP391.41 [Automation and Computer Technology - Computer Application Technology]
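
Below is a minimal, hypothetical Python sketch of the pipeline described in the abstract: superpixel segmentation, boundary superpixels as a sparse background dictionary, reconstruction error as the initial saliency, energy optimization, and foreground enhancement. It uses skimage's slic and sklearn's sparse_encode (lasso_lars) as stand-ins for the paper's sparse coding step; the quadratic energy form, the sigmoid foreground enhancement, the function name saliency_sketch, and all parameter values are illustrative assumptions, not the authors' exact formulation.

import numpy as np
from skimage import io, color
from skimage.segmentation import slic
from sklearn.decomposition import sparse_encode


def saliency_sketch(image_path, n_segments=300, alpha=0.05, sigma=0.25, k=10.0):
    # Assumes an RGB input image; drop an alpha channel if present.
    img = io.imread(image_path)[..., :3]
    lab = color.rgb2lab(img)
    h, w = img.shape[:2]

    # 1) Superpixel segmentation to discard unnecessary detail.
    labels = slic(img, n_segments=n_segments, compactness=10, start_label=0)
    n_sp = labels.max() + 1

    # Per-superpixel features: mean Lab colour + normalised centroid.
    feats = np.zeros((n_sp, 5))
    for i in range(n_sp):
        ys, xs = np.nonzero(labels == i)
        feats[i, :3] = lab[labels == i].mean(axis=0) / 100.0
        feats[i, 3:] = [ys.mean() / h, xs.mean() / w]

    # 2) Boundary superpixels serve as the background template, i.e. the
    #    atoms of the sparse dictionary.
    border = np.unique(np.concatenate([labels[0], labels[-1],
                                       labels[:, 0], labels[:, -1]]))
    dictionary = feats[border]                    # (n_atoms, n_features)

    # 3) Sparse reconstruction error against the background dictionary
    #    is taken as the initial saliency of each superpixel.
    codes = sparse_encode(feats, dictionary, algorithm='lasso_lars', alpha=alpha)
    err = np.linalg.norm(feats - codes @ dictionary, axis=1)
    init_sal = (err - err.min()) / (np.ptp(err) + 1e-12)

    # 4) Energy optimisation (a generic quadratic energy, assumed here):
    #    min_s  sum_i bg_i*s_i^2 + fg_i*(s_i - 1)^2 + sum_ij w_ij*(s_i - s_j)^2
    W = np.zeros((n_sp, n_sp))
    pairs = np.concatenate([
        np.stack([labels[:, :-1].ravel(), labels[:, 1:].ravel()], axis=1),
        np.stack([labels[:-1].ravel(), labels[1:].ravel()], axis=1)])
    for a, b in np.unique(pairs, axis=0):
        if a != b:                                # adjacent superpixels only
            W[a, b] = W[b, a] = np.exp(
                -np.sum((feats[a, :3] - feats[b, :3]) ** 2) / (2 * sigma ** 2))
    L = np.diag(W.sum(axis=1)) - W                # graph Laplacian
    fg, bg = init_sal, 1.0 - init_sal
    s = np.linalg.solve(np.diag(bg + fg) + L, fg)

    # 5) Foreground enhancement (sigmoid sharpening, also an assumption).
    s = 1.0 / (1.0 + np.exp(-k * (s - s.mean())))
    s = (s - s.min()) / (np.ptp(s) + 1e-12)
    return s[labels]                              # per-pixel saliency map in [0, 1]

The returned array has the spatial shape of the input image with per-pixel saliency in [0, 1]. The dense adjacency matrix keeps the sketch short; a sparse graph representation would be preferable for large superpixel counts.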