Authors: Peng Yihang (彭一航); Ye Wujian (叶武剑); Liu Yijun (刘怡俊)
Affiliation: [1] School of Information Engineering, Guangdong University of Technology, Guangzhou, Guangdong 510006, China
Source: Laser & Optoelectronics Progress, 2022, No. 2, pp. 273-281 (9 pages)
Funding: Key-Area Research and Development Program of Guangdong Province (2018B030338001, 2018B010107003, 2018B010115002); Innovation Talent Project of the Department of Education of Guangdong Province and the Young Hundred Talents Program of Guangdong University of Technology (220413548).
Abstract: To address the weakness that existing detection algorithms struggle to resist combined attacks, a copy-move forgery recognition algorithm based on mixed features is proposed. Unlike traditional algorithms that rely on a fixed threshold, the proposed algorithm uses a threshold-free similar sub-block extraction method to select sub-blocks with high correlation. Meanwhile, to capture more local information, an adaptive sub-block synthesis scheme is proposed to avoid sub-block aliasing. In addition, since scale-invariant feature transform (SIFT) features cannot distinguish naturally similar regions from tampered regions, the proposed algorithm combines the advantages of moment features and extracts progressive hybrid features from the synthesized sub-blocks to reduce the false alarm rate. Experimental results show that the true positive rate (TPR) and F1 score of the proposed algorithm are 97.2% and 92.9% on the MICC-F2000 dataset and 98.2% and 95.1% on the MICC-F220 dataset, respectively, indicating that the proposed algorithm has good detection ability.
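For orientation only, the following Python sketch (using OpenCV, an assumption; the paper's own implementation is not available here) illustrates the general idea the abstract describes: matching SIFT keypoints of an image against themselves to find duplicated regions, then rejecting accidental matches on naturally similar regions with a moment-based check. It does not reproduce the paper's threshold-free sub-block extraction, adaptive synthesis, or progressive hybrid features; the ratio test stands in for the extraction step, and `ratio`, `min_dist`, `patch`, and `moment_tol` are illustrative values, not the authors' parameters.

```python
# Minimal copy-move forgery detection sketch: SIFT self-matching plus a
# Hu-moment verification step. Illustrative only; NOT the paper's algorithm.
import cv2
import numpy as np

def _patch(img, center, half):
    """Extract a (2*half x 2*half) patch around a point, or None at borders."""
    x, y = int(center[0]), int(center[1])
    h, w = img.shape
    if x - half < 0 or y - half < 0 or x + half > w or y + half > h:
        return None
    return img[y - half:y + half, x - half:x + half]

def detect_copy_move(image_path, ratio=0.6, min_dist=40, patch=16, moment_tol=0.5):
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    kps, desc = sift.detectAndCompute(img, None)
    if desc is None or len(kps) < 3:
        return []

    # Match each keypoint against all keypoints of the SAME image;
    # k=3 so the trivial self-match (distance 0) can be skipped.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(desc, desc, k=3)

    pairs = []
    for m in matches:
        if len(m) < 3:
            continue
        _, second, third = m  # first hit is the keypoint itself
        # Lowe-style ratio test between the two real nearest neighbours.
        if second.distance < ratio * third.distance:
            p1 = np.array(kps[second.queryIdx].pt)
            p2 = np.array(kps[second.trainIdx].pt)
            # Very close pairs are usually natural texture, not a pasted copy.
            if np.linalg.norm(p1 - p2) > min_dist:
                pairs.append((p1, p2))

    # Moment-based verification (assumption): compare log-scaled Hu moments
    # of the patches around both keypoints to discard SIFT matches on
    # naturally similar but untampered regions.
    verified = []
    for p1, p2 in pairs:
        a, b = _patch(img, p1, patch), _patch(img, p2, patch)
        if a is None or b is None:
            continue
        ha = cv2.HuMoments(cv2.moments(a)).flatten()
        hb = cv2.HuMoments(cv2.moments(b)).flatten()
        ha = -np.sign(ha) * np.log10(np.abs(ha) + 1e-12)
        hb = -np.sign(hb) * np.log10(np.abs(hb) + 1e-12)
        if np.linalg.norm(ha - hb) < moment_tol:
            verified.append((tuple(p1), tuple(p2)))
    return verified
```

Usage: `detect_copy_move("suspect.jpg")` returns the matched point pairs that survive the moment check; a full detector would then cluster these pairs and estimate the duplicated region, which is roughly the role the paper's sub-block synthesis and progressive hybrid features play.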