Authors: 李鸿光 (LI Hongguang); 田妮莉 (TIAN Nili)[1]; 潘晴 (PAN Qing)[1] (1. School of Information Engineering, Guangdong University of Technology, Guangzhou 510006, China)
Source: Laser Journal (《激光杂志》), 2024, No. 8, pp. 103-109 (7 pages)
Fund: National Natural Science Foundation of China (No. 61901123).
Abstract: Photo-Response Non-Uniformity (PRNU) noise can serve as a camera fingerprint and be used for source camera identification of digital images because of its uniqueness and stability. To improve the accuracy and efficiency of source camera identification, this paper proposes a PRNU noise extraction algorithm based on the U-shaped Transformer deep network (Uformer). The network uses a Transformer block based on the Locally-enhanced Window (LeWin), which can effectively extract local context information at low computational complexity. In addition, the network uses a multi-scale restoration modulator in the form of multi-scale spatial bias, which adaptively adjusts the multi-layer features of the Uformer decoder and thus better extracts the latent PRNU camera fingerprint in an image. Experimental results on the Dresden dataset show that the AUC values of the proposed algorithm at 128×128, 256×256, and 512×512 pixels are 0.8368, 0.9250, and 0.9720, respectively, and the Kappa values are 0.9005, 0.7447, and 0.4737, respectively, all better than existing methods.
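The abstract builds on the classical PRNU workflow: a denoiser produces a noise residual for each image, residuals are aggregated into a camera fingerprint, and a query image is attributed by correlating its residual with that fingerprint. The sketch below illustrates only this generic pipeline, not the paper's method: a Gaussian filter stands in for the Uformer-based extractor, and all function names (noise_residual, estimate_fingerprint, ncc) are hypothetical.

```python
# Minimal sketch of a generic PRNU source-camera-identification pipeline.
# Assumption: grayscale float images of identical size; the Gaussian filter
# is a placeholder denoiser, not the Uformer network proposed in the paper.
import numpy as np
from scipy.ndimage import gaussian_filter


def noise_residual(image: np.ndarray) -> np.ndarray:
    """W = I - F(I): subtract a denoised estimate to expose the PRNU pattern."""
    image = image.astype(np.float64)
    return image - gaussian_filter(image, sigma=1.0)


def estimate_fingerprint(images: list[np.ndarray]) -> np.ndarray:
    """Aggregate residuals into a fingerprint: K ~ sum(W_i * I_i) / sum(I_i^2)."""
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img in images:
        img = img.astype(np.float64)
        num += noise_residual(img) * img
        den += img ** 2
    return num / (den + 1e-8)


def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation used as the matching statistic."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))


# Usage: correlate the residual of a query image with the fingerprint modulated
# by the query content; a high score attributes the image to that camera.
# score = ncc(noise_residual(query), fingerprint * query)
```

In this setup, metrics such as the AUC values reported in the abstract would be computed over the distribution of these correlation scores for matching versus non-matching camera/image pairs.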
Keywords: photo-response non-uniformity; source camera identification; Transformer; deep learning; image processing
Classification: TN911 [Electronics and Telecommunications: Communication and Information Systems]