Author: Huang Yibo, Zhang Qiuyu [1,2], Yuan Zhanting [1], Yang Zhongping [2]
Affiliation: [1] College of Electrical and Information Engineering, Lanzhou University of Technology, Lanzhou 730050, Gansu, China; [2] College of Computer and Communication, Lanzhou University of Technology, Lanzhou 730050, Gansu, China
Source: Journal of Huazhong University of Science and Technology (Natural Science Edition), 2015, No. 2, pp. 124-128 (5 pages)
Fund: National Natural Science Foundation of China (61363078); Natural Science Foundation of Gansu Province (1212RJZA006, 1310RJYA004)
Abstract: To improve the robustness of speech perceptual hashing and its ability to locate small-scale tampering, a speech perceptual hashing algorithm based on the human auditory model is proposed. First, guided by human auditory characteristics, the algorithm controls the number of filters used in the MFCC (Mel-frequency cepstral coefficients) computation for each frame, yielding per-frame Mel-frequency cepstral parameters. Second, it fuses the AMFCC (adaptive Mel-frequency cepstral coefficients) parameters with the LPCC (linear prediction cepstral coefficients) of the speech, partitions the resulting feature matrix into blocks, and applies 2DNMF (two-dimensional non-negative matrix factorization) to each block, reducing the complexity of the feature matrix. Finally, the coefficient matrices produced by the decomposition are quantized into a perceptual hash string, and speech authentication is performed by hash matching. The results show that the algorithm effectively improves the robustness of hash-based authentication and can locate small-scale tampering in speech.
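The abstract outlines a pipeline of feature fusion, block-wise matrix factorization, binarization, and hash matching. Below is a minimal numpy sketch of that pipeline under explicit assumptions: a synthetic random matrix stands in for the fused AMFCC/LPCC feature matrix, standard NMF with multiplicative updates stands in for the paper's 2DNMF, and binarization against the median plus normalized Hamming distance (bit error rate) stands in for the paper's hash construction and matching rule, which the abstract does not fully specify. All function names are illustrative, not the authors' code.

```python
import numpy as np

def nmf(V, r, n_iter=200, eps=1e-9):
    """Standard NMF via multiplicative updates (a stand-in for the paper's 2DNMF)."""
    rng = np.random.default_rng(0)          # fixed seed so the hash is reproducible
    n, m = V.shape
    W = rng.random((n, r)) + eps
    H = rng.random((r, m)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def perceptual_hash(features, block_size=8, rank=2):
    """Block-wise factorization and binarization of a feature matrix.

    `features` is a frames x coefficients matrix (here: a stand-in for the
    fused AMFCC/LPCC features from the abstract). Each block is factorized,
    and the coefficient matrix H is binarized against its median to give
    that block's hash bits.
    """
    features = features - features.min() + 1e-6   # NMF needs non-negative input
    bits = []
    for i in range(0, features.shape[0] - block_size + 1, block_size):
        _, H = nmf(features[i:i + block_size], rank)
        bits.append((H > np.median(H)).ravel().astype(np.uint8))
    return np.concatenate(bits)

def bit_error_rate(h1, h2):
    """Hash matching: normalized Hamming distance between two hash strings."""
    return np.mean(h1 != h2)

# Toy usage: 64 frames x 12 coefficients of synthetic "speech features".
rng = np.random.default_rng(1)
feat = rng.random((64, 12))
h_ref = perceptual_hash(feat)

tampered = feat.copy()
tampered[16:24] += 0.8                      # local tampering in frames 16-23
h_tam = perceptual_hash(tampered)

print("BER vs. itself:   ", bit_error_rate(h_ref, perceptual_hash(feat)))
print("BER vs. tampered: ", bit_error_rate(h_ref, h_tam))
```

Because each tampered frame only perturbs the bits of its own block, comparing per-block bit error rates indicates which block changed, which mirrors the small-scale tamper localization claimed in the abstract.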
Keywords: speech recognition; information security technology; speech perceptual hashing; adaptive cepstral coefficients; tamper localization
CLC Number: TP309.2 [Automation and Computer Technology - Computer System Architecture]