Authors: NIU Xiaoke (牛晓可)[1,2]; HUANG Yixin (黄伊鑫); XU Huaxing (徐华兴); JIANG Zhenyang (蒋震阳)
Affiliations: [1] School of Electrical Engineering, Zhengzhou University, Zhengzhou, Henan 450001, China; [2] Henan Key Laboratory of Brain Science and Brain-Computer Interface Technology (Zhengzhou University), Zhengzhou, Henan 450001, China
Source: Journal of Computer Applications (《计算机应用》), 2020, No. 10, pp. 3034-3040 (7 pages)
Funding: National Natural Science Foundation of China (11804309)
Abstract: To address the susceptibility of speaker recognition to environmental noise, a new voiceprint feature extraction method is proposed that draws on the spatio-temporal filtering mechanism of the Spectro-Temporal Receptive Field (STRF) of neurons in the biological auditory cortex. In this method, secondary features are extracted from the STRF-based auditory scale-rate map and combined with traditional Mel-Frequency Cepstral Coefficients (MFCC) to obtain voiceprint features with strong tolerance to environmental noise. With a Support Vector Machine (SVM) as the classifier, tests on speech data at different Signal-to-Noise Ratios (SNR) show that the STRF-based features are generally more robust to noise than MFCC but yield lower recognition accuracy, while the combined features improve recognition accuracy and retain good robustness to environmental noise. These results indicate that the proposed method is effective for speaker recognition in strongly noisy environments.
Keywords: auditory cortex; spectro-temporal receptive field; Mel-frequency cepstral coefficient; speaker recognition in noise; support vector machine
Classification: TP391.4 [Automation and Computer Technology - Computer Application Technology]
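To illustrate the pipeline summarized in the abstract, the following is a minimal Python sketch of the combined-feature approach: MFCC statistics are extracted with librosa, concatenated with a stand-in for the STRF-derived scale-rate features, and an RBF-kernel SVM is trained on the combined vectors. The paper's actual auditory-cortex model is not reproduced here; the 2-D modulation-spectrum summary and the helper names scale_rate_features, combined_features, and train_speaker_svm are illustrative assumptions, not the authors' implementation.

# Minimal sketch of the combined-feature speaker-recognition pipeline described above.
# The STRF scale-rate computation is replaced by a crude 2-D modulation-spectrum
# summary (a hypothetical stand-in, NOT the auditory model used in the paper).
import numpy as np
import librosa
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def mfcc_features(y, sr, n_mfcc=20):
    """Utterance-level MFCC vector: mean and std of the frame-wise coefficients."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])


def scale_rate_features(y, sr, n_mels=64, n_scale=8, n_rate=8):
    """Placeholder for the STRF-based scale-rate features: a 2-D FFT of the
    log-mel spectrogram, keeping the magnitudes of the lowest modulation bins
    (scale ~ spectral modulation, rate ~ temporal modulation)."""
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)
    log_mel = librosa.power_to_db(mel)
    mod = np.abs(np.fft.fft2(log_mel))      # 2-D modulation spectrum
    return mod[:n_scale, :n_rate].ravel()   # low-order scale-rate summary


def combined_features(y, sr):
    """Concatenate MFCC statistics with the (placeholder) scale-rate vector."""
    return np.concatenate([mfcc_features(y, sr), scale_rate_features(y, sr)])


def train_speaker_svm(wav_paths, speaker_labels, sr=16000):
    """Fit an RBF-kernel SVM on combined features of labelled utterances."""
    X = []
    for path in wav_paths:
        y, _ = librosa.load(path, sr=sr)
        X.append(combined_features(y, sr))
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
    model.fit(np.array(X), speaker_labels)
    return model

Given a trained model, a noisy test utterance would be scored by loading it at the same sampling rate, computing combined_features, and calling model.predict on the resulting vector; evaluating this at several SNR levels mirrors the noise-robustness comparison reported in the abstract.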