Author Affiliation: [1] College of Computer Science and Technology, Guizhou University, Guiyang 550025, China
Source: Science & Technology Review, 2013, Issue 33, pp. 15-18 (4 pages)
Funds: Science and Technology Foundation of Guizhou Province (黔科合J字[2012]2132); Guiyang Science and Technology Plan Project (筑科合同[2011101]1-2号); International Science and Technology Cooperation Program of Guizhou Province ([2009]700109; [2009]700125)
Abstract: Under the same spatial and temporal conditions, quantum computing is superior to classical computing. Because a speaker's speech signal is time-varying and random, its feature parameters are high-dimensional and vary considerably between adjacent frames. Starting from quantum information processing theory, this paper treats a frame of the speech signal as a quantum state and, building on the traditional neural network, uses quantum logic circuits to construct a neural network that achieves effective clustering of the speaker's speech signal; on this basis a speaker recognition model and method based on the quantum logic circuit neural network are proposed. The model's large number of inherent global attractors can be exploited to effectively reduce the time and complexity of speech signal processing. Simulations on a classical computer and a comparison with a BP neural network speaker recognition model show that the method accelerates the convergence of the recognition model and is more robust to parameter changes, and that the system recognition rate of the proposed method is on average 3.34% higher than that of the BP neural network method.
CLC Number: TN912.34 [Electronics and Telecommunications / Communication and Information Systems]
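The abstract describes encoding each speech frame as a quantum state and building the network from quantum logic (rotation) gates, but the record does not spell out the construction. Below is a minimal illustrative sketch, assuming a commonly used qubit-neuron-style formulation in which each feature of a frame is phase-encoded as a single-qubit state and the weights act as rotation gates; the function names, the [0, 1] feature scaling, and the use of MFCC-like features are assumptions for illustration, not the authors' exact model.

```python
import numpy as np

# Minimal sketch (not the paper's exact construction): a qubit-neuron-style
# forward pass. Each feature x_i of a speech frame is phase-encoded as a
# single-qubit state |x_i> = cos(theta_i)|0> + sin(theta_i)|1>, represented
# here by the complex number cos(theta_i) + i*sin(theta_i). Weights act as
# phase rotations, and the neuron output is the squared |1>-amplitude of the
# aggregated, bias-rotated state. Feature values are assumed scaled to [0, 1].

def phase_encode(frame):
    """Map a feature vector in [0, 1] to qubit phases in [0, pi/2]."""
    return (np.pi / 2.0) * np.clip(frame, 0.0, 1.0)

def qubit_neuron(frame, weight_phases, bias_phase):
    """Forward pass of one qubit-style neuron.

    Each input phase is rotated by its weight phase, the rotated qubits
    (as complex numbers) are summed, a bias rotation is applied to the
    phase of the sum, and the output is sin^2 of the resulting phase,
    i.e. the probability of measuring |1>.
    """
    theta = phase_encode(frame) + weight_phases      # weight rotation gates
    aggregate = np.sum(np.exp(1j * theta))           # sum of rotated qubits
    phase = np.angle(aggregate) + bias_phase         # bias rotation
    return np.sin(phase) ** 2                        # P(|1>) as neuron output

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.random(12)                              # e.g. 12 MFCC-like features in [0, 1]
    weights = rng.uniform(0, np.pi / 2, size=12)        # illustrative weight phases
    print("neuron output:", qubit_neuron(frame, weights, bias_phase=0.1))
```

In this kind of formulation, training adjusts the weight and bias phases (for example by gradient descent on the output probabilities), and a layer of such neurons can be used to cluster frames by speaker; the specific network topology, training rule, and the attractor analysis reported in the paper are not reproduced here.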