Chinese-Vietnamese Speech Translation with Deep Pre-encode Convolutional Neural Network (Cited by: 7)


Authors: WANG Jian (王剑)[1,2]; XU Shu-li (许树理)[1,2]; YU Zheng-tao (余正涛); WANG Zhen-han (王振晗)[1,2]; LIANG Ren-feng (梁仁凤) (Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China; Yunnan Key Laboratory of Artificial Intelligence, Kunming University of Science and Technology, Kunming 650500, China)

Affiliations: [1] Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China; [2] Yunnan Key Laboratory of Artificial Intelligence, Kunming University of Science and Technology, Kunming 650500, China

Source: Journal of Chinese Computer Systems (《小型微型计算机系统》), 2021, Issue 4, pp. 736-739 (4 pages)

Funding: Supported by the National Key Research and Development Program of China (2018YFC0830105, 2018YFC0830101, 2018YFC0830100).

Abstract: Speech translation converts speech in a source language into text in a target language. When the traditional sequence-to-sequence model is applied to speech translation, it is sensitive to sequence length, which puts heavy pressure on feature extraction and local-dependency modeling at the encoding stage. To address this problem, this paper builds a speech translation model on the Transformer network and uses a deep convolutional network to pre-encode the audio spectrum features. By downsampling the audio sequence, local dependencies of the time-frequency information in the spectrogram are modeled and deep features are extracted, which relieves the modeling pressure on the encoder and enables Chinese-Vietnamese speech-to-text translation in both directions. Experimental results show that the proposed method performs well, achieving roughly a 19% improvement over the baseline systems.
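The sketch below is one plausible reading of the pipeline described in the abstract: a stack of strided 2-D convolutions shortens the spectrogram sequence and models local time-frequency structure before a standard Transformer encoder is applied. All layer sizes, strides, and class names (ConvPreEncoder, SpeechTranslationEncoder) are illustrative assumptions rather than the paper's reported configuration, and positional encoding is omitted for brevity.

```python
# Minimal sketch (PyTorch) of a convolutional pre-encoder feeding a Transformer
# encoder. Hyperparameters here are assumptions, not the paper's exact settings.
import torch
import torch.nn as nn


class ConvPreEncoder(nn.Module):
    """Hypothetical deep convolutional pre-encoder: captures local time-frequency
    dependencies and downsamples the spectrogram sequence by 4x in time."""

    def __init__(self, n_mels: int = 80, d_model: int = 256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 64, kernel_size=3, stride=2, padding=1),   # halves time and freq
            nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, stride=2, padding=1),  # halves again (4x total)
            nn.ReLU(),
        )
        # Flatten the (channels x reduced-frequency) dims into the model dimension.
        self.proj = nn.Linear(64 * (n_mels // 4), d_model)

    def forward(self, spec: torch.Tensor) -> torch.Tensor:
        # spec: (batch, time, n_mels) log-Mel spectrogram features
        x = self.conv(spec.unsqueeze(1))      # (batch, 64, time/4, n_mels/4)
        x = x.permute(0, 2, 1, 3).flatten(2)  # (batch, time/4, 64 * n_mels/4)
        return self.proj(x)                   # (batch, time/4, d_model)


class SpeechTranslationEncoder(nn.Module):
    """Pre-encoder followed by a Transformer encoder over the shortened sequence.
    Positional encoding is omitted for brevity."""

    def __init__(self, n_mels: int = 80, d_model: int = 256, n_layers: int = 6):
        super().__init__()
        self.pre_encoder = ConvPreEncoder(n_mels, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, spec: torch.Tensor) -> torch.Tensor:
        return self.encoder(self.pre_encoder(spec))


if __name__ == "__main__":
    model = SpeechTranslationEncoder()
    spec = torch.randn(2, 400, 80)   # 2 utterances, 400 frames, 80 Mel bins
    out = model(spec)
    print(out.shape)                 # torch.Size([2, 100, 256]): 4x shorter in time
```

The downsampling illustrates the claimed benefit: the Transformer encoder's self-attention then operates over a sequence one quarter the original length, while the convolutions have already summarized local time-frequency patterns.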

Keywords: speech translation; speech recognition; machine translation

Classification: TP391 [Automation and Computer Technology: Computer Application Technology]

 
