Authors: SHI Wenhua, ZHANG Xiongwei, ZOU Xia, SUN Meng, LI Li, REN Zhengbing
Affiliations: [1] Army Engineering University, Nanjing 210007; [2] Beijing Aeronautical Technology Research Center, Nanjing 210028; [3] First Military Representation Office of Air Force Equipment Department, Changsha 410114
Source: Chinese Journal of Acoustics (Shengxue Xuebao, English edition), 2021, Issue 1, pp. 141-154 (14 pages)
Funding: Supported by the National Natural Science Foundation of China (61471394, 62071484) and the Natural Science Foundation of Jiangsu Province for Excellent Young Scholars (BK20180080).
Abstract: A time-frequency mask estimation method using a deep encoder-decoder neural network for speech enhancement is presented. The mask is learned implicitly by the deep encoder-decoder network and combined with the time-frequency representation of the noisy speech to learn the nonlinear mapping between the noisy speech and the target speech. The network employs a convolution-deconvolution structure. The convolutional encoder exploits the local perception characteristic of convolutional networks to model the typical structural features of noisy speech in the time-frequency domain; speech features are extracted while the influence of background noise is suppressed. At the decoder end, the speech signal is reconstructed from the features extracted by the encoder, and the local details are recovered layer by layer. Meanwhile, skip connections are introduced between homologous layers to circumvent the loss of low-level details caused by pooling and down-sampling operations. Experiments conducted on the TIMIT dataset demonstrate that the proposed method can effectively suppress noise and recover the detailed information of speech.
Classification: TN912.35 [Electronics and Telecommunications: Communication and Information Systems]; TP183 [Electronics and Telecommunications: Information and Communication Engineering]
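
The abstract describes a convolutional encoder-decoder that implicitly estimates a time-frequency mask, applies it to the noisy spectrogram, and uses skip connections between homologous encoder and decoder layers to preserve low-level detail. Below is a minimal PyTorch sketch of that kind of architecture; the layer counts, channel widths, kernel sizes, and the sigmoid mask activation are illustrative assumptions, not the authors' exact configuration.

# Minimal sketch of a mask-estimating convolutional encoder-decoder with a
# skip connection, following the structure described in the abstract.
# All hyperparameters here are assumptions for illustration.
import torch
import torch.nn as nn

class MaskEncoderDecoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: strided convolutions model local time-frequency structure.
        self.enc1 = nn.Sequential(nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU())
        # Decoder: transposed convolutions recover local detail layer by layer.
        self.dec1 = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1, output_padding=1), nn.ReLU())
        self.dec2 = nn.ConvTranspose2d(32, 1, 3, stride=2, padding=1, output_padding=1)

    def forward(self, noisy_spec):  # noisy_spec: (batch, 1, freq, time)
        e1 = self.enc1(noisy_spec)
        e2 = self.enc2(e1)
        d1 = self.dec1(e2)
        # Skip connection between homologous layers preserves low-level detail
        # otherwise lost to down-sampling.
        d1 = torch.cat([d1, e1], dim=1)
        mask = torch.sigmoid(self.dec2(d1))  # implicit time-frequency mask in [0, 1]
        return mask * noisy_spec             # enhanced time-frequency representation

# Usage sketch: with these strides, freq and time dimensions should be divisible by 4.
x = torch.randn(2, 1, 128, 128)
enhanced = MaskEncoderDecoder()(x)
print(enhanced.shape)  # torch.Size([2, 1, 128, 128])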