Neural Network Speech Watermarking Method Based on Short-Term Energy and Least Relative Mean Square Error Criterion

Cited by: 1

Authors: 郝欢 [1], 陈亮 [1], 张翼鹏 [2]

Affiliations: [1] Institute of Communications Engineering, PLA University of Science and Technology, Nanjing 210007, China; [2] Operations Experiment Center, Nanjing Artillery Academy, Nanjing 211132, China

Source: Journal of Data Acquisition and Processing (《数据采集与处理》), 2014, No. 2, pp. 254-258 (5 pages)

Funding: National Natural Science Foundation of China (61072042)

Abstract: To address the limitations of neural network speech watermarking based on the traditional least mean square error (LMS) and recursive least squares (RLS) criteria, a neural network speech watermarking algorithm based on short-term energy and the least relative mean square error (LRMS) criterion is proposed. First, a synchronization sequence is embedded in the first speech frame. Next, the short-term energy of each frame is computed, and a discrete wavelet transform (DWT) is applied to the frames whose energy exceeds a preset threshold. Finally, the watermark is embedded and extracted by a neural network built on the LRMS criterion. Choosing a reasonable short-term energy threshold balances watermarking capacity against robustness, and the Levenberg-Marquardt (LM) algorithm makes the network converge quickly. Theoretical analysis and experimental results show that, compared with reference [8], the proposed scheme converges faster and is more robust against attacks such as additive noise, low-pass filtering, resampling, and re-quantization, with an average performance improvement of 5%.

Keywords: short-term energy; least relative mean square error; wavelet transform; Levenberg-Marquardt algorithm

CLC number: TN392 (Electronics and Telecommunications / Physical Electronics)
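The abstract describes the pipeline only at a high level: short-term energy gating of speech frames, a wavelet transform of the selected frames, and a network trained under the LRMS criterion with the Levenberg-Marquardt algorithm. Below is a minimal Python sketch of the frame-selection step and of one plausible formulation of an LRMS-style loss; the frame length, energy threshold, wavelet choice (db4), and the exact LRMS definition are illustrative assumptions, not the authors' settings.

```python
import numpy as np
import pywt  # PyWavelets, assumed available for the DWT step


def short_term_energy(frame):
    """Short-term energy of one speech frame: sum of squared samples."""
    return np.sum(frame.astype(np.float64) ** 2)


def select_frames(speech, frame_len, energy_threshold):
    """Split the signal into fixed-length frames and flag those whose
    short-term energy exceeds the threshold; only flagged frames would
    carry watermark bits."""
    n_frames = len(speech) // frame_len
    frames = speech[: n_frames * frame_len].reshape(n_frames, frame_len)
    energies = np.array([short_term_energy(f) for f in frames])
    return frames, energies > energy_threshold


def lrms_loss(target, output, eps=1e-8):
    """One plausible least-relative-mean-square-error loss: squared error
    normalized by the target magnitude, averaged over samples (assumed
    formulation, not taken from the paper)."""
    return np.mean(((target - output) / (np.abs(target) + eps)) ** 2)


if __name__ == "__main__":
    # Toy example: gate the high-energy frames of a dummy signal and take
    # their single-level DWT coefficients as the embedding carrier.
    rng = np.random.default_rng(0)
    speech = rng.standard_normal(16000)          # stand-in for a speech signal
    frames, mask = select_frames(speech, frame_len=320, energy_threshold=300.0)
    for frame in frames[mask]:
        cA, cD = pywt.dwt(frame, "db4")          # approximation / detail coefficients
        # cA would be fed to the LRMS-trained network for embedding/extraction
```

Gating on short-term energy restricts embedding to high-energy frames, which is what lets the threshold trade watermark capacity against robustness, as the abstract notes.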

 
