Authors: ZHANG Shao-kang [1]; WANG Chao [1]; TIAN De-yan; ZHANG Xiao-chuan (Navy Submarine Academy, Qingdao 266000, China; National Laboratory for Marine Science and Technology, Qingdao 266000, China)
Affiliations: [1] Navy Submarine Academy, Qingdao 266000, Shandong, China; [2] National Laboratory for Marine Science and Technology, Qingdao 266000, Shandong, China
Source: Ship Science and Technology (《舰船科学技术》), 2019, No. 23, pp. 181-185 (5 pages)
Abstract: Future underwater acoustic target detection systems based on unmanned underwater platforms require the platform itself to perform intelligent target recognition. Traditional underwater target noise recognition methods, however, rely on manually extracted feature data with strong generalization ability, and the recognition process involves substantial human-computer interaction, so they cannot meet this requirement. To address this problem, this paper studies an intelligent underwater target noise recognition method based on the long short-term memory (LSTM) network. Exploiting the ability of deep learning to learn data features autonomously, the LSTM network is applied to perform deep feature extraction and recognition on three kinds of input: time-domain time-series data, spectrum data, and Mel-frequency cepstral coefficient (MFCC) data of underwater target noise. The method is verified on real underwater acoustic target noise signals. The results show that, for all three kinds of input data, the LSTM model can effectively achieve underwater target noise feature extraction and intelligent recognition.
Keywords: deep learning; long short-term memory network; underwater target radiated noise; feature extraction; intelligent recognition
Classification code: TP391.4 [Automation and Computer Technology - Computer Application Technology]
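The abstract describes feeding sequences of time-domain samples, spectrum frames, or MFCC frames into an LSTM network that extracts deep temporal features and outputs a class decision. The paper's code is not published; the sketch below is a minimal, hypothetical illustration of that kind of classifier, where the framework (PyTorch), layer sizes, feature dimension (20 MFCC coefficients per frame), and the two-class setup are all assumptions rather than the authors' settings.

```python
# Minimal sketch (not the authors' implementation): an LSTM classifier over
# sequences of noise feature frames, e.g. MFCC frames. All dimensions are
# assumed for illustration only.
import torch
import torch.nn as nn

class NoiseLSTMClassifier(nn.Module):
    def __init__(self, input_dim=20, hidden_dim=128, num_layers=2, num_classes=2):
        super().__init__()
        # The LSTM consumes a sequence of feature frames (time-domain samples,
        # spectrum frames, or MFCC frames) and learns deep temporal features.
        self.lstm = nn.LSTM(input_dim, hidden_dim, num_layers, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):              # x: (batch, time_steps, input_dim)
        _, (h_n, _) = self.lstm(x)     # h_n: (num_layers, batch, hidden_dim)
        return self.fc(h_n[-1])        # class logits from the final hidden state

# Usage sketch: a batch of 8 noise segments, each 100 feature frames long.
model = NoiseLSTMClassifier()
logits = model(torch.randn(8, 100, 20))  # -> shape (8, 2)
```

The same network structure could be reused for the three input types in the abstract by changing only `input_dim` to match the per-frame feature size of each representation.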