Authors: 尹爱军[1], 吕明阳, 杨敏英[2], 陈小敏 (YIN Aijun; LÜ Mingyang; YANG Minying; CHEN Xiaomin)
Affiliations: [1] College of Mechanical and Vehicle Engineering, Chongqing University, Chongqing 400044, China; [2] Xi’an Satellite Control Center, Xi’an 710000, China
Source: Journal of Vibration and Shock (《振动与冲击》), 2024, Issue 14, pp. 301-307 (7 pages)
Funding: National Natural Science Foundation of China (52275518).
Abstract: How to improve feature extraction capability and extract the spatial information of features is the key to high-precision fault diagnosis. RepVGG and other deep convolutional neural networks ignore the spatial information of features, while CapsNet's feature extraction capability is limited by its shallow architecture. To solve these problems, a bearing fault diagnosis method fusing RepVGG and CapsNet was proposed. First, vibration signals at different measuring points were acquired, converted into two-dimensional feature maps by the Gramian angular difference field, and concatenated along the channel direction. Then, the RepVGG network was used as the front-end convolutional layers to extract and fuse features of the multidimensional vibration signals. Finally, CapsNet extracted the spatial information of the features to accomplish bearing fault diagnosis. Experimental results show that the fused RepVGG-CapsNet method has excellent fault recognition performance and noise resistance.
Classification codes: TH133.3 (Mechanical Engineering: Machinery Manufacturing and Automation); TP18 (Automation and Computer Technology: Control Theory and Control Engineering)
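The abstract describes the data-preparation step as a Gramian angular difference field (GADF) transform of each measuring point's vibration signal, followed by channel-wise concatenation of the resulting 2-D maps. Below is a minimal NumPy sketch of that step, assuming the standard GADF definition sin(φ_i − φ_j) with φ = arccos of the rescaled signal; the window length (64 samples), the number of measuring points (3), and the random stand-in signals are illustrative placeholders, not values from the paper.

```python
import numpy as np

def gadf(signal, eps=1e-8):
    """Gramian angular difference field of a 1-D signal.

    The signal is rescaled to [-1, 1]; with phi = arccos(x),
    GADF[i, j] = sin(phi_i - phi_j)
               = sqrt(1 - x_i^2) * x_j - x_i * sqrt(1 - x_j^2).
    """
    x = np.asarray(signal, dtype=float)
    x = 2.0 * (x - x.min()) / (x.max() - x.min() + eps) - 1.0  # rescale to [-1, 1]
    x = np.clip(x, -1.0, 1.0)
    y = np.sqrt(1.0 - x ** 2)                                  # sin(arccos(x))
    return np.outer(y, x) - np.outer(x, y)                     # sin(phi_i - phi_j)

# Hypothetical example: 3 measuring points, 64 samples per window
# (placeholder sizes, not the paper's actual configuration).
rng = np.random.default_rng(0)
signals = [rng.standard_normal(64) for _ in range(3)]          # stand-in for real vibration windows
feature_map = np.stack([gadf(s) for s in signals], axis=0)     # shape (3, 64, 64), channels first
print(feature_map.shape)                                       # (3, 64, 64)
```

The resulting channels-first tensor is the kind of multi-channel image the abstract feeds to the RepVGG front end and then to the capsule network; the actual window length, number of measuring points, and network configuration are specified in the paper itself.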