Authors: ZHENG Yujie; SHEN Xingquan [1,2]; ZHOU Jinjie; SHUANG Li [1]; YANG Qijun (School of Mechanical Engineering, North University of China, Taiyuan Shanxi 030051, China; Shanxi Province Deep Hole Processing Engineering Technology Research Center, Taiyuan Shanxi 030051, China)
Affiliations: [1] School of Mechanical Engineering, North University of China, Taiyuan, Shanxi 030051, China; [2] Shanxi Province Deep Hole Processing Engineering Technology Research Center, Taiyuan, Shanxi 030051, China
Source: Machine Tool & Hydraulics, 2024, No. 22, pp. 218-226 (9 pages)
Funding: National Natural Science Foundation of China, General Program (52075503).
Abstract: To address the weak adaptive feature-extraction ability and low diagnosis accuracy of ordinary intelligent fault diagnosis methods for rolling bearings, a fault diagnosis method based on a mixed attention mechanism model is proposed. The raw 1D vibration signal is converted into a 2D feature image by continuous wavelet transform and fed into a convolutional block attention module to adaptively extract fault features. The extracted feature images are then input into the TDSC model, which quantizes the model parameters, reduces the memory occupied by each parameter, and compresses the trained complex model, improving both inference speed and training accuracy. Finally, two different public bearing datasets are used for experimental verification. The results show that the highest fault diagnosis accuracy on the two datasets reaches 99.99% and 99.70%, respectively, demonstrating the feasibility and superiority of the bearing fault diagnosis method based on the mixed attention mechanism.
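The first step of the pipeline described in the abstract, turning a 1D vibration signal into a 2D time-frequency image via continuous wavelet transform, can be sketched as follows. This is a minimal illustration using a hand-rolled Morlet CWT in NumPy; the paper does not specify its wavelet or implementation, so the `w0=6.0` center frequency, the scale range, and the synthetic test signal here are all assumptions for demonstration only.

```python
import numpy as np

def morlet_cwt(signal, scales, w0=6.0):
    """Minimal continuous wavelet transform with a Morlet wavelet.

    Returns a (len(scales), len(signal)) magnitude image: one row per scale,
    one column per time sample -- the kind of 2D feature image a CNN consumes.
    """
    out = np.empty((len(scales), len(signal)))
    for i, s in enumerate(scales):
        # Discretize the Morlet wavelet on a +/- 4-sigma support at this scale.
        t = np.arange(-4 * s, 4 * s + 1)
        psi = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2)
        psi /= np.sqrt(s)  # L2-ish normalization across scales
        # Correlate the signal with the wavelet and keep the magnitude.
        out[i] = np.abs(np.convolve(signal, np.conj(psi), mode="same"))
    return out

# Synthetic "vibration" signal with two frequency components (illustrative).
fs = 1000
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

scales = np.arange(1, 65)   # 64 scales -> a 64 x 1000 time-frequency image
img = morlet_cwt(x, scales)
print(img.shape)            # (64, 1000)
```

In the paper's setting, `img` (suitably resized and normalized) would be the 2D input handed to the attention-based CNN; libraries such as PyWavelets provide production-quality CWT implementations.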
Keywords: deep learning; mixed attention mechanism; convolutional neural network; rolling bearing; fault diagnosis
Classification: TH133.3 [Mechanical Engineering - Machinery Manufacturing and Automation]