Authors: ZHU Yonghong, DAI Chenyu, LI Manhua (School of Mechanical and Electronic Engineering, Jingdezhen Ceramic University, Jingdezhen 333403, Jiangxi, China)
Affiliation: [1] School of Mechanical and Electronic Engineering, Jingdezhen Ceramic University, Jingdezhen 333403, Jiangxi, China
Source: Journal of Ceramics (陶瓷学报), 2024, No. 1, pp. 180-190 (11 pages)
Funding: National Natural Science Foundation of China (62063010); Major Science and Technology R&D Special Project of Jiangxi Province (20214ABC28W003).
Abstract: At present, the firing temperature of ceramic roller kilns is detected mainly with thermocouples. Because thermocouples age easily, their detection accuracy gradually degrades, which in turn affects the firing quality of ceramic products. To address this problem, an intelligent roller-kiln temperature detection method is proposed that fuses deep-learning-based flame image feature recognition with thermocouple point detection data instead of relying on thermocouples alone. The method applies a multi-scale feature extraction network based on a shifted-window visual self-attention mechanism to flame images of the ceramic roller kiln; the local features of a convolutional neural network (CNN) branch and the long-range features of a Transformer branch are used to retain more image information and obtain more accurate flame image features, which are then fused with the thermocouple point detection data so that the kiln temperature can be detected accurately. In the multi-scale feature extraction network, an autoencoder built from multiple Transformer layers first extracts the shallow and multi-scale deep features; these features are then fused into the Transformer and CNN branches so that the network has sufficient capacity to capture feature information; finally, the data obtained from thermocouple point detection are fed into the network, and feature-level information fusion combines the flame image features with the key-point temperature measurements. Experimental results show that, compared with a CNN-based fusion method, the proposed fusion network improves the average feature recognition accuracy by 1.75% and reduces the average error by 2.67%, and it outperforms the single CNN-branch or Transformer-branch image fusion on most indicators. The proposed method is therefore effective and feasible.
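The dual-branch, feature-level fusion idea described in the abstract can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the layer sizes, the number of thermocouple inputs, and the use of a plain (non-shifted) window-attention module are illustrative assumptions; the sketch only shows how local CNN features, windowed self-attention features, and thermocouple point readings could be concatenated at feature level to regress a kiln temperature.

    # Minimal sketch under assumed dimensions; not the paper's released code.
    import torch
    import torch.nn as nn

    class WindowAttentionBranch(nn.Module):
        """Transformer branch: partitions the feature map into fixed windows and applies
        multi-head self-attention inside each window (a simplified, non-shifted stand-in
        for the shifted-window mechanism mentioned in the abstract). Assumes H and W are
        divisible by the window size."""
        def __init__(self, dim=64, window=8, heads=4):
            super().__init__()
            self.window = window
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
            self.norm = nn.LayerNorm(dim)

        def forward(self, x):                       # x: (B, C, H, W)
            b, c, h, w = x.shape
            ws = self.window
            # partition the map into (ws*ws)-token windows
            x = x.view(b, c, h // ws, ws, w // ws, ws)
            x = x.permute(0, 2, 4, 3, 5, 1).reshape(-1, ws * ws, c)
            x = self.norm(x)
            x, _ = self.attn(x, x, x)
            # merge windows back and global-pool to one feature vector per image
            x = x.reshape(b, (h // ws) * (w // ws), ws * ws, c).mean(dim=(1, 2))
            return x                                # (B, C)

    class FusionTemperatureNet(nn.Module):
        """CNN branch (local features) + window-attention branch (long-range features),
        fused with thermocouple point readings to predict one firing temperature."""
        def __init__(self, dim=64, n_thermocouples=4):
            super().__init__()
            self.stem = nn.Sequential(              # shared shallow feature extractor
                nn.Conv2d(3, dim, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(dim, dim, 3, stride=2, padding=1), nn.ReLU())
            self.cnn_branch = nn.Sequential(        # local features
                nn.Conv2d(dim, dim, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten())
            self.attn_branch = WindowAttentionBranch(dim)   # long-range features
            self.fuse = nn.Sequential(              # feature-level fusion with point data
                nn.Linear(dim * 2 + n_thermocouples, 128), nn.ReLU(),
                nn.Linear(128, 1))                  # predicted firing temperature

        def forward(self, image, thermocouple_readings):
            feat = self.stem(image)
            local_feat = self.cnn_branch(feat)
            global_feat = self.attn_branch(feat)
            fused = torch.cat([local_feat, global_feat, thermocouple_readings], dim=1)
            return self.fuse(fused)

    # Toy usage: a batch of 2 flame images (3x128x128) with 4 thermocouple readings each.
    model = FusionTemperatureNet()
    pred = model(torch.randn(2, 3, 128, 128), torch.randn(2, 4))
    print(pred.shape)  # torch.Size([2, 1])

Concatenating the two image feature vectors with the raw point readings is one simple way to realize the feature-level fusion the abstract describes; the paper's actual fusion strategy and multi-scale autoencoder are more elaborate than this sketch.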