Authors: XU Yuan; WANG Shaona (Shengjing Hospital of China Medical University, Shenyang 110022, China)
Source: Automation & Instrumentation, 2022, No. 10, pp. 286-291 (6 pages)
Abstract: To address the low detection rate of abnormal respiratory signs, a respiratory abnormality alarm method based on a convolutional neural network model is proposed. First, according to the waveform characteristics of abnormal respiratory signs, a one-dimensional convolutional neural network model is proposed for detecting and identifying the types of respiratory abnormality. Then, to further improve detection accuracy and efficiency, a multi-task learning algorithm is proposed to optimize the one-dimensional convolutional neural network, yielding the MLT model. Finally, experiments are conducted on the models before and after the improvement. The results show that the one-dimensional convolutional neural network model detects respiratory abnormalities with 97% accuracy, and its pooling-layer structure enables the classification of abnormal waveforms, while the proposed MLT model maintains a detection accuracy of about 98%. Compared with the traditional convolutional neural network, the MLT model improves accuracy by 1% and recognition speed by 10 ms ± 0.02 ms. This demonstrates that the ventilator sign abnormality alarm method based on the MLT model is highly feasible and reliable.
Keywords: respiratory signs; MLT model; one-dimensional convolutional neural network; multi-task learning; feature extraction
Classification Code: TP399 [Automation and Computer Technology / Computer Application Technology]
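The abstract describes a one-dimensional CNN whose shared feature extractor is optimized with multi-task learning. The paper's exact architecture is not given in this record, so the following is only a minimal sketch of the general pattern (a shared 1D convolutional trunk with pooling, feeding two task heads trained under a joint loss), assuming PyTorch; the layer sizes, the auxiliary normal/abnormal alarm task, and the loss weighting are illustrative assumptions, not the authors' design.

```python
# Hypothetical multi-task 1D CNN for respiratory waveform windows.
# Layer sizes, task heads, and loss weights are illustrative assumptions.
import torch
import torch.nn as nn

class MultiTaskRespNet(nn.Module):
    def __init__(self, n_classes: int = 4):
        super().__init__()
        # Shared convolutional trunk: extracts features from raw waveforms.
        self.trunk = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),           # pooling layer downsamples the time axis
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis to one vector
            nn.Flatten(),
        )
        # Head 1: classify the type of respiratory abnormality.
        self.cls_head = nn.Linear(32, n_classes)
        # Head 2 (assumed auxiliary task): binary normal/abnormal alarm flag.
        self.alarm_head = nn.Linear(32, 1)

    def forward(self, x):
        h = self.trunk(x)              # x: (batch, 1, samples)
        return self.cls_head(h), self.alarm_head(h)

# One joint training step: the multi-task loss is a weighted sum of both tasks,
# so the shared trunk receives gradients from both objectives.
model = MultiTaskRespNet()
x = torch.randn(8, 1, 512)            # a batch of 8 waveform windows
y_cls = torch.randint(0, 4, (8,))     # abnormality-type labels
y_alarm = torch.rand(8, 1).round()    # binary alarm labels
logits_cls, logits_alarm = model(x)
loss = nn.CrossEntropyLoss()(logits_cls, y_cls) \
     + 0.5 * nn.BCEWithLogitsLoss()(logits_alarm, y_alarm)
loss.backward()
```

Sharing the trunk across tasks is the usual motivation for multi-task learning here: the auxiliary objective regularizes the shared features, which is consistent with the abstract's reported gain in accuracy and recognition speed over the single-task CNN.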