Shale gas entity recognition based on multi-neural network and attention (Cited by: 1)

Authors: ZHU Xiping; LU Xingyu; SU Zuoxin; GAO Ang; XIAO Lijuan; GUO Lu (School of Electrical Engineering and Information, Southwest Petroleum University, Chengdu 610500, China)

Affiliation: [1] School of Electrical Engineering and Information, Southwest Petroleum University, Chengdu 610500, China

Source: China Sciencepaper (《中国科技论文》), 2022, No. 11, pp. 1201-1206 (6 pages)

Funding: Sichuan Regional Innovation Cooperation Project (2022YFQ0102).

Abstract: To fill the gap left by the absence of intelligent analysis systems in the shale gas domain, this work builds on knowledge graph technology and proposes a shale gas entity recognition method based on multiple neural networks and an attention mechanism, introducing an attention mechanism and pseudo training samples on top of existing entity recognition approaches. First, each character is mapped to a dense vector sequence carrying contextual semantics, and a convolutional neural network (CNN) filters the local context. Then, a bidirectional long short-term memory (BiLSTM) network captures the contextual hidden states. Finally, an attention mechanism resolves inconsistent annotations and, combined with the further constraints of a conditional random field (CRF), yields high-precision entity classification. Experiments on the SGAS dataset show that the proposed method reaches a precision of 99.32%, a recall of 99.57%, and an F-measure of 99.44%, producing the first high-precision entity recognition model for shale gas and verifying the method's effectiveness.
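The final stage of the pipeline described above, where the CRF further constrains the tag sequence produced by the CNN-BiLSTM-attention encoder, can be sketched as Viterbi decoding over emission and transition scores. The tag set, scores, and transition values below are illustrative assumptions, not the paper's actual parameters:

```python
# Sketch of CRF decoding (Viterbi) over per-token emission scores,
# e.g. as produced by a CNN-BiLSTM-attention encoder. All values here
# are illustrative, not from the paper.

def viterbi_decode(emissions, transitions, tags):
    """emissions: list of {tag: score} dicts, one per token;
    transitions: {(prev_tag, tag): score}; returns best tag path."""
    # Initialize with the first token's emission scores.
    score = {t: emissions[0][t] for t in tags}
    backptr = []
    for emit in emissions[1:]:
        new_score, ptr = {}, {}
        for t in tags:
            # Best previous tag leading into current tag t.
            prev = max(tags, key=lambda p: score[p] + transitions[(p, t)])
            new_score[t] = score[prev] + transitions[(prev, t)] + emit[t]
            ptr[t] = prev
        score, backptr = new_score, backptr + [ptr]
    # Backtrack from the highest-scoring final tag.
    best = max(tags, key=lambda t: score[t])
    path = [best]
    for ptr in reversed(backptr):
        path.append(ptr[path[-1]])
    return list(reversed(path))

# Toy BIO tag set: transition scores default to 0, but the CRF-style
# constraint forbids the invalid O -> I transition with a large penalty.
tags = ["B", "I", "O"]
transitions = {(p, t): 0.0 for p in tags for t in tags}
transitions[("O", "I")] = -1e4
emissions = [
    {"B": 2.0, "I": 0.0, "O": 0.5},
    {"B": 0.0, "I": 2.0, "O": 0.5},
    {"B": 0.0, "I": 0.5, "O": 2.0},
]
print(viterbi_decode(emissions, transitions, tags))  # -> ['B', 'I', 'O']
```

The transition table is where the sequence-level constraint lives: even if a token's emission scores favor an `I` tag, a forbidden transition such as `O -> I` makes that path's total score collapse, which is how the CRF layer enforces label consistency.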

Keywords: text information processing; shale gas; attention mechanism; pseudo training samples; entity recognition

Classification: TP391.1 [Automation and Computer Technology—Computer Application Technology]; TE37 [Automation and Computer Technology—Computer Science and Technology]

 
