Authors: Heng Hongjun[1]; Yao Ruonan (Civil Aviation University of China, Tianjin 300300, China)
Institution: [1] Civil Aviation University of China, Tianjin 300300
Source: Computer Applications and Software (《计算机应用与软件》), 2023, No. 8, pp. 214-220, 290 (8 pages)
Abstract: Existing work on cross-sentence n-ary relation extraction represents the input text as a complex document graph that integrates intra-sentential and inter-sentential dependencies, but the noise contained in such graphs degrades relation extraction performance. To address this, this paper uses a Graph State LSTM to obtain contextual information, and then applies either a word-level attention mechanism or a position-aware attention mechanism to automatically focus on the keywords that are decisive for relation extraction, thereby reducing the influence of noise. The two attention mechanisms are also compared with respect to their effect on relation extraction with the Graph State LSTM. Experiments on an important precision medicine dataset validate the effectiveness of the proposed model.
Keywords: cross-sentence n-ary relation extraction; attention mechanism; Graph State LSTM
Classification: TP391 [Automation and Computer Technology - Computer Application Technology]
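As a rough illustration of the attention idea described in the abstract, the following minimal sketch shows word-level attention pooling over contextual token representations (such as the outputs of a Graph State LSTM encoder). It is not the authors' implementation; all class, variable, and dimension names are illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of word-level attention pooling
# over contextual token representations, e.g. Graph State LSTM outputs.
import torch
import torch.nn as nn


class WordLevelAttention(nn.Module):
    """Scores each token and pools hidden states into one relation vector."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.scorer = nn.Linear(hidden_dim, 1)

    def forward(self, hidden_states: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq_len, hidden_dim); mask: (batch, seq_len)
        scores = self.scorer(hidden_states).squeeze(-1)           # (batch, seq_len)
        scores = scores.masked_fill(mask == 0, float("-inf"))     # ignore padding tokens
        weights = torch.softmax(scores, dim=-1)                   # attention over tokens
        return torch.einsum("bs,bsd->bd", weights, hidden_states)  # weighted sum


# Example: pool encoder outputs before feeding a relation classifier.
encoder_out = torch.randn(2, 10, 128)   # stand-in for Graph State LSTM outputs
mask = torch.ones(2, 10)
pooled = WordLevelAttention(128)(encoder_out, mask)   # shape: (2, 128)
```

The position-aware variant mentioned in the abstract would additionally condition the token scores on each token's relative position to the entity mentions; the pooling step itself is the same.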