Authors: 张润岩 (ZHANG Runyan); 孟凡荣 (MENG Fanrong)[1]; 周勇 (ZHOU Yong)[1]; 刘兵 (LIU Bing)[1,2]
Affiliations: [1] School of Computer Science and Technology, China University of Mining and Technology, Xuzhou, Jiangsu 221116, China; [2] Institute of Electronics, Chinese Academy of Sciences, Beijing 100080, China
Source: Journal of Computer Applications (《计算机应用》), 2018, No. 7, pp. 1831-1838 (8 pages)
Fund: General Program of the National Natural Science Foundation of China (61572505)
Abstract: To address poor memory over long sentences and the weak influence of core words in semantic relation extraction (semantic relation classification), an Attention-based bidirectional Neural Turing Machine (Ab-NTM) model was proposed. First, a Neural Turing Machine (NTM) was used in place of a plain Recurrent Neural Network (RNN), with a Long Short-Term Memory (LSTM) network acting as its controller; the NTM's larger, mutually non-interfering storage strengthens the model's memory on long sentences. Second, an attention layer was built to organize context information at the word level, allowing the model to emphasize the core words in a sentence. Finally, the semantic relation labels were obtained through a classifier. Experiments on the SemEval-2010 Task 8 public dataset show that the proposed model achieves an F1-score of 86.2%, outperforming the comparison methods.
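Note: the abstract describes a three-stage pipeline (a bidirectional NTM encoder with an LSTM controller, a word-level attention layer, then a classifier). The sketch below illustrates only the attention-and-classification stages in PyTorch; a plain bidirectional LSTM stands in for the paper's NTM encoder, and every name and hyperparameter here (WordAttentionClassifier, the dimensions, the 19 SemEval-2010 Task 8 classes) is an illustrative assumption, not the authors' code.

import torch
import torch.nn as nn

class WordAttentionClassifier(nn.Module):
    # Hedged sketch: the external NTM memory is omitted; nn.LSTM is a stand-in
    # for the paper's bidirectional NTM-with-LSTM-controller encoder.
    def __init__(self, vocab_size=10000, emb_dim=100, hidden_dim=128, num_classes=19):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.att = nn.Linear(2 * hidden_dim, 1)   # one attention score per word
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, tokens):                      # tokens: (batch, seq_len) word ids
        h, _ = self.encoder(self.embed(tokens))     # (batch, seq_len, 2*hidden_dim)
        scores = self.att(h).squeeze(-1)            # (batch, seq_len)
        alpha = torch.softmax(scores, dim=1)        # word-level attention weights
        context = (alpha.unsqueeze(-1) * h).sum(1)  # weighted sum -> sentence vector
        return self.fc(context)                     # logits over relation classes

# Usage: a batch of 2 sentences, 12 token ids each.
model = WordAttentionClassifier()
logits = model(torch.randint(0, 10000, (2, 12)))
print(logits.shape)  # torch.Size([2, 19]); SemEval-2010 Task 8 has 19 classes

The word-level softmax concentrates the sentence representation on the tokens that decide the relation, which is the mechanism the abstract credits for strengthening core words.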
Keywords: natural language processing; semantic relation extraction; recurrent neural network; bidirectional Neural Turing Machine; attention mechanism
CLC number: TP183 [Automation and Computer Technology: Control Theory and Control Engineering]