Authors: XU Xiaoliang (徐小良) [1]; DAI Qianjie (戴乾杰); FANG Qiming (方启明) [1]
Affiliation: [1] School of Computer Science and Technology, Hangzhou Dianzi University, Hangzhou 310018, Zhejiang, China
Source: Journal of Huazhong University of Science and Technology (Natural Science Edition), 2022, No. 5, pp. 33-38 (6 pages)
Funding: Zhejiang Provincial Natural Science Foundation (LY19F030021); Zhejiang Provincial Key R&D Program (2019C01056, 2021C03156, 2021C02004); National Key R&D Program of China (2017YFC0820503)
Abstract: Existing hypernymy recognition methods fail to fully exploit the hypernymy semantics carried by the sentences in which a word pair co-occurs. To address this, a hypernymy recognition method based on a dependency-semantic attention mechanism was proposed. The path vectors of the shortest dependency paths of co-occurrence sentences were used to train a Softmax classifier to recognize hypernymy. A dependency-semantic attention mechanism was introduced, and attention weight vectors over the shortest dependency paths together with a path scoring function were constructed, to mine and represent at a finer granularity the different contributions of different words and different paths to the hypernymy semantics, so that these fine-grained semantic features are exploited more fully for more accurate hypernymy recognition. Results show that, compared with the representative methods HypeNet and NPM, the proposed method achieves 2.0% and 1.3% higher recognition accuracy on typical Chinese and English datasets respectively, with more stable performance.
Keywords: hypernymy recognition; attention mechanism; semantic features; word vector; path vector
Classification: TP391 [Automation and Computer Technology - Computer Application Technology]
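For readers who want a concrete picture of the pipeline the abstract describes (attention weights over the words of a shortest dependency path, a pooled path representation, and a Softmax classifier), the following is a minimal NumPy sketch. All names, shapes, and the mean-pooling stand-in for the paper's path scoring function are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of attention-weighted shortest-dependency-path
# classification. Every function name and parameter shape below is a
# hypothetical illustration of the general technique in the abstract.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def path_vector(word_embs, attn_w):
    """Combine the word embeddings along one shortest dependency path.

    word_embs: (path_len, d) embeddings of the words on the path.
    attn_w:    (d,) learned attention parameter (assumed form).
    Returns a single (d,) attention-weighted path vector, so words that
    matter more for hypernymy contribute more to the representation.
    """
    scores = word_embs @ attn_w      # one scalar relevance score per word
    alpha = softmax(scores)          # attention weights over the path
    return alpha @ word_embs         # weighted sum -> path vector

def classify(paths, attn_w, W, b):
    """Score each co-occurrence path, pool, and apply a Softmax classifier.

    paths: list of (path_len_i, d) arrays, one per co-occurrence sentence
           of the word pair.
    W, b:  classifier parameters mapping d -> 2 classes
           (hypernymy / not hypernymy).
    """
    vecs = np.stack([path_vector(p, attn_w) for p in paths])
    pooled = vecs.mean(axis=0)       # mean pooling as a simple stand-in
                                     # for the paper's path scoring function
    return softmax(pooled @ W + b)   # class probabilities

# Toy usage: two random "dependency paths" for one word pair.
d = 8
attn_w = rng.normal(size=d)
W, b = rng.normal(size=(d, 2)), np.zeros(2)
paths = [rng.normal(size=(n, d)) for n in (3, 5)]
print(classify(paths, attn_w, W, b))  # e.g. [p_not_hyper, p_hyper]
```

Mean pooling is used here only as a placeholder: the path scoring function described in the abstract would instead weight whole paths by their estimated relevance, which the same attention machinery can in principle provide at the path level.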