Attention Mechanism Based Question Entity Linking (Cited by: 3)

Authors: REN Chaogan, YANG Yan[1,2], JIA Zhen[1,2], TANG Huijia[1], YU Xiuying[1]

Affiliations: [1] School of Information Science and Technology, Southwest Jiaotong University, Chengdu 611756, China; [2] Key Laboratory of Cloud Computing and Intelligent Technology of Sichuan Province, Southwest Jiaotong University, Chengdu 611756, China

Source: Pattern Recognition and Artificial Intelligence (《模式识别与人工智能》), 2018, Issue 12, pp. 1127-1133 (7 pages)

Funding: National Natural Science Foundation of China (No. 61572407); National Science and Technology Support Program of China (No. 2015BAH19F02)

Abstract: Question entity linking requires extensive data processing and feature selection, and it is prone to cumulative errors that degrade linking performance. To address these issues, an attention mechanism based encoder-decoder model for entity linking (AMEDEL) is proposed. The model encodes the question with a bidirectional long short-term memory network, then decodes it through an attention mechanism to generate the corresponding entity mention and disambiguation information as output, which is finally linked to entities in the knowledge base. Experiments on a dataset of questions and entities about car-series products in the automotive field show that the proposed model achieves good results using only limited contextual information.
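The abstract describes an encoder-decoder architecture in which an attention mechanism scores the encoded question tokens at each decoding step. The paper's record here gives no implementation details, so the following is only a minimal NumPy sketch of one such attention step, assuming Bahdanau-style additive attention over BiLSTM encoder states; all array names and dimensions are hypothetical stand-ins, not the authors' code.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def additive_attention(decoder_state, encoder_states, W_d, W_e, v):
    # Score each encoder state against the current decoder state
    # (additive / Bahdanau-style scoring), then build a context
    # vector as the attention-weighted sum of encoder states.
    scores = np.array([
        v @ np.tanh(W_d @ decoder_state + W_e @ h) for h in encoder_states
    ])
    weights = softmax(scores)  # a probability distribution over question tokens
    context = (weights[:, None] * encoder_states).sum(axis=0)
    return context, weights

# Toy dimensions: 4 question tokens, hidden size 8, attention size 6.
rng = np.random.default_rng(0)
encoder_states = rng.normal(size=(4, 8))  # stand-in for BiLSTM outputs
decoder_state = rng.normal(size=8)        # stand-in for the decoder hidden state
W_d = rng.normal(size=(6, 8))
W_e = rng.normal(size=(6, 8))
v = rng.normal(size=6)

context, weights = additive_attention(decoder_state, encoder_states, W_d, W_e, v)
print(context.shape, weights.shape)
```

In the full model, the context vector would be fed to the decoder to generate the entity mention and disambiguation tokens; this sketch only illustrates the attention computation itself.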

Keywords: question entity linking; attention mechanism; encoder-decoder; long short-term memory network; generative model

CLC Number: TP391 [Automation and Computer Technology - Computer Application Technology]
