Method of answer extraction for enhancing question and text interaction


Author: DENG Han (Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China)

Affiliation: [1] Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, Yunnan, China

Source: Modern Electronics Technique, 2024, No. 6, pp. 179-186 (8 pages)

Abstract: Answer extraction plays an important role in improving the quality and performance of question answering, but existing answer extraction methods handle the interaction between question and text information poorly. Context-aware answer extraction models can extract the answer to a given question from the text, but they do not consider the information interaction between the text and the question. When only question and text data are available, more accurate answers can be obtained from the text by using the semantic information between the question and the text to predict the association between the question and text entities. On this basis, a question alignment layer and a multi-head attention mechanism are used to construct a model of the information interaction between the text and the question. The experimental results show that, in comparison with the BIDAF-INDEPENDENT model, the improved model increases the EM and F1 values by 1.281% and 1.296%, respectively.

Keywords: answer extraction; question answering system; information interaction; semantic information; deep learning; multi-head attention mechanism

CLC number: TN919.65-34 [Electronics and Telecommunications — Communication and Information Systems]; TP391 [Electronics and Telecommunications — Information and Communication Engineering]
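The paper's model itself is not reproduced here, but the core idea the abstract describes — letting each passage token attend over the question through multi-head attention to produce a question-aware passage representation — can be sketched as follows. This is a minimal illustration with randomly initialized projection weights standing in for learned parameters; the function and variable names (`multi_head_cross_attention`, `passage`, `question`) are assumptions, not the paper's own code.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_cross_attention(passage, question, num_heads, rng):
    """Each passage token attends over all question tokens,
    yielding a question-aware passage representation.

    passage:  (p_len, d_model) passage token encodings
    question: (q_len, d_model) question token encodings
    Returns (fused, attn): fused is (p_len, d_model),
    attn is (num_heads, p_len, q_len).
    """
    d_model = passage.shape[-1]
    assert d_model % num_heads == 0
    d_head = d_model // num_heads

    # Random projections stand in for learned weight matrices.
    Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                  for _ in range(3))
    Q = passage @ Wq    # queries come from the passage
    K = question @ Wk   # keys and values come from the question
    V = question @ Wv

    def split_heads(x):
        # (seq_len, d_model) -> (num_heads, seq_len, d_head)
        return x.reshape(x.shape[0], num_heads, d_head).transpose(1, 0, 2)

    Qh, Kh, Vh = split_heads(Q), split_heads(K), split_heads(V)
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)  # (heads, p_len, q_len)
    attn = softmax(scores, axis=-1)   # each passage token's weights over the question
    out = attn @ Vh                   # (heads, p_len, d_head)

    # Merge heads back: (heads, p_len, d_head) -> (p_len, d_model)
    fused = out.transpose(1, 0, 2).reshape(passage.shape[0], d_model)
    return fused, attn

# Toy usage: a 6-token passage and a 4-token question with d_model = 8.
rng = np.random.default_rng(0)
passage = rng.standard_normal((6, 8))
question = rng.standard_normal((4, 8))
fused, attn = multi_head_cross_attention(passage, question, num_heads=2, rng=rng)
print(fused.shape)  # (6, 8): question-aware passage encoding
```

In a full model such as the one the abstract outlines, this fused representation would feed subsequent layers (e.g. an answer-span predictor); here it only demonstrates the interaction mechanism.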
