Authors: DING Yi; WANG Zhongqing
Affiliation: [1] School of Computer Science and Technology, Soochow University, Suzhou, Jiangsu 215006, China
Source: Computer Science, 2024, No. S01, pp. 174-181 (8 pages)
Abstract: News summarization aims to quickly and accurately distill a concise summary from large, complex news texts. This paper studies multi-document summarization based on pre-trained language models, focusing on how training schemes that incorporate pre-training tasks improve model performance and strengthen the exchange of information across documents, so as to generate more comprehensive and concise summaries. For the combined pre-training tasks, the paper conducts comparative experiments on the baseline model and on the content, number, and order of pre-training tasks; it identifies pre-training tasks that prove effective, summarizes concrete methods for strengthening cross-document information exchange, and distills a concise, efficient pre-training pipeline. Training and testing on a public multi-document news dataset show that the content, number, and order of pre-training tasks each yield some improvement in ROUGE scores, and that the specific pre-training combination derived by integrating all three findings improves ROUGE scores significantly.
Classification: TP391 [Automation and Computer Technology / Computer Application Technology]
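The abstract reports gains in ROUGE scores. As context for readers unfamiliar with the metric, the sketch below computes a minimal ROUGE-1 F1 (unigram overlap between a reference summary and a candidate summary) in pure Python. This is an illustrative simplification, not the paper's evaluation code; published results typically use the standard ROUGE toolkit with stemming and multiple variants (ROUGE-1/2/L).

```python
from collections import Counter

def rouge1_f(reference: str, candidate: str) -> float:
    """Minimal ROUGE-1 F1: clipped unigram overlap between a
    reference summary and a candidate summary (no stemming)."""
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Overlap = sum of counts clipped to the reference's counts.
    overlap = sum(min(ref_counts[w], c) for w, c in cand_counts.items())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

# Hypothetical example pair, for illustration only.
ref = "the model generates a concise news summary"
cand = "the model produces a concise summary"
print(round(rouge1_f(ref, cand), 3))  # → 0.769
```

Higher scores indicate greater n-gram overlap with the human-written reference, which is why the paper reports its pre-training gains in terms of ROUGE.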