Authors: YANG Hao-wen (杨浩文), SUN Mei-feng (孙美凤)
Affiliations: [1] College of Information Engineering (Artificial Intelligence), Yangzhou University, Yangzhou 225000, China; [2] Guangling College, Yangzhou University, Yangzhou 225000, China
Source: Printing and Digital Media Technology Study, 2024, No. 5, pp. 164-173 (10 pages)
Fund: Ministry of Education Industry-University Cooperative Education Program (No. 220600919293710).
Abstract: To improve the accuracy of short text matching, a short text matching method with knowledge and structure enhancement for BERT (KS-BERT) was proposed in this study. This method first introduced external knowledge to the input text, and then sent the expanded text to both the context encoder BERT and the structure encoder GAT to capture the contextual relationship features and structural features of the input text. Finally, the match was determined based on the fusion result of the two features. Experiment results on the public datasets BQ_corpus and LCQMC showed that KS-BERT outperforms advanced models such as ERNIE 2.0. This study showed that knowledge enhancement and structure enhancement are two effective ways to improve BERT in short text matching: on BQ_corpus, they improved ACC by 0.2% and 0.3%, respectively, while on LCQMC, ACC improved by 0.4% and 0.9%, respectively.
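The abstract describes a two-branch architecture: a context feature from BERT and a structure feature from GAT are fused, and the match is decided from the fused result. The fusion operator is not specified in the abstract; the sketch below assumes simple concatenation followed by a linear head with a sigmoid, purely to illustrate the final scoring step. All names (`fuse_and_score`, the vectors, `W`, `b`) are hypothetical placeholders, not the authors' implementation.

```python
import numpy as np

def fuse_and_score(h_ctx: np.ndarray, h_struct: np.ndarray,
                   W: np.ndarray, b: float) -> float:
    """Fuse the two feature vectors and score the match.

    h_ctx    -- contextual feature (stand-in for the BERT encoder output)
    h_struct -- structural feature (stand-in for the GAT encoder output)
    W, b     -- parameters of a linear classification head (assumed here)
    """
    h = np.concatenate([h_ctx, h_struct])   # fusion by concatenation (assumption)
    logit = float(W @ h + b)
    return 1.0 / (1.0 + np.exp(-logit))     # sigmoid -> match probability

# Toy example: 4-dim context feature + 4-dim structure feature.
h_ctx = np.array([0.2, -0.1, 0.5, 0.3])
h_struct = np.array([0.1, 0.4, -0.2, 0.0])
W = np.zeros(8)                             # untrained head
print(round(fuse_and_score(h_ctx, h_struct, W, 0.0), 2))  # 0.5 before training
```

In a trained model the head parameters would be learned jointly with both encoders; concatenation is only one plausible fusion choice (others include gating or attention-weighted sums).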
Keywords: Deep learning; Short text matching; Graph attention network; Knowledge enhancement
Classification: TP391 (Automation and Computer Technology: Computer Application Technology)