Authors: Sheng Zhang [1], Qi Song, Ren Han [1] (School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai)
Source: Modeling and Simulation (《建模与仿真》), 2024, Issue 3, pp. 2579-2590 (12 pages)
Abstract: Entity extraction and relation classification are cornerstones for other tasks in natural language processing, and their effectiveness directly or indirectly affects the outcomes of those tasks. In recent years, thanks to the remarkable success of pre-trained language models in natural language processing applications, joint extraction of entities and relations has developed rapidly. However, current span-based joint extraction models pre-trained with BERT, while addressing issues such as entity overlap, still suffer from degraded performance on long texts and poor generalization. This paper proposes a joint extraction model fine-tuned from the pre-trained language model Span-BERT. It performs joint extraction of entities and relations at the span level, introduces a negative sampling strategy during extraction training, and performs effective extraction within Span-BERT to enhance model performance and robustness. Experimental results and ablation studies demonstrate the effectiveness of this approach; evaluation on the SciERC dataset with different levels of added noise shows that the model is robust, and it achieves promising results on the ADE, CoNLL2004, and SciERC benchmark datasets.
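Note: the record contains no implementation details beyond the abstract. Purely as an illustration of the general approach it describes (span-level joint entity and relation extraction on top of a SpanBERT-style encoder, with negative span sampling during training), below is a minimal Python sketch. The checkpoint names, classifier heads, label counts, and the helper sample_negative_spans are assumptions made for illustration, not the authors' architecture.

# Minimal sketch only (not the paper's code): span-based joint extraction
# with negative span sampling. Checkpoint names and head sizes are assumed.
import random
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class SpanJointModel(nn.Module):
    def __init__(self, encoder_name="SpanBERT/spanbert-base-cased",
                 num_entity_types=4, num_relation_types=7, max_span_len=8):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Span representation: start token state + end token state + width embedding
        self.width_emb = nn.Embedding(max_span_len + 1, 32)
        self.entity_clf = nn.Linear(2 * hidden + 32, num_entity_types + 1)            # +1: "no entity"
        self.relation_clf = nn.Linear(2 * (2 * hidden + 32), num_relation_types + 1)  # +1: "no relation"

    def span_repr(self, hidden_states, spans):
        # spans: list of (start, end) token indices, end inclusive, batch size 1
        starts = hidden_states[0, [s for s, _ in spans]]
        ends = hidden_states[0, [e for _, e in spans]]
        widths = self.width_emb(torch.tensor([e - s for s, e in spans]))
        return torch.cat([starts, ends, widths], dim=-1)

    def forward(self, input_ids, attention_mask, spans, span_pairs):
        h = self.encoder(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        span_vecs = self.span_repr(h, spans)
        entity_logits = self.entity_clf(span_vecs)
        pair_vecs = torch.cat([span_vecs[[a for a, _ in span_pairs]],
                               span_vecs[[b for _, b in span_pairs]]], dim=-1)
        relation_logits = self.relation_clf(pair_vecs)
        return entity_logits, relation_logits

def sample_negative_spans(seq_len, gold_spans, max_span_len=8, num_negatives=20):
    # Negative sampling: random spans that are not gold entities get the
    # "no entity" label so the span classifier learns to reject them.
    gold, negatives = set(gold_spans), []
    while len(negatives) < num_negatives:
        start = random.randrange(seq_len)
        end = min(seq_len - 1, start + random.randrange(max_span_len))
        if (start, end) not in gold:
            negatives.append((start, end))
    return negatives

if __name__ == "__main__":
    # SpanBERT reuses BERT's cased vocabulary, so a BERT tokenizer is used here.
    tok = AutoTokenizer.from_pretrained("bert-base-cased")
    model = SpanJointModel()
    enc = tok("Alice works for Google in Boston .", return_tensors="pt")
    gold_spans = [(1, 1), (4, 4), (6, 6)]   # candidate entity spans (token indices)
    pairs = [(0, 1), (0, 2)]                # candidate span-pair indices
    ent_logits, rel_logits = model(enc["input_ids"], enc["attention_mask"], gold_spans, pairs)
    negs = sample_negative_spans(enc["input_ids"].shape[1], gold_spans)
    print(ent_logits.shape, rel_logits.shape, len(negs))

In models of this family, training typically labels gold spans with their entity types, sampled spans with "no entity", and candidate span pairs with relation types or "no relation", optimizing cross-entropy over both heads.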
CLC Number: TP3 [Automation and Computer Technology - Computer Science and Technology]