Authors: YANG Zuyuan (杨祖元); FANG Sifan (方思凡); CHEN Xichen (陈禧琛); LI Zhenni (李珍妮) (School of Automation, Guangdong University of Technology, Guangzhou 510006, China)
Source: Journal of Medical Informatics (《医学信息学杂志》), 2022, No. 11, pp. 55-62 (8 pages)
Abstract: Based on the pre-trained model BERT and the UniLM MASK mechanism, the paper proposes a generative BERT that can be applied to question generation for traditional Chinese medicine. Combined with a multi-strategy mechanism based on label smoothing, adversarial perturbation, and knowledge distillation, as well as an ensemble strategy based on multi-model soft voting, the performance and generalization ability of the generative BERT are further improved. This helps the traditional Chinese medicine question generation task achieve better results and makes fuller use of traditional Chinese medicine text data.
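The abstract does not give implementation details, but the core idea of a "generative BERT" in the UniLM style is to replace BERT's fully bidirectional attention with a sequence-to-sequence attention mask: source tokens attend to each other freely, while target tokens attend only to the source and to earlier target positions. The sketch below is an illustrative assumption of that mask construction, not the authors' code; the function name, token layout, and tensor shapes are hypothetical.

```python
# Minimal sketch (assumed, not from the paper) of a UniLM-style seq2seq
# attention mask that lets a bidirectional BERT encoder generate text
# left to right. Assumed token layout: [CLS] source tokens [SEP] target tokens [SEP].
import torch

def unilm_seq2seq_mask(source_len: int, target_len: int) -> torch.Tensor:
    """Return an (L, L) mask where 1 = may attend, 0 = blocked, L = source_len + target_len."""
    total = source_len + target_len
    mask = torch.zeros(total, total)
    # Every position may attend to the whole source segment (bidirectional over the source).
    mask[:, :source_len] = 1.0
    # Target positions may additionally attend to themselves and earlier target positions (causal).
    mask[source_len:, source_len:] = torch.tril(torch.ones(target_len, target_len))
    return mask

# Example: a 4-token source (e.g. a TCM passage) and a 3-token target (the generated question).
print(unilm_seq2seq_mask(4, 3))
```

Under this masking, the same BERT weights serve both as encoder over the source text and as an autoregressive decoder over the question being generated, which is what allows the label-smoothing, adversarial-perturbation, knowledge-distillation, and soft-voting strategies mentioned in the abstract to be applied on top of a single pre-trained model.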