Enhanced Prompt Learning Method for Few-shot Text Classification (cited by: 3)


Authors: LI Ruifan [1,2,3]; WEI Zhiyu; FAN Yuantao; YE Shuqin; ZHANG Guangwei

Affiliations: [1] School of Artificial Intelligence, Beijing University of Posts and Telecommunications, Beijing 100876; [2] Engineering Research Center of Information Networks, Ministry of Education, Beijing 100876; [3] Key Laboratory of Interactive Technology and Experience System, Ministry of Culture and Tourism, Beijing 100876; [4] School of Computer Science, Beijing University of Posts and Telecommunications, Beijing 100876

Published in: Acta Scientiarum Naturalium Universitatis Pekinensis (Journal of Peking University, Natural Science Edition), 2024, Issue 1, pp. 1-12

Funding: Supported by the National Natural Science Foundation of China (Grant No. 62076032).

Abstract: An enhanced prompt learning method (EPL4FTC) is proposed for the few-shot text classification task. The algorithm first converts text classification into a prompt-learning task based on natural language inference, achieving implicit data augmentation from the prior knowledge of pre-trained language models, and is optimized with losses at two granularities. To capture the category information of specific downstream tasks, a triplet loss is used for joint optimization, and the masked language model (MLM) task is incorporated as a regularizer to improve generalization. Evaluations on four Chinese and three English text classification datasets show that the classification accuracy of EPL4FTC is significantly better than that of the compared baselines.
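The two mechanisms named in the abstract can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the prompt template, function names, and loss weights below are all hypothetical, and the actual EPL4FTC formulation may differ.

```python
# Illustrative sketch (assumed, not from the paper) of: (1) recasting
# classification as an NLI-style prompt scored by a masked language model,
# and (2) a joint objective of prompt-learning loss + triplet loss + MLM
# regularizer. Template wording and weights alpha/beta are hypothetical.
import math

def build_nli_prompt(text: str, label_name: str) -> str:
    # Hypothetical template: a pretrained MLM scores the [MASK] position to
    # decide whether the label hypothesis holds for the input text.
    return f"{text} [SEP] This text is about {label_name}. [MASK]"

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Pull same-class embeddings together; push different-class embeddings
    # at least `margin` apart (Euclidean distance).
    def dist(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    return max(dist(anchor, positive) - dist(anchor, negative) + margin, 0.0)

def total_loss(prompt_loss, trip_loss, mlm_loss, alpha=0.1, beta=0.1):
    # Joint objective: classification (prompt) loss plus triplet loss and
    # the MLM task as a regularization term; alpha, beta are illustrative.
    return prompt_loss + alpha * trip_loss + beta * mlm_loss

# Usage with toy 2-d embeddings: anchor/positive share a class.
prompt = build_nli_prompt("The match ended 2-1.", "sports")
t = triplet_loss([0.0, 0.0], [0.1, 0.0], [0.5, 0.0])   # 0.6
loss = total_loss(prompt_loss=0.7, trip_loss=t, mlm_loss=2.3)
```

In this toy setup the triplet term is zero once the negative is pushed beyond the margin, so only violating triples contribute gradient, which is the standard behavior of margin-based triplet losses.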

Keywords: pre-trained language model; few-shot learning; text classification; prompt learning; triplet loss

CLC codes: TP391.1 (Automation and Computer Technology — Computer Application Technology); TP18 (Automation and Computer Technology — Computer Science and Technology)

 
