Authors: HOU Boyuan (侯博元); CUI Zhe (崔喆)[1]; XIE Xinran (谢欣冉)
Affiliations: [1] Chengdu Institute of Computer Application, Chinese Academy of Sciences, Chengdu 610041, China; [2] School of Computer Science and Technology, University of Chinese Academy of Sciences, Beijing 100049, China
Source: Journal of Computer Applications, 2022, Issue S01, pp. 21-27 (7 pages)
Funding: Sichuan Science and Technology Program (2020YFG0009); Sichuan Major Science and Technology Special Project (2019ZDZX0005).
Abstract: Unsupervised clustering methods struggle to learn deep semantic features and task-related features when applied to topic detection and tracking, and methods such as K-means clustering and Latent Dirichlet Allocation (LDA) cannot be used for incremental clustering. To address these problems, a semi-supervised BERT-Single algorithm based on a pre-trained language model was proposed. First, the pre-trained language model BERT was trained on a small amount of labeled data so that it learned task-specific prior knowledge and generated text vectors that suit topic detection and tracking tasks and contain deep semantic features. Then, an improved Single-Pass clustering algorithm was used to generalize the labeled-sample information learned by the pre-trained language model to unlabeled data, improving the model's performance on topic detection and tracking. Experiments on a constructed dataset show that, compared with the comparison models, the BERT-Single model improves precision by at least 3 percentage points, recall by at least 1 percentage point, and F1 score by at least 3 percentage points. The BERT-Single model is effective for topic detection and tracking and adapts well to incremental clustering tasks.
Keywords: clustering; semi-supervised learning; topic detection and tracking; pre-trained language model; news topic
Classification: TP391.1 (Automation and Computer Technology / Computer Application Technology)
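The abstract describes a two-step pipeline: a BERT encoder trained on a small labeled set produces document vectors, and an improved Single-Pass procedure clusters those vectors incrementally. The sketch below illustrates only the generic Single-Pass step over such vectors, as one plausible reading of the abstract; the class name SinglePassClusterer, the cosine-similarity measure, and the threshold value are illustrative assumptions, and the paper's specific improvements to Single-Pass are not reproduced here.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two dense vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

class SinglePassClusterer:
    """Incremental Single-Pass clustering over document embeddings.

    Each incoming vector is compared with the centroids of the existing
    clusters (topics). It joins the most similar cluster if the similarity
    reaches the threshold; otherwise it opens a new cluster, which is what
    makes the procedure incremental.
    """

    def __init__(self, threshold=0.8):
        self.threshold = threshold   # illustrative value; in practice tuned on the labeled subset
        self.centroids = []          # running mean embedding of each cluster
        self.sizes = []              # number of documents assigned to each cluster

    def add(self, vector):
        """Assign one document vector to a cluster and return the cluster id."""
        vector = np.asarray(vector, dtype=float)
        if self.centroids:
            similarities = [cosine_similarity(vector, c) for c in self.centroids]
            best = int(np.argmax(similarities))
            if similarities[best] >= self.threshold:
                # Update the winning centroid as a running mean.
                n = self.sizes[best]
                self.centroids[best] = (self.centroids[best] * n + vector) / (n + 1)
                self.sizes[best] += 1
                return best
        # No sufficiently similar topic exists: start a new one.
        self.centroids.append(vector)
        self.sizes.append(1)
        return len(self.centroids) - 1

if __name__ == "__main__":
    # Random vectors stand in for sentence embeddings from the trained BERT encoder.
    rng = np.random.default_rng(0)
    clusterer = SinglePassClusterer(threshold=0.8)
    for doc_vector in rng.normal(size=(5, 768)):
        print(clusterer.add(doc_vector))
```

In an actual run, the random vectors would be replaced by embeddings from the BERT model trained on the small labeled set, which is what lets the semi-supervised setup transfer labeled-sample information to the unlabeled stream.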