Abstract
Unsupervised clustering methods applied to Topic Detection and Tracking (TDT) have difficulty learning deep semantic features and task-related features, and methods such as K-means clustering and Latent Dirichlet Allocation (LDA) cannot be used for incremental clustering. To address these problems, a semi-supervised algorithm named BERT-Single, based on a pre-trained language model, was proposed. First, the pre-trained language model BERT was fine-tuned on a small amount of labeled data, so that it acquired task-specific prior knowledge and produced text vectors that suit TDT tasks and carry deep semantic features. Then, an improved Single-Pass clustering algorithm was used to generalize the labeled-sample information learned by the pre-trained language model to unlabeled data, improving performance on TDT tasks. Experimental results on a constructed dataset show that, compared with the baseline models, BERT-Single improves precision by at least 3 percentage points, recall by at least 1 percentage point, and F1 score by at least 3 percentage points. BERT-Single is effective for topic detection and tracking and adapts well to incremental clustering.
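To make the pipeline described in the abstract concrete, the sketch below shows incremental Single-Pass clustering over BERT [CLS] vectors. This is a minimal illustration, not the authors' implementation: the checkpoint name, the embed/single_pass helpers, and the SIM_THRESHOLD value are assumptions, and the paper's specific improvements to Single-Pass and its fine-tuning setup are not reproduced here.

```python
# Sketch only (not the paper's code): Single-Pass incremental clustering over
# BERT [CLS] embeddings, assuming a BERT checkpoint already fine-tuned on a
# small labeled TDT set. Names and the threshold are illustrative assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-chinese"  # assumption: replace with the fine-tuned checkpoint
SIM_THRESHOLD = 0.8               # assumption: in practice tuned on validation data

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME).eval()

@torch.no_grad()
def embed(text: str) -> torch.Tensor:
    """Encode one news item into its [CLS] vector."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=256)
    return model(**inputs).last_hidden_state[:, 0].squeeze(0)

def single_pass(stream):
    """Assign each incoming document to the most similar existing topic
    (centroid cosine similarity above the threshold) or open a new topic."""
    centroids, counts, labels = [], [], []
    for doc in stream:
        vec = embed(doc)
        if centroids:
            sims = torch.stack(
                [torch.cosine_similarity(vec, c, dim=0) for c in centroids]
            )
            best = int(torch.argmax(sims))
            if sims[best] >= SIM_THRESHOLD:
                # update the running centroid of the matched topic
                counts[best] += 1
                centroids[best] += (vec - centroids[best]) / counts[best]
                labels.append(best)
                continue
        centroids.append(vec.clone())
        counts.append(1)
        labels.append(len(centroids) - 1)
    return labels

if __name__ == "__main__":
    print(single_pass(["地震救援最新进展", "震区救援持续推进", "央行宣布降息"]))
```

Because documents are processed one at a time and centroids are updated in place, new articles can be clustered as they arrive, which is the incremental property that batch methods such as K-means and LDA lack.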
Authors
HOU Boyuan, CUI Zhe, XIE Xinran (Chengdu Institute of Computer Application, Chinese Academy of Sciences, Chengdu, Sichuan 610041, China; School of Computer Science and Technology, University of Chinese Academy of Sciences, Beijing 100049, China)
Source
Journal of Computer Applications (《计算机应用》)
CSCD
Peking University Core Journals (北大核心)
2022, Issue S01, pp. 21-27 (7 pages)
Funding
Sichuan Science and Technology Program (2020YFG0009)
Sichuan Major Science and Technology Special Project (2019ZDZX0005).
Keywords
clustering
semi-supervised learning
Topic Detection and Tracking (TDT)
pre-trained language model
news topic