Abstract
A novel text segmentation algorithm is proposed in this paper. The algorithm first partitions the document to be segmented into a set of blocks, then constructs whole-text lexical chains to analyze the multiple subtopics described in the text, and groups similar blocks that describe the same subtopic by constructing a graph of how blocks cover subtopics. Because a paragraph boundary may fall inside a block, the algorithm further performs a second segmentation within blocks. Experimental results show that, after topic analysis of the document, the algorithm can filter out interference from topic-irrelevant features in the segmentation results; the block-subtopic coverage graph combines the similarities of both adjacent and non-adjacent blocks, improving segmentation accuracy; and the second segmentation of blocks makes the results more reasonable.
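To make the pipeline described in the abstract more concrete, below is a minimal sketch of the main steps (block partitioning, lexical-chain subtopic detection, a block-subtopic coverage map, and boundary selection). All function names, the fixed block size, and the token-identity chain heuristic are illustrative assumptions; the paper's actual method builds lexical chains with HowNet and adds a second segmentation pass inside blocks, which is omitted here.

```python
from collections import defaultdict
from typing import Dict, List, Set


def partition_into_blocks(sentences: List[str], block_size: int = 2) -> List[List[str]]:
    """Step 1: split the document into fixed-size blocks of sentences (assumed granularity)."""
    return [sentences[i:i + block_size] for i in range(0, len(sentences), block_size)]


def build_lexical_chains(sentences: List[str]) -> Dict[str, Set[int]]:
    """Step 2 (stand-in): map each recurring term to the sentences it occurs in.
    The paper links semantically related words through HowNet; plain token
    identity is used here only as a placeholder."""
    chains: Dict[str, Set[int]] = defaultdict(set)
    for idx, sent in enumerate(sentences):
        for word in sent.lower().split():
            word = word.strip('.,;:')
            if len(word) >= 3:  # crude stop-word filter, for the sketch only
                chains[word].add(idx)
    # keep only chains spanning more than one sentence (candidate subtopics)
    return {w: occ for w, occ in chains.items() if len(occ) > 1}


def coverage_map(chains: Dict[str, Set[int]], block_size: int) -> Dict[str, Set[int]]:
    """Step 3: record which blocks each subtopic (chain) covers, so blocks sharing
    a subtopic can be grouped even when they are not adjacent."""
    cover: Dict[str, Set[int]] = defaultdict(set)
    for term, sent_ids in chains.items():
        for sid in sent_ids:
            cover[term].add(sid // block_size)
    return cover


def segment(sentences: List[str], block_size: int = 2) -> List[int]:
    """Return candidate boundary positions: starts of blocks that share no subtopic
    with the preceding block. The paper additionally re-segments inside blocks
    (the 'second segmentation'), which is omitted in this sketch."""
    blocks = partition_into_blocks(sentences, block_size)
    cover = coverage_map(build_lexical_chains(sentences), block_size)
    boundaries = []
    for b in range(1, len(blocks)):
        shared = any(b in bs and (b - 1) in bs for bs in cover.values())
        if not shared:
            boundaries.append(b * block_size)
    return boundaries


if __name__ == "__main__":
    doc = [
        "Lexical chains link related words across a text.",
        "Chains built from HowNet capture word relations.",
        "Football season opens with a derby match.",
        "The derby match drew a record football crowd.",
    ]
    print(segment(doc))  # expected: a boundary before the football sentences
```

This sketch only illustrates how subtopic coverage can combine evidence from adjacent and non-adjacent blocks; the semantic linking via HowNet and the intra-block refinement are the parts the paper actually contributes.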
Source
Acta Electronica Sinica (电子学报), 2009, No. 2, pp. 278-284 (7 pages)
Indexed in: EI, CAS, CSCD, Peking University Core (北大核心)
Funding
National Natural Science Foundation of China, Key Program (No. 60435020); National 863 High-Tech R&D Program (No. 2006AA01Z197, No. 2007AA01Z172)
Keywords
topic analysis
lexical chain
HowNet
second segmentation